WorldWideScience

Sample records for analysis techniques identifies

  1. Identifying configurations of behavior change techniques in effective medication adherence interventions: a qualitative comparative analysis.

    Science.gov (United States)

    Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara

    2016-05-04

    Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76 %) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50 %) and uniquely accounting for 15 (44 %). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.
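
    As a minimal illustration of the crisp-set logic behind QCA (a toy sketch under invented study data, not the authors' analysis), necessity and sufficiency of technique configurations can be computed directly from set membership:

```python
# Toy crisp-set QCA sketch: each study is a set of behavior change techniques
# plus an outcome flag. The four studies below are invented for illustration.
studies = [
    ({"knowledge", "self_efficacy"}, True),
    ({"knowledge", "facilitation"}, True),
    ({"awareness"}, False),
    ({"knowledge"}, False),
]

def necessity(technique, data):
    """Share of effective studies that include the technique (1.0 = necessary)."""
    effective = [s for s, ok in data if ok]
    return sum(technique in s for s in effective) / len(effective)

def sufficiency(config, data):
    """Share of studies containing the whole configuration that are effective."""
    with_config = [ok for s, ok in data if config <= s]
    return sum(with_config) / len(with_config)

print(necessity("knowledge", studies))                       # 1.0 -> necessary here
print(sufficiency({"knowledge", "self_efficacy"}, studies))  # 1.0 -> sufficient here
```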

  2. Nuclear techniques to identify allergenic metals in orthodontic brackets

    International Nuclear Information System (INIS)

    Zenobio, E.G.; Zenobio, M.A.F.; Menezes, M.A.B.C.

    2009-01-01

    The present study determines the elementary alloy composition of ten commercial brands of brackets, especially regarding Ni, Cr, and Co, confirmed allergenic elements. The nuclear techniques applied in the analyses were X-ray fluorescence (XRF) - Centre National de la Recherche Scientifique (National Center of Scientific Research), France - and X-ray energy spectrometry (XRES) and instrumental neutron activation analysis (INAA) - CDTN/CNEN, Brazil. The XRES and XRF techniques identified Cr in the 10 samples analyzed and Ni in eight samples. The INAA technique identified the presence of Cr (14% to 19%) and Co (42% to 2400 ppm) in all samples. The semi-quantitative analysis performed by XRF also identified Co in two samples. The techniques were effective in the identification of metals in orthodontic brackets. The elements identified in this study can be considered one of the main reasons for the allergic processes observed among the patients studied. This finding suggests that patients should be tested for allergy and allergenic sensitivity to metals prior to the prescription of orthodontic devices. (author)

  3. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems.
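
    A very small genetic-algorithm sketch of the selective-searching idea follows; it is a hypothetical toy with invented component probabilities and failure structure, and it omits the paper's multi-scale MSR evaluation. Selection pressure simply drifts the population toward high-probability system failure modes.

```python
import numpy as np

rng = np.random.default_rng(6)
p_fail = np.array([0.30, 0.05, 0.20, 0.01, 0.15])  # component probabilities (toy)

def system_fails(mode):
    # toy structure: system fails if component 0 fails, or 2 and 4 both fail
    return mode[0] == 1 or (mode[2] == 1 and mode[4] == 1)

def fitness(mode):
    # occurrence probability of this exact pattern, if it fails the system
    if not system_fails(mode):
        return 1e-12
    return float(np.prod(np.where(mode == 1, p_fail, 1.0 - p_fail)))

pop = rng.integers(0, 2, size=(20, 5))             # population of candidate modes
for _ in range(40):
    f = np.array([fitness(m) for m in pop])
    pop = pop[rng.choice(len(pop), size=len(pop), p=f / f.sum())]  # selection
    half, cut = len(pop) // 2, int(rng.integers(1, 5))
    pop[:half, cut:] = pop[half:, cut:]            # one-point crossover
    pop[rng.random(pop.shape) < 0.05] ^= 1         # bit-flip mutation

best = max({tuple(int(b) for b in m) for m in pop}, key=lambda m: fitness(np.array(m)))
print("dominant failure mode:", best, "probability:", fitness(np.array(best)))
```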

  4. Identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis by using the Delphi Technique

    Science.gov (United States)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Ng, E. G.

    2018-02-01

    This paper explains the process carried out in identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis. The research was initially part of a larger research exercise to identify the significance of NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of NDCDB from the technical relevance standpoint. Three statements describing the relevant features of NDCDB for spatial analysis were established after three rounds of consensus building. They highlighted the NDCDB’s characteristics, such as its spatial accuracy, functions, and criteria as a facilitating tool for spatial analysis. By recognising the relevant features of NDCDB for spatial analysis in this study, practical applications of NDCDB for various analyses and purposes can be widely implemented.

  5. A Visual Analytics Technique for Identifying Heat Spots in Transportation Networks

    Directory of Open Access Journals (Sweden)

    Marian Sorin Nistor

    2016-12-01

    The decision makers of the public transportation system, as part of urban critical infrastructure, need to increase the system's resilience. To do so, we identified analysis tools for biological networks as an adequate basis for visual analytics in that domain. In the paper at hand we therefore translate such methods for transportation systems and show the benefits by applying them to the Munich subway network. Here, visual analytics is used to identify vulnerable stations from different perspectives. The applied technique is presented step by step. Furthermore, the key challenges in applying this technique to transportation systems are identified. Finally, we propose the implementation of the presented features in a management cockpit to integrate the visual analytics mantra for adequate decision support on transportation systems.
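
    One plausible reading of the transferred biological-network metrics is centrality-based ranking of stations; the sketch below uses betweenness centrality on a toy graph (the edges are invented, not the real Munich subway topology):

```python
# Rank candidate "heat spots" by betweenness centrality, a standard metric
# in biological network analysis. Station names and edges are placeholders.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Hauptbahnhof", "Sendlinger Tor"), ("Sendlinger Tor", "Odeonsplatz"),
    ("Odeonsplatz", "Marienplatz"), ("Marienplatz", "Hauptbahnhof"),
    ("Sendlinger Tor", "Goetheplatz"),
])

centrality = nx.betweenness_centrality(G)
for station, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{station:15s} {score:.3f}")   # high scores = vulnerable candidates
```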

  6. A technique to identify some typical radio frequency interference using support vector machine

    Science.gov (United States)

    Wang, Yuanchao; Li, Mingtao; Li, Dawei; Zheng, Jianhua

    2017-07-01

    In this paper, we present a technique to automatically identify some typical types of radio frequency interference in pulsar surveys using a support vector machine (SVM). The technique has been tested on survey candidates. In these experiments, principal component analysis is used to extract SVM features for mosaic plots, yielding a classification accuracy of 96.9%, while mathematical morphology operations are used for smog plots and horizontal-stripe plots, yielding a classification accuracy of 86%. The technique is simple, highly accurate and useful.
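
    A hedged sketch of such a pipeline is shown below, with random arrays standing in for real candidate plots; the data shapes, labels, and parameters are assumptions, and the mathematical-morphology branch is omitted:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64 * 64))   # 200 flattened mosaic plots (assumed shape)
y = rng.integers(0, 2, size=200)      # 1 = RFI, 0 = clean (assumed labels)

# PCA features feeding an RBF-kernel SVM, as described for the mosaic plots
model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
model.fit(X[:150], y[:150])
print("held-out accuracy:", model.score(X[150:], y[150:]))
```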

  7. MALDI-TOF and SELDI-TOF analysis: “tandem” techniques to identify potential biomarker in fibromyalgia

    Directory of Open Access Journals (Sweden)

    A. Lucacchini

    2011-11-01

    Fibromyalgia (FM) is characterized by the presence of chronic widespread pain throughout the musculoskeletal system and diffuse tenderness. Unfortunately, no laboratory tests have been appropriately validated for FM or correlated with its subsets and activity. The aim of this study was to apply a proteomic technique to the saliva of FM patients: Surface-Enhanced Laser Desorption/Ionization Time-of-Flight (SELDI-TOF). For this study, 57 FM patients and 35 healthy control (HC) subjects were enrolled. The proteomic analysis of saliva was carried out using SELDI-TOF. The analysis was performed using different chip arrays with different binding characteristics. The statistical analysis was performed using cluster analysis, and the difference between the two groups was underlined using Student’s t-test. Spectra analysis highlighted the presence of several peaks differently expressed in FM patients compared with controls. The preliminary results obtained by SELDI-TOF analysis were compared with those obtained in our previous study performed on whole saliva of FM patients using electrophoresis. The m/z values of two peaks, increased in FM patients, seem to overlap well with the molecular weights of calgranulin A and C and Rho GDP-dissociation inhibitor 2, which we had found up-regulated in our previous study. These preliminary results show the possibility of identifying potential salivary biomarkers through salivary proteomic analysis with MALDI-TOF and SELDI-TOF in FM patients. The peaks observed allow us to focus on some particular pathogenic aspects of FM: the oxidative stress that characterizes this condition, the involvement of proteins related to cytoskeletal arrangements, and central sensitization.

  8. Transverse vibration technique to identify deteriorated wood floor systems

    Science.gov (United States)

    R.J. Ross; X. Wang; M.O. Hunt; L.A. Soltis

    2002-01-01

    The Forest Products Laboratory, USDA Forest Service, has been developing nondestructive evaluation (NDE) techniques to identify degradation of wood in structures and the performance characteristics that remain in the structure. This work has focused on using dynamic testing techniques, particularly stress wave and ultrasonic transmission NDE techniques for both...

  9. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of time series analysis techniques to nuclear material accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data form a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach, which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart control chart, the cumulative sum of inventory differences statistic (CUSUM), and the Kalman filter and linear smoother.
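
    For instance, a one-sided tabular CUSUM on simulated inventory differences flags a sustained small loss that single-period tests tend to miss (the parameter values below are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
ids = rng.normal(0.0, 1.0, 30)       # inventory differences, in sigma units
ids[15:] += 1.5                      # simulated protracted loss from period 15

k, h = 0.5, 4.0                      # reference value and decision interval
s_hi = 0.0
for t, x in enumerate(ids):
    s_hi = max(0.0, s_hi + x - k)    # one-sided upper CUSUM recursion
    if s_hi > h:
        print(f"alarm at period {t}, CUSUM = {s_hi:.2f}")
        break
```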

  10. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guilford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques and the remaining 22 as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and interpersonal style.

  11. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria are considered in the inspection: physical condition of the building system (PC), effect on asset (EA), effect on occupants (EO) and maintenance cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used in comparing the importance of the criteria against the main objective, whereas a structured proforma was used in quantifying the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the criteria weights, and the final hierarchy was derived, as sketched in the example below. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning.
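
    The scoring step described above can be sketched as a weighted sum of defect severity scores per criterion; the weights and scores below are invented, not those elicited in the study:

```python
# Multiply each system's per-criterion severity by the criterion weight and sum.
weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}   # assumed weights

systems = {
    "electrical": {"PC": 0.5, "EA": 0.3, "EO": 0.2, "MC": 0.1},
    "ceiling":    {"PC": 0.2, "EA": 0.1, "EO": 0.1, "MC": 0.2},
}

risk = {name: sum(weights[c] * s[c] for c in weights) for name, s in systems.items()}
for name, value in sorted(risk.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} risk = {value:.3f}")   # higher = more critical for maintenance
```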

  12. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified through searches of various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  13. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish the analysis technique for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU nuclear power plants (NPPs) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and noise databases corresponding to each plant (both Korean and foreign) were constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system for diagnosing an NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (internal vibration monitoring system). (author)

  14. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish the analysis technique for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU nuclear power plants (NPPs) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and noise databases corresponding to each plant (both Korean and foreign) were constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system for diagnosing an NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (internal vibration monitoring system). (author)

  15. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures that represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper knowledge of why certain KPI targets are not met.
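
    A minimal sketch of the dependency-tree idea uses a regression tree over lower-level metrics; the metric names and data are assumptions for illustration, not the framework's implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
metrics = ["service_latency", "queue_length", "error_rate"]   # assumed metrics
X = rng.normal(size=(500, 3))
kpi = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Fit a shallow tree of the KPI on process/QoS metrics, then read off influence
tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
for name, imp in zip(metrics, tree.feature_importances_):
    print(f"{name:16s} importance = {imp:.2f}")
```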

  16. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    Science.gov (United States)

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

    Clustering on social learning networks is still not widely explored, especially when the network focuses on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…

  17. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software-intensive system is always a challenge. Software safety practitioners need to ensure that software-related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  18. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on the differences between fields of activity, but especially on the differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such…

  19. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA)

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  20. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples identified (traditional, transitional, and egalitarian), but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried, and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. The analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding using MAXQDA 2007's tool called TextPortraits.

  1. Use of Photogrammetry and Biomechanical Gait analysis to Identify Individuals

    DEFF Research Database (Denmark)

    Larsen, Peter Kastmand; Simonsen, Erik Bruun; Lynnerup, Niels

    Photogrammetry and recognition of gait patterns are valuable tools to help identify perpetrators based on surveillance recordings. We have found that stature, but only a few other measures, has satisfactory reproducibility for use in forensics. Several gait variables with high recognition rates were found. The variables located in the frontal plane are especially interesting due to large inter-individual differences in time-course patterns. The variables with high recognition rates seem preferable for use in forensic gait analysis and as input variables to waveform analysis techniques...

  2. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
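
    A compact sketch of the three steps on a synthetic evaluation matrix follows (shapes and models are assumptions; the BSM2 simulation data are not used):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 6))          # 40 control strategies x 6 criteria (toy)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)   # CA
pca = PCA(n_components=2).fit(X)                                          # PCA/FA
lda = LinearDiscriminantAnalysis().fit(X, labels)                         # DA
print("cluster sizes:", np.bincount(labels))
print("variance explained by 2 PCs:", round(pca.explained_variance_ratio_.sum(), 2))
print("discriminant coefficients:\n", lda.coef_)
```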

  3. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    Science.gov (United States)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
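
    The preconditioning idea can be sketched with SciPy, using a diagonal stand-in for the orthotropic part of the stiffness matrix; the matrices below are random SPD stand-ins, not a structural model:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(4)
A = rng.normal(size=(50, 50))
K = A @ A.T + 50 * np.eye(50)            # SPD "stiffness" matrix (toy)
f = rng.normal(size=50)

M_diag = np.diag(K)                      # stand-in for the orthotropic part
M = LinearOperator((50, 50), matvec=lambda x: x / M_diag)

u, info = cg(K, f, M=M)                  # preconditioned conjugate gradients
print("info:", info, "residual:", np.linalg.norm(K @ u - f))
```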

  4. Performance analysis of clustering techniques over microarray data: A case study

    Science.gov (United States)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. But the grading approach depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm-based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also established by the Nemenyi post-hoc hypothesis test.
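
    One grading step, scoring candidate techniques with a validity index on the same data, might look as follows (only two of the five techniques are sketched, on synthetic rather than microarray data):

```python
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=5)
candidates = {
    "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
    "AGNES":   AgglomerativeClustering(n_clusters=4),
}
for name, model in candidates.items():
    labels = model.fit_predict(X)
    print(f"{name:8s} silhouette = {silhouette_score(X, labels):.3f}")
```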

  5. Application of gene network analysis techniques identifies AXIN1/PDIA2 and endoglin haplotypes associated with bicuspid aortic valve.

    Directory of Open Access Journals (Sweden)

    Eric C Wooten

    2010-01-01

    Bicuspid aortic valve (BAV) is a highly heritable congenital heart defect. The low frequency of BAV (1% of the general population) limits our ability to perform genome-wide association studies. We present the application of four a priori SNP selection techniques, reducing the multiple-testing penalty by restricting analysis to SNPs relevant to BAV in a genome-wide SNP dataset from a cohort of 68 BAV probands and 830 control subjects. Two knowledge-based approaches, CANDID and STRING, were used to systematically identify BAV genes, and their SNPs, from the published literature, microarray expression studies and a genome scan. We additionally tested Functionally Interpolating SNPs (fitSNPs) present on the array; the fourth approach consisted of SNPs selected by Random Forests, a machine learning approach. These approaches reduced the multiple-testing penalty by lowering the fraction of the genome probed to 0.19% of the total, while increasing the likelihood of studying SNPs within relevant BAV genes and pathways. Three loci were identified by CANDID, STRING, and fitSNPs. A haplotype within the AXIN1-PDIA2 locus (p-value of 2.926x10^-6) and a haplotype within the endoglin gene (p-value of 5.881x10^-4) were found to be strongly associated with BAV. The Random Forests approach identified a SNP on chromosome 3 in association with BAV (p-value 5.061x10^-6). The results presented here support an important role for genetic variants in BAV and provide support for additional studies in well-powered cohorts. Further, these studies demonstrate that leveraging existing expression and genomic data in the context of GWAS studies can identify biologically relevant genes and pathways associated with a congenital heart defect.

  6. Three novel approaches to structural identifiability analysis in mixed-effects models.

    Science.gov (United States)

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible.
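
    A textbook-style illustration of the underlying concept (not taken from the paper): in the model dx/dt = -(k1 + k2)x with output y = x, only the sum k1 + k2 is structurally identifiable, which a transfer-function check makes explicit:

```python
import sympy as sp

k1, k2, s = sp.symbols("k1 k2 s", positive=True)
H = 1 / (s + k1 + k2)        # transfer function of dx/dt = -(k1 + k2) x, y = x

# Two different parameter pairs with the same sum give an identical transfer
# function, so input-output data cannot distinguish them:
print(sp.simplify(H.subs({k1: 1, k2: 3}) - H.subs({k1: 2, k2: 2})))   # 0
```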

  7. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty.

  8. Identifying irradiated flour by photo-stimulated luminescence technique

    International Nuclear Information System (INIS)

    Ros Anita Ahmad Ramli; Muhammad Samudi Yasir; Zainon Othman; Wan Saffiey Wan Abdullah

    2013-01-01

    The photo-stimulated luminescence (PSL) technique is recommended by the European Committee for Standardization for the detection of food irradiation (EN 13751:2009). This study applied the luminescence technique to identify gamma-irradiated samples of five types of flour (corn flour, tapioca flour, wheat flour, glutinous rice flour and rice flour) at three different dose levels in the range 0.2 - 1 kGy. The signal level is compared with two thresholds (700 and 5000 counts/60 s). The majority of irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All the control samples gave negative screening results, while signals below the lower threshold (700 counts/60 s) suggest that the sample has not been irradiated. A few samples showed signal levels between the two thresholds (intermediate signals), suggesting that further investigation is required. The reported procedure was also tested over 60 days, confirming the applicability and feasibility of the proposed methods. (author)
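
    The two-threshold screening rule quoted above translates directly into code (the thresholds are those stated in the abstract; the function name is ours):

```python
def screen(counts_per_60s, lower=700, upper=5000):
    """Classify a PSL measurement against the EN 13751 style thresholds."""
    if counts_per_60s > upper:
        return "irradiated"
    if counts_per_60s < lower:
        return "not irradiated"
    return "intermediate - further investigation required"

print(screen(12000), "|", screen(300), "|", screen(2100))
```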

  9. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
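
    The temperature-estimation step can be illustrated with a Boltzmann plot, where the slope of ln(I*lambda/(g*A)) versus upper-level energy is -1/(k*Te); the line constants below are invented, and the LPO sub-pixel peak fitting is omitted:

```python
import numpy as np

k_B = 8.617e-5                            # Boltzmann constant, eV/K
E_k = np.array([3.0, 3.5, 4.1, 4.8])      # upper-level energies, eV (assumed)
g_A = np.array([2e8, 3e8, 1.5e8, 2.5e8])  # g*A products, 1/s (assumed)
lam = np.array([510.0, 480.0, 460.0, 430.0])  # wavelengths, nm (assumed)
T_true = 11000.0
I = g_A / lam * np.exp(-E_k / (k_B * T_true))   # synthetic peak intensities

y = np.log(I * lam / g_A)                 # Boltzmann plot ordinate
slope, _ = np.polyfit(E_k, y, 1)          # slope = -1 / (k_B * T_e)
print(f"estimated T_e = {-1 / (k_B * slope):.0f} K")
```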

  10. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% system-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, the result of the Value Analysis is a design that performs the necessary functions at high quality and the lowest overall cost.

  11. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  12. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (failure mode and effects analysis), software HAZOP (hazard and operability analysis), and software FTA (fault tree analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  13. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (failure mode and effects analysis), software HAZOP (hazard and operability analysis), and software FTA (fault tree analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.
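
    The quantitative side of software FTA reduces, under an independence assumption, to simple gate algebra; the sketch below uses invented event probabilities, and note that the common-mode failures discussed above violate exactly this independence assumption:

```python
def AND(*ps):                 # all basic events must occur (independent events)
    out = 1.0
    for p in ps:
        out *= p
    return out

def OR(*ps):                  # at least one event occurs: 1 - prod(1 - p)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

sensor_fail, cpu_fail, sw_defect = 1e-3, 5e-4, 2e-3   # illustrative probabilities
channel = OR(sensor_fail, cpu_fail, sw_defect)        # one protection channel
top_event = AND(channel, channel)                     # both redundant channels fail
print(f"channel = {channel:.2e}, top event = {top_event:.2e}")
```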

  14. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they can be applied.

  15. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis

    NARCIS (Netherlands)

    Genugten, L. van; Dusseldorp, E.; Massey, E.K.; Empelen, P. van

    2017-01-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known about the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary…

  16. Motor current and leakage flux signature analysis technique for condition monitoring

    International Nuclear Information System (INIS)

    Pillai, M.V.; Moorthy, R.I.K.; Mahajan, S.C.

    1994-01-01

    Until recently, analysis of vibration signals was the only means available to predict the state of health of plant equipment. Motor current and leakage magnetic flux signature analysis is acquiring importance as a technique for detecting incipient damage in electrical machines and as a supplementary technique for diagnostics of driven equipment such as centrifugal and reciprocating pumps. The state of health of the driven equipment is assessed by analysing the time signal, the frequency spectrum and trends. For example, the pump vane frequency, piston stroke frequency, gear frequency and bearing frequencies are indicated in the current and flux spectra. By maintaining a periodic record of the amplitudes of various frequency lines in the frequency spectra, it is possible to follow the trend of deterioration of parts and components of the pump. All problems arising out of inappropriate mechanical alignment of vertical pumps are easily identified by a combined analysis of current, flux and vibration signals. It is found that the current signature analysis technique is a sufficient method in itself for analysing the state of health of reciprocating pumps and compressors. (author). 10 refs., 4 figs
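
    The spectral part of the technique amounts to tracking the amplitudes of known frequency lines over time; below is a toy FFT example with a simulated current signal (the 145 Hz "fault" line is invented):

```python
import numpy as np

fs, dur = 5000.0, 2.0                          # sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)
current = np.sin(2 * np.pi * 50 * t)           # 50 Hz supply component
current += 0.02 * np.sin(2 * np.pi * 145 * t)  # small fault-related line (toy)

spec = np.abs(np.fft.rfft(current)) / len(t)   # amplitude ~ A/2 at each line
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f0 in (50.0, 145.0):
    idx = int(np.argmin(np.abs(freqs - f0)))
    print(f"{f0:6.1f} Hz amplitude = {spec[idx]:.4f}")   # trend these over time
```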

  17. Identifying the sources of produced water in the oil field by isotopic techniques

    International Nuclear Information System (INIS)

    Nguyen Minh Quy; Hoang Long; Le Thi Thu Huong; Luong Van Huan; Vo Thi Tuong Hanh

    2014-01-01

    The objective of this study is to identify the sources of the formation water in the Southwest Su-Tu-Den (STD SW) basement reservoir. To achieve the objective, isotopic techniques along with geochemical analysis of chloride, bromide and strontium dissolved in the water were applied. The isotopic techniques used in this study were the determination of the stable isotope signatures of water (δ2H and δ18O) and of the 87Sr/86Sr ratio of strontium in rock cutting samples and of strontium dissolved in the formation water. The obtained results showed that the stable isotope compositions of water in the Lower Miocene were -3‰ and -23‰ for δ18O and δ2H, respectively, indicating the primeval nature of seawater in the reservoir. Meanwhile, the isotopic composition of water in the basement was clustered in a range of alternated freshwater, with δ18O and δ2H being -(3-4)‰ and -(54-60)‰, respectively. The strontium isotope ratio for water in the Lower Miocene reservoir was lower than that for water in the basement, confirming the different natures of the water in the two reservoirs. The obtained results assure the applicability of the techniques, and it is recommended that studies on identification of the flow path of the formation water in the STD SW basement reservoir be continued. (author)

  18. Identifying the "Right Stuff": An Exploration-Focused Astronaut Job Analysis

    Science.gov (United States)

    Barrett, J. D.; Holland, A. W.; Vessey, W. B.

    2015-01-01

    Industrial and organizational (I/O) psychologists play a key role in NASA astronaut candidate selection through the identification of the competencies necessary to successfully engage in the astronaut job. A set of psychosocial competencies, developed by I/O psychologists during a prior job analysis conducted in 1996 and updated in 2003, was identified as necessary for individuals working and living in the space shuttle and on the International Space Station (ISS). This set of competencies applied to the space shuttle and applies to current ISS missions, but may not apply to longer-duration or long-distance exploration missions. With the 2015 launch of the first 12-month ISS mission and the shift in the 2020s to missions beyond low Earth orbit, the type of missions that astronauts will conduct and the environment in which they do their work will change dramatically, leading to new challenges for these crews. To support future astronaut selection, training, and research, I/O psychologists in NASA's Behavioral Health and Performance (BHP) Operations and Research groups engaged in a joint effort to conduct an updated analysis of the astronaut job for current and future operations. This project will result in the identification of behavioral competencies critical to performing the astronaut job, along with relative weights for each of the identified competencies, through the application of job analysis techniques. While this job analysis is being conducted according to job analysis best practices, the project poses a number of novel challenges. These challenges include the need to identify competencies for multiple mission types simultaneously, to evaluate jobs that have no incumbents as they have never before been conducted, and to work with a very limited population of subject matter experts. Given these challenges, under the guidance of job analysis experts, we used the following methods to conduct the job analysis and identify the key competencies for current and future operations.

  19. Adhesive polypeptides of Staphylococcus aureus identified using a novel secretion library technique in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Holm Liisa

    2011-05-01

    Background: Bacterial adhesive proteins, called adhesins, are frequently the decisive factor in the initiation of a bacterial infection. Characterization of such molecules is crucial for the understanding of bacterial pathogenesis, the design of vaccines and the development of antibacterial drugs. Because adhesins are frequently difficult to express, their characterization has often been hampered. Alternative expression methods developed for the analysis of adhesins, e.g. surface display techniques, suffer from various drawbacks, and reports on high-level extracellular secretion of heterologous proteins in Gram-negative bacteria are scarce. These expression techniques are currently a field of active research. The purpose of the current study was to construct a convenient, new technique for identification of unknown bacterial adhesive polypeptides directly from the growth medium of the Escherichia coli host, and to identify novel proteinaceous adhesins of the model organism Staphylococcus aureus. Results: Randomly fragmented chromosomal DNA of S. aureus was cloned into a unique restriction site of our expression vector, which facilitates secretion of foreign FLAG-tagged polypeptides into the growth medium of E. coli ΔfliCΔfliD, to generate a library of 1663 clones expressing FLAG-tagged polypeptides. Sequence and bioinformatics analyses showed that in our example the library covered approximately 32% of the S. aureus proteome. Polypeptides from the growth medium of the library clones were screened for binding to a selection of S. aureus target molecules, and adhesive fragments of known staphylococcal adhesins (e.g. coagulase and fibronectin-binding protein A) as well as polypeptides of novel function (e.g. a universal stress protein and the phosphoribosylamino-imidazole carboxylase ATPase subunit) were detected. The results were further validated using purified His-tagged recombinant proteins of the corresponding fragments in enzyme-linked immunoassay and…

  20. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate databases.

  1. An automated technique to identify potential inappropriate traditional Chinese medicine (TCM) prescriptions.

    Science.gov (United States)

    Yang, Hsuan-Chia; Iqbal, Usman; Nguyen, Phung Anh; Lin, Shen-Hsien; Huang, Chih-Wei; Jian, Wen-Shan; Li, Yu-Chuan

    2016-04-01

    Medication errors such as potentially inappropriate prescriptions can induce serious adverse drug events for patients. Information technology has the ability to prevent medication errors; however, the pharmacology of traditional Chinese medicine (TCM) is not as clear as in western medicine. The aim of this study was to apply the appropriateness of prescription (AOP) model to identify potentially inappropriate TCM prescriptions. We used association rule mining techniques to analyze 14.5 million prescriptions from the Taiwan National Health Insurance Research Database. The disease-TCM (DTCM) and TCM-TCM (TCMM) associations were computed from their co-occurrence, and the associations' strength was measured as Q-values, often referred to as interestingness or lift values. By considering the number of Q-values, the AOP model was applied to identify the inappropriate prescriptions. Afterwards, three traditional Chinese physicians evaluated 1920 prescriptions and validated the outcomes detected by the AOP model. Out of 1920 prescriptions, the system showed a positive predictive value of 97.1% and a negative predictive value of 19.5% compared with the experts' evaluations. The sensitivity analysis indicated that the negative predictive value could improve to 27.5% when the model's threshold was changed to 0.4. We successfully applied the AOP model to automatically identify potentially inappropriate TCM prescriptions. This model could be a potential TCM clinical decision support system to improve drug safety and quality of care. Copyright © 2016 John Wiley & Sons, Ltd.
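
    The co-occurrence association score can be sketched as a lift computation over prescriptions (the data below are invented, and the study's exact Q-value definition may differ in detail):

```python
prescriptions = [
    {"disease": "insomnia", "herbs": {"suan_zao_ren", "gan_cao"}},
    {"disease": "insomnia", "herbs": {"suan_zao_ren"}},
    {"disease": "cough",    "herbs": {"gan_cao"}},
    {"disease": "cough",    "herbs": {"jie_geng", "gan_cao"}},
]

def lift(disease, herb, data):
    """Lift of a disease-herb (DTCM) pair from co-occurrence counts."""
    n = len(data)
    p_d = sum(p["disease"] == disease for p in data) / n
    p_h = sum(herb in p["herbs"] for p in data) / n
    p_dh = sum(p["disease"] == disease and herb in p["herbs"] for p in data) / n
    return p_dh / (p_d * p_h)     # > 1 suggests a plausible pairing

print(lift("insomnia", "suan_zao_ren", prescriptions))   # 2.0 on this toy data
```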

  2. Using Data-Driven and Process Mining Techniques for Identifying and Characterizing Problem Gamblers in New Zealand

    Directory of Open Access Journals (Sweden)

    Suriadi Suriadi

    2016-12-01

    This article uses data-driven techniques combined with established theory to analyse the gambling behaviour patterns of 91 thousand individuals in a real-world fixed-odds gambling dataset from New Zealand. This research uniquely integrates a mixture of process mining, data mining and confirmatory statistical techniques to categorise different sub-groups of gamblers, with the explicit motivation of identifying problem gambling behaviours and reporting on the challenges and lessons learned from our case study. We demonstrate how techniques from various disciplines can be combined to gain insight into the behavioural patterns exhibited by different types of gamblers, as well as to provide assurances of the correctness of our approach and findings. A highlight of this case study is both the methodology, which demonstrates how such a combination of techniques provides a rich set of effective tools to undertake an exploratory and open-ended data analysis project guided by the process cube concept, and the findings themselves, which indicate that the contribution that problem gamblers make to the total volume, expenditure, and revenue is higher than previous studies have maintained.

  3. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  4. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    Science.gov (United States)

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  5. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, firstly, to apply two functional analysis techniques to the design of supervisory systems for complex processes and, secondly, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to a process: an example Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of applying functional analysis to the design of a 'human-centered' supervisory system. Then the basic principles of the two techniques as applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed

  6. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  7. Identifying subgroups of patients using latent class analysis

    DEFF Research Database (Denmark)

    Nielsen, Anne Mølgaard; Kent, Peter; Hestbæk, Lise

    2017-01-01

    BACKGROUND: Heterogeneity in patients with low back pain (LBP) is well recognised and different approaches to subgrouping have been proposed. Latent Class Analysis (LCA) is a statistical technique that is increasingly being used to identify subgroups based on patient characteristics. However......, as LBP is a complex multi-domain condition, the optimal approach when using LCA is unknown. Therefore, this paper describes the exploration of two approaches to LCA that may help improve the identification of clinically relevant and interpretable LBP subgroups. METHODS: From 928 LBP patients consulting...... of statistical performance measures, qualitative evaluation of clinical interpretability (face validity) and a subgroup membership comparison. RESULTS: For the single-stage LCA, a model solution with seven patient subgroups was preferred, and for the two-stage LCA, a nine patient subgroup model. Both approaches...
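
    Latent class analysis proper models categorical indicators, but the subgrouping workflow described above (fit models with increasing numbers of classes, compare statistical performance measures, then inspect subgroup membership) can be sketched with a Gaussian mixture as an illustrative stand-in; the patient data are synthetic:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Two synthetic "patient characteristic" subgroups; real LBP data would
    # have many more domains and categorical indicators.
    rng = np.random.default_rng(0)
    X = np.vstack([
        rng.normal(loc=[2.0, 1.0], scale=0.5, size=(100, 2)),  # subgroup A
        rng.normal(loc=[6.0, 4.0], scale=0.7, size=(100, 2)),  # subgroup B
    ])

    # Model selection by BIC, mirroring the statistical performance measures
    # mentioned in the abstract.
    models = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
              for k in range(1, 6)}
    best_k = min(models, key=lambda k: models[k].bic(X))
    labels = models[best_k].predict(X)
    print(best_k, np.bincount(labels))  # expected: 2 classes of ~100 each
    ```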

  8. Testing and evaluation of existing techniques for identifying uptakes and measuring retention of uranium in mill workers

    International Nuclear Information System (INIS)

    1983-03-01

    Preliminary tests and evaluations of existing bio-analytical techniques for identifying uptakes and measuring retention of uranium in mill workers were made at two uranium mills. Urinalysis tests were found to be more reliable indicators of uranium uptakes than personal air sampling. Static air samples were not found to be good indicators of personal uptakes. In vivo measurements of uranium in lung were successfully carried out in the presence of high and fluctuating background radiation. Interference from external contamination was common during end of shift measurements. A full scale study to evaluate model parameters for the uptake, retention and elimination of uranium should include, in addition to the above techniques, particle size determination of airborne uranium, solubility in simulated lung fluid, uranium analysis in faeces and bone and minute volume measurements for each subject

  9. Kinematic and kinetic analysis of overhand, sidearm and underhand lacrosse shot techniques.

    Science.gov (United States)

    Macaulay, Charles A J; Katz, Larry; Stergiou, Pro; Stefanyshyn, Darren; Tomaghelli, Luciano

    2017-12-01

    Lacrosse requires the coordinated performance of many complex skills. One of these skills is shooting on the opponents' net using one of three techniques: overhand, sidearm or underhand. The purpose of this study was to (i) determine which technique generated the highest ball velocity and greatest shot accuracy and (ii) identify kinematic and kinetic variables that contribute to a high velocity and high accuracy shot. Twelve elite male lacrosse players participated in this study. Kinematic data were sampled at 250 Hz, while two-dimensional force plates collected ground reaction force data (1000 Hz). Statistical analysis showed significantly greater ball velocity for the sidearm technique than overhand (P < 0.05). Kinematic and kinetic variables were not significantly correlated to shot accuracy or velocity across all shot types; however, when analysed independently, the lead foot horizontal impulse showed a negative correlation with underhand ball velocity (P = 0.042). This study identifies the technique with the highest ball velocity, defines kinematic and kinetic predictors related to ball velocity and provides information to coaches and athletes concerned with improving lacrosse shot performance.

  10. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and gamma radiations, measurements are obtained directly from a large volume of sample (3-30 kg). Gamma-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content in shredded sugar cane. (U.K.)

  11. Analysis of hairy root culture of Rauvolfia serpentina using direct analysis in real time mass spectrometric technique.

    Science.gov (United States)

    Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K

    2008-06-01

    The applicability of a new mass spectrometric technique, DART (direct analysis in real time), has been studied in the analysis of the hairy root culture of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. Confirmation of the structures of the identified compounds was made through accurate molecular formula determinations. This is the first report of the application of the DART technique for the characterization of compounds expressed in hairy root cultures of Rauvolfia serpentina. Moreover, it also constitutes the first report of the expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.

  12. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  13. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of the different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  14. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system for smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases security risk. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze existing threats and security weaknesses. We then identify various mitigation techniques for known vulnerabilities. A detailed analysis helps to identify existing loopholes and gives strategic direction for making the Android operating system more secure.

  15. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 {mu}m particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO, to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  16. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 {mu}m particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO, to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  17. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques in environmental sciences are presented and reviewed

  18. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
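
    The detection-performance quantification mentioned above (probability of detection versus probability of false alarm) can be sketched as follows; the score distributions for nominal and faulty machinery are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    healthy = rng.normal(0.0, 1.0, 500)  # detector scores, nominal machinery
    faulty = rng.normal(2.0, 1.0, 500)   # detector scores, degraded machinery

    # Sweep a decision threshold and trace POD against PFA.
    thresholds = np.linspace(-4.0, 6.0, 201)
    pod = np.array([(faulty > t).mean() for t in thresholds])
    pfa = np.array([(healthy > t).mean() for t in thresholds])

    # Report the operating point closest to a 5% false-alarm budget.
    i = int(np.argmin(np.abs(pfa - 0.05)))
    print(f"threshold={thresholds[i]:.2f}  PFA={pfa[i]:.3f}  POD={pod[i]:.3f}")
    ```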

  19. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operations of the movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude and angle-demodulation circuitry has permitted remote status monitoring of several types of medium and high-power gas compressors in US DOE facilities, driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized
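
    A minimal sketch of amplitude demodulation applied to a motor current, in the spirit of (though not identical to) the CSA methods described above; the line frequency, load-modulation frequency and modulation depth are invented:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 10_000                                          # sample rate, Hz
    t = np.arange(0, 1.0, 1.0 / fs)
    carrier = np.sin(2 * np.pi * 60 * t)                 # 60 Hz line current
    modulation = 1.0 + 0.1 * np.sin(2 * np.pi * 7 * t)   # 7 Hz load variation
    current = modulation * carrier

    # Amplitude demodulation: the analytic-signal envelope recovers the slow
    # load modulation riding on the line frequency.
    envelope = np.abs(hilbert(current))
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(len(envelope), 1.0 / fs)
    print(f"dominant modulation: {freqs[spectrum.argmax()]:.1f} Hz")  # ~7 Hz
    ```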

  20. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, the subjective probability distributions assigned to the input parameters and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
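
    One simple sampling-based measure of the kind reviewed above is a rank correlation between each sampled input parameter and the computed dose; the three parameters and the toy dose model below are invented stand-ins, not SYVAC's actual variables:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)
    n = 1000
    params = {
        "leach_rate":  rng.lognormal(0.0, 0.5, n),
        "retardation": rng.uniform(1.0, 100.0, n),
        "path_length": rng.uniform(10.0, 500.0, n),
    }
    # Invented stand-in for the dose model.
    dose = params["leach_rate"] / (params["retardation"] * params["path_length"])

    # Spearman rank correlation is robust to the nonlinear, monotonic
    # relations typical of such models.
    for name, values in params.items():
        rho, p = spearmanr(values, dose)
        print(f"{name:12s} rho={rho:+.2f}  p={p:.1e}")
    ```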

  1. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques ranging over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4°K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  2. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, such as polyphenols (including flavonoids), pigments, vitamins and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Also, methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are discussed. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are covered. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  3. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely: segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  4. Messina: a novel analysis tool to identify biologically relevant molecules in disease.

    Directory of Open Access Journals (Sweden)

    Mark Pinese

    Full Text Available BACKGROUND: Morphologically similar cancers display heterogeneous patterns of molecular aberrations and follow substantially different clinical courses. This diversity has become the basis for the definition of molecular phenotypes, with significant implications for therapy. Microarray or proteomic expression profiling is conventionally employed to identify disease-associated genes, however, traditional approaches for the analysis of profiling experiments may miss molecular aberrations which define biologically relevant subtypes. METHODOLOGY/PRINCIPAL FINDINGS: Here we present Messina, a method that can identify those genes that only sometimes show aberrant expression in cancer. We demonstrate with simulated data that Messina is highly sensitive and specific when used to identify genes which are aberrantly expressed in only a proportion of cancers, and compare Messina to contemporary analysis techniques. We illustrate Messina by using it to detect the aberrant expression of a gene that may play an important role in pancreatic cancer. CONCLUSIONS/SIGNIFICANCE: Messina allows the detection of genes with profiles typical of markers of molecular subtype, and complements existing methods to assist the identification of such markers. Messina is applicable to any global expression profiling data, and to allow its easy application has been packaged into a freely-available stand-alone software package.
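
    Messina's actual algorithm is not reproduced here, but the core idea, flagging genes aberrantly expressed in only a proportion of cancers (which mean-based tests tend to miss), can be sketched on simulated expression data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_genes, n_normal, n_tumour = 1000, 30, 30
    normal = rng.normal(0.0, 1.0, (n_genes, n_normal))
    tumour = rng.normal(0.0, 1.0, (n_genes, n_tumour))
    tumour[0, :10] += 5.0  # gene 0: aberrant in only a third of tumours

    # Per-gene upper bound of the normal expression range.
    upper = normal.mean(axis=1) + 3.0 * normal.std(axis=1)

    # Keep genes aberrant in at least 20% of tumours but in no normal sample.
    frac_aberrant = (tumour > upper[:, None]).mean(axis=1)
    never_in_normal = (normal > upper[:, None]).sum(axis=1) == 0
    hits = np.where((frac_aberrant >= 0.2) & never_in_normal)[0]
    print(hits)  # typically recovers gene 0 only
    ```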

  5. Rate transient analysis for homogeneous and heterogeneous gas reservoirs using the TDS technique

    International Nuclear Information System (INIS)

    Escobar, Freddy Humberto; Sanchez, Jairo Andres; Cantillo, Jose Humberto

    2008-01-01

    In this study, pressure test analysis of wells flowing under constant wellbore pressure, for homogeneous and naturally fractured gas reservoirs, using the TDS technique is introduced. Although constant rate production is assumed in the development of the conventional well test analysis methods, constant pressure production conditions are sometimes used in the oil and gas industry. The constant pressure technique, or rate transient analysis, is more popularly known as decline curve analysis, under which the rate is allowed to decline instead of the wellbore pressure. The TDS technique, nowadays used ever more widely (even in the most recognized software packages, although without its trade name), uses the log-log plot of pressure and pressure derivative test data to identify unique features from which exact analytical expressions are derived to easily estimate reservoir and well parameters. For this case, the fingerprint characteristics from the log-log plot of the reciprocal rate and reciprocal rate derivative were employed to obtain the analytical expressions used for the interpretation analysis. Many simulation experiments demonstrate the accuracy of the new method. Synthetic examples are shown to verify the effectiveness of the proposed methodology.
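
    A sketch of the diagnostic quantities behind such log-log fingerprints, namely the reciprocal rate and its logarithmic derivative t*d(1/q)/dt; the rate-decline function is invented, and the exact TDS expressions are not reproduced here:

    ```python
    import numpy as np

    t = np.logspace(-2, 2, 200)     # time, hr
    q = 1000.0 / np.sqrt(t + 1.0)   # invented rate decline, constant pressure

    recip = 1.0 / q
    deriv = t * np.gradient(recip, t)   # logarithmic derivative t*d(1/q)/dt

    # On a log-log plot, characteristic slopes of the derivative identify flow
    # regimes; for this invented decline the late-time slope tends to 1/2,
    # and features like this are what TDS-style expressions are built from.
    late = slice(-50, None)
    slope = np.polyfit(np.log(t[late]), np.log(deriv[late]), 1)[0]
    print(f"late-time derivative slope ~ {slope:.2f}")
    ```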

  6. New technique of identifying the hierarchy of dynamic domains in proteins using a method of molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Yesylevskyy S. O.

    2010-04-01

    Full Text Available Aim. Despite a large number of existing domain identification techniques, there is no universally accepted method which identifies the hierarchy of dynamic domains using the data of molecular dynamics (MD) simulations. The goal of this work is to develop such a technique. Methods. The dynamic domains are identified by eliminating systematic motions from MD trajectories recursively in a model-free manner. Results. A technique called Hierarchical Domain-Wise Alignment (HDWA), which identifies hierarchically organized dynamic domains in proteins using MD trajectories, has been developed. Conclusion. A new method of domain identification in proteins is proposed

  7. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  8. Using Quantitative Data Analysis Techniques for Bankruptcy Risk Estimation for Corporations

    Directory of Open Access Journals (Sweden)

    Ştefan Daniel ARMEANU

    2012-01-01

    Full Text Available Diversification of methods and techniques for the quantification and management of risk has led to the development of many mathematical models, a large part of which focus on measuring bankruptcy risk for businesses. In financial analysis there are many indicators which can be used to assess the risk of bankruptcy of enterprises, but to make an assessment the number of indicators must be reduced, and this can be achieved through principal component, cluster and discriminant analysis techniques. In this context, the article aims to build a scoring function used to identify bankrupt companies, using a sample of companies listed on the Bucharest Stock Exchange.
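
    A minimal sketch of the reduce-then-score idea, assuming scikit-learn and simulated financial indicators rather than the article's actual Bucharest Stock Exchange data:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)
    n = 200
    healthy = rng.normal(+0.5, 1.0, (n, 10))   # 10 financial indicators each
    bankrupt = rng.normal(-0.5, 1.0, (n, 10))
    X = np.vstack([healthy, bankrupt])
    y = np.array([0] * n + [1] * n)            # 1 = bankrupt

    # Reduce the indicator set with PCA, then build a discriminant scoring
    # function on the retained components.
    model = make_pipeline(StandardScaler(), PCA(n_components=3),
                          LinearDiscriminantAnalysis())
    model.fit(X, y)
    score = model.decision_function(X)   # the scoring function
    print(f"training accuracy: {model.score(X, y):.2f}")
    ```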

  9. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface are of great importance for the convenience and safety of driving, so investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  10. Surface Coating Technique of Northern Black Polished Ware by the Microscopic Analysis

    Directory of Open Access Journals (Sweden)

    Dilruba Sharmin

    2012-12-01

    Full Text Available An organic substance has been identified in the top layer of Northern Black Polished Ware (NBPW) excavated from the Wari-Boteshwar and Mahasthangarh sites in Bangladesh. NBPW is the most distinctive ceramic of the Early Historic period, and the technique behind its surface gloss has attracted numerous theories. This paper is an analytical study of NBPW sherds collected from these two sites, including surface observations using binocular and scanning electron microscopes and thin section analysis of potsherds. Thin section analysis identified two different layers of coating on the surface of the NBPW: one layer is a 'slip' (ground coat) and the other a 'top layer' (top coat). The slip was made from refined clay, and the top layer was derived from an organic substance. Microscopic analysis confirmed the solid, non-clayey character of the top coat.

  11. Identifying organizational deficiencies through root-cause analysis

    International Nuclear Information System (INIS)

    Tuli, R.W.; Apostolakis, G.E.

    1996-01-01

    All nuclear power plants incorporate root-cause analysis as an instrument to help identify and isolate key factors judged to be of significance following an incident or accident. Identifying the principal deficiencies can become very difficult when the event involves not only human and machine interaction, but possibly the underlying safety and quality culture of the organization. The current state of root-cause analysis is to conclude the investigation after identifying human and/or hardware failures. In this work, root-cause analysis is taken one step further by examining plant work processes and organizational factors. This extension is considered significant to the success of the analysis, especially when management deficiency is believed to contribute to the incident. The results of root-cause analysis can be most effectively implemented if the organization, as a whole, wishes to improve the overall operation of the plant by preventing similar incidents from occurring again. The study adds to the existing root-cause analysis the ability to localize the causes of undesirable events and to focus on those problems hidden deeply within the work processes that are routinely followed in the operation and maintenance of the facility

  12. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  13. Using multidimensional topological data analysis to identify traits of hip osteoarthritis.

    Science.gov (United States)

    Rossi-deVries, Jasmine; Pedoia, Valentina; Samaan, Michael A; Ferguson, Adam R; Souza, Richard B; Majumdar, Sharmila

    2018-05-07

    Osteoarthritis (OA) is a multifaceted disease with many variables affecting diagnosis and progression. Topological data analysis (TDA) is a state-of-the-art big data analytics tool that can combine all variables into multidimensional space; here, TDA is used to analyze imaging and gait analysis data simultaneously. The aim was to identify biochemical and biomechanical biomarkers able to classify different disease progression phenotypes in subjects with and without radiographic signs of hip OA. The study was a longitudinal comparison of progressive and nonprogressive subjects: in all, 102 subjects with and without radiographic signs of hip osteoarthritis, imaged at 3T with SPGR 3D MAPSS T1ρ/T2 and intermediate-weighted fat-suppressed fast spin-echo (FSE) sequences. The multidimensional data analysis included cartilage composition, bone shape, Kellgren-Lawrence (KL) classification of osteoarthritis, scoring hip osteoarthritis with MRI (SHOMRI), and the hip disability and osteoarthritis outcome score (HOOS). Analysis was done using TDA, Kolmogorov-Smirnov (KS) testing, and Benjamini-Hochberg ranking of P-values to correct for multiple comparisons. Subjects in the later stages of the disease had an increased SHOMRI score (P < 0.05); analysis of this subgroup identified knee biomechanics (P < 0.05) as a further discriminating variable, and analysis of an OA subgroup with femoroacetabular impingement (FAI) showed anterior labral tears to be the most significant marker (P = 0.0017) between those FAI subjects with and without OA symptoms. The data-driven analysis obtained with TDA proposes new phenotypes of these subjects that partially overlap with the radiographic-based classical disease status classification and also shows the potential for further examination of an early onset biomechanical intervention. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.
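
    The multiple-comparison step named above (KS tests with Benjamini-Hochberg ranking of P-values) can be sketched as follows on simulated variables:

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(5)
    n_vars = 50
    group_a = rng.normal(0.0, 1.0, (n_vars, 40))
    group_b = rng.normal(0.0, 1.0, (n_vars, 40))
    group_b[:5] += 1.0   # five genuinely different variables

    pvals = np.array([ks_2samp(a, b).pvalue
                      for a, b in zip(group_a, group_b)])

    # Benjamini-Hochberg: reject the k smallest P-values, where k is the
    # largest rank with p_(k) <= (k/m) * q, controlling the FDR at q.
    q, m = 0.05, n_vars
    order = np.argsort(pvals)
    passed = pvals[order] <= (np.arange(1, m + 1) / m) * q
    k = int(np.nonzero(passed)[0].max()) + 1 if passed.any() else 0
    print("significant variables:", sorted(order[:k]))
    ```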

  14. Identifying fly puparia by clearing technique: application to forensic entomology.

    Science.gov (United States)

    Sukontason, Kabkaew L; Ngern-Klun, Radchadawan; Sripakdee, Duanghatai; Sukontason, Kom

    2007-10-01

    In forensic investigations, immature stages of the fly (egg, larva, or puparia) can be used as entomological evidence at death scenes, not only to estimate the postmortem interval (PMI), analyze toxic substances, and determine the manner of death, but also to indicate the movement of a corpse in homicide cases. Of these immature stages, puparia represent the longest developmental time, which makes them especially useful. However, in order for forensic entomologists to use puparia effectively, it is crucial that they are able to accurately identify the species of fly found in a corpse. Typically, these puparia are similar in general appearance, being coarctate and light brown to dark brown in color, which makes identification difficult. In this study, we report on the clearing technique used to pale the integument of fly puparia, thereby allowing observation of the anterior end (second to fourth segments) and the profile of the posterior spiracle, which are important clues for identification. We used puparia of the blowfly, Chrysomya megacephala (F.), as the model species in this experiment. With placement in a 20% potassium hydroxide solution daily and mounting on a clearing medium (Permount(R), New Jersey), the profile of the posterior spiracle could be clearly examined under a light microscope beginning on the fifth day after pupation, and the number of papillae in the anterior spiracle could be counted easily starting from the ninth day. Comparison of morphological features of C. megacephala puparia with those of other blowflies (Chrysomya nigripes [Aubertin], Chrysomya rufifacies [Macquart], Chrysomya villeneuvi [Patton], Lucilia cuprina [Wiedemann], and Hemipyrellia ligurriens [Wiedemann]) and a housefly (Musca domestica L.) revealed that the anterior ends and the profiles of the posterior spiracles had markedly distinguishing characteristics. Morphometric analysis of the length and width of puparia, along with the length of the gaps between the posterior spiracles

  15. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    Energy Technology Data Exchange (ETDEWEB)

    Dahing, Lahasen Normanshah [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia and Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia); Yahya, Redzuan [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor (Malaysia); Yahya, Roslan; Hassan, Hearie [Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia)

    2014-09-03

    In this study, the principle of prompt gamma neutron activation analysis has been used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source (Cf-252) with an HPGe detector and a multichannel analyser (MCA). Concrete samples of 10×10×10 cm{sup 3} and 15×15×15 cm{sup 3} were analysed. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study show that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, were determined by analysing their respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques for reference and validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis to identify the elements present in a concrete sample are discussed.
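
    A sketch of the spectrum-interpretation step implied above: locating photopeaks and matching them to tabulated prompt gamma capture lines. The spectrum is simulated, the listed energies are well-known capture gammas, and detector response and real backgrounds are ignored:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    # Approximate prompt gamma capture lines, keV.
    lines = {"Ca": 1942.7, "H": 2223.2, "Si": 3539.0, "Fe": 7631.1}

    energy = np.linspace(0.0, 9000.0, 9000)       # ~1 keV per channel
    counts = 1000.0 * np.exp(-energy / 2000.0)    # smooth continuum background
    for e in lines.values():                      # add Gaussian photopeaks
        counts += 5000.0 * np.exp(-0.5 * ((energy - e) / 3.0) ** 2)
    counts = np.random.default_rng(6).poisson(counts)

    peaks, _ = find_peaks(counts, prominence=500)
    for p in peaks:
        for elem, e in lines.items():
            if abs(energy[p] - e) < 5.0:          # 5 keV matching window
                print(f"{elem}: peak at {energy[p]:.0f} keV")
    ```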

  16. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis

    Science.gov (United States)

    Hawkes, Corinna

    2009-01-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in terms of food availability, prices, and marketing. Along with established forms of supply chain analysis, the method is informed by a historical overview of how food supply chains have changed over time. The method posits that the actors and actions in the chain are affected by organizational, financial, technological, and policy incentives and disincentives, which can in turn be levered for change. It presents a preliminary example of the supply of Coca-Cola beverages into school vending machines and identifies further potential applications. These include fruit and vegetable supply chains, local food chains, supply chains for health-promoting versions of food products, and identifying financial incentives in supply chains for healthier eating. PMID:23144674

  17. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis.

    Science.gov (United States)

    Hawkes, Corinna

    2009-07-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in terms of food availability, prices, and marketing. Along with established forms of supply chain analysis, the method is informed by a historical overview of how food supply chains have changed over time. The method posits that the actors and actions in the chain are affected by organizational, financial, technological, and policy incentives and disincentives, which can in turn be levered for change. It presents a preliminary example of the supply of Coca-Cola beverages into school vending machines and identifies further potential applications. These include fruit and vegetable supply chains, local food chains, supply chains for health-promoting versions of food products, and identifying financial incentives in supply chains for healthier eating.

  18. Use of nuclear techniques for coal analysis in exploration, mining and processing

    International Nuclear Information System (INIS)

    Clayton, C.G.; Wormald, M.R.

    1982-01-01

    Nuclear techniques have a long history of application in the coal industry, during exploration and especially during coal preparation, for the measurement of ash content. The preferred techniques are based on X- and gamma-ray scattering and borehole logging, and on-line equipment incorporating these techniques is now in world-wide routine use. However, gamma-ray techniques are mainly restricted to density measurement and X-ray techniques are principally used for ash determinations. They have a limited range, and when used on-line some size reduction of the coal is usually required and a full elemental analysis is not possible. In particular, X- and gamma-ray techniques are insensitive to the principal elements in the combustible component and to many of the important elements in the mineral fraction. Neutron techniques, on the other hand, have a range which is compatible with on-line requirements, and all elements in the combustible component and virtually all elements in the mineral component can be observed. A complete elemental analysis of coal then allows the ash content and the calorific value to be determined on-line. This paper surveys the various nuclear techniques now in use and gives particular attention to the present state of development of neutron methods and to their advantages and limitations. Although it is shown that considerable further development and operational experience are still required, equipment now being introduced has a performance which matches many of the identified requirements and an early improvement in specification can be anticipated

  19. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System.

    Science.gov (United States)

    Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses a patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligent techniques in leukaemia, with specific attention to particular categories of these studies, to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligent techniques in leukaemia and to categorize these studies based on leukaemia type as well as the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management and deliver supportive medical information to the patient in clinical practice.

  20. A Roadmap of Risk Diagnostic Methods: Developing an Integrated View of Risk Identification and Analysis Techniques

    National Research Council Canada - National Science Library

    Williams, Ray; Ambrose, Kate; Bentrem, Laura

    2004-01-01

    ...), which is envisioned to be a comprehensive reference tool for risk identification and analysis (RI&A) techniques. Program Managers (PMs) responsible for developing or acquiring software-intensive systems typically identify risks in different ways...

  1. Testing the potential of geochemical techniques in identifying hydrological systems within landslides in partly weathered marls

    Science.gov (United States)

    Bogaard, T. A.

    2003-04-01

    This paper’s objectives are twofold: to test the potential of cation exchange capacity (CEC) analysis for refinement of the knowledge of the hydrological system in landslide areas; and to examine two laboratory CEC analysis techniques for their applicability to partly weathered marls. The NH4Ac and NaCl laboratory techniques are tested. The geochemical results are compared with the core descriptions and interpreted with respect to their usefulness. Both analysis techniques give identical results for CEC, and are plausible on the basis of the available clay content information. The determination of the exchangeable cations was more difficult, since part of the marls dissolved. With the ammonium-acetate method more of the marls are dissolved than with the sodium-chloride method. This negatively affects the results for the exchangeable cations. Therefore, the NaCl method is to be preferred for the determination of the cation fractions at the exchange complex, although this method has the disadvantage that the sodium fraction cannot be determined. To overcome this problem it is recommended to try another salt, e.g. SrCl2, as displacement fluid. Both the Alvera and Boulc-Mondorès examples show transitions in cation composition with depth. It was shown that the exchangeable cation fractions can be useful in locating boundaries between water types, especially the boundary between the superficial, rain-fed hydrological system and the lower, regional groundwater system. This information may be important for landslide interventions, since the hydrological system and the origin of the water need to be known in detail. It is also plausible that long-term predictions of slope stability may be improved by knowledge of the hydrogeochemical evolution of clayey landslides. In the Boulc-Mondorès example the subsurface information that can be extracted from CEC analyses was presented. In the Boulc-Mondorès cores, deviant intervals of CEC could be identified. These are interpreted as

  2. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
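
    The measurement reduces to the Beer-Lambert attenuation law; a minimal sketch with invented counts, density and thickness (not the paper's data):

    ```python
    import math

    # Beer-Lambert: I = I0 * exp(-(mu/rho) * rho * t), so the mass attenuation
    # coefficient follows directly from the measured transmission.
    I0, I = 12_000.0, 4_400.0   # counts without / with the alloy in the beam
    rho = 15.5                  # alloy density, g/cm^3 (invented)
    t = 0.10                    # sample thickness, cm (invented)

    mu_rho = math.log(I0 / I) / (rho * t)   # mass attenuation coeff., cm^2/g
    print(f"mu/rho = {mu_rho:.3f} cm^2/g")
    ```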

  3. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    Science.gov (United States)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with the repair and maintenance becomes more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply as well as the speed at which the crew can abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan and how that demand changes as a result of changes to the system architecture enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly more complex systems by gradually expanding the system boundary.
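
    The simplest of the approaches named above, treating failures as a Poisson process and asking whether k spares cover the mission, can be sketched as follows; the failure rate and mission duration are invented:

    ```python
    from scipy.stats import poisson

    failure_rate = 1.0 / 8760.0    # one expected failure per 8760 h operated
    mission_hours = 2.5 * 8760.0   # notional 2.5-year mission
    lam = failure_rate * mission_hours   # expected failures over the mission

    # Probability that no more than k failures occur, i.e. k spares suffice.
    for k in range(6):
        print(f"{k} spares -> P(sufficient) = {poisson.cdf(k, lam):.4f}")
    ```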

  4. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan

    2008-01-01

    The paper presents some results of a case study on 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was done in the framework of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches for identifying component ageing from operational data. Engineering considerations are outside the scope of the present study
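
    One standard parametric option for such ageing tests (not necessarily the tests used in the case study) is the Laplace trend test on failure times; the failure record below is invented:

    ```python
    import math

    failure_times = [900.0, 1400.0, 1800.0, 2100.0, 2300.0, 2450.0]  # hours
    T = 2500.0   # end of the observation window, hours
    n = len(failure_times)

    # Laplace statistic: u >> 0 means failures cluster late in the window,
    # which is evidence of an increasing failure rate, i.e. ageing.
    u = (sum(failure_times) / n - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))
    print(f"Laplace statistic u = {u:.2f}")  # ~1.95 > 1.645: ageing at 5% level
    ```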

  5. Identifying the Role of National Digital Cadastral Database (ndcdb) in Malaysia and for Land-Based Analysis

    Science.gov (United States)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Yusof, O. M.; Wazir, M. A. M.; Adimin, M. K.

    2017-10-01

    This paper explains the process carried out in identifying the significant role of the NDCDB in Malaysia, specifically in land-based analysis. The research was initially part of a larger research exercise to identify the significance of the NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of the NDCDB from the role standpoint. Seven statements pertaining to the significant role of the NDCDB in Malaysia and in land-based analysis were established after three rounds of consensus building. The agreed statements provide a clear definition of the important role of the NDCDB in Malaysia and for land-based analysis, which had previously been studied only to a limited extent, leading to an unclear perception among the general public and even the geospatial community. The connection of the statements with disaster management is discussed concisely at the end of the research.

  6. IDENTIFYING THE ROLE OF NATIONAL DIGITAL CADASTRAL DATABASE (NDCDB IN MALAYSIA AND FOR LAND-BASED ANALYSIS

    Directory of Open Access Journals (Sweden)

    N. Z. A. Halim

    2017-10-01

    Full Text Available This paper explains the process carried out in identifying the significant role of the NDCDB in Malaysia, specifically in land-based analysis. The research was initially part of a larger research exercise to identify the significance of the NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of the NDCDB from the role standpoint. Seven statements pertaining to the significant role of the NDCDB in Malaysia and in land-based analysis were established after three rounds of consensus building. The agreed statements provide a clear definition of the important role of the NDCDB in Malaysia and for land-based analysis, which had previously been studied only to a limited extent, leading to an unclear perception among the general public and even the geospatial community. The connection of the statements with disaster management is discussed concisely at the end of the research.

  7. The use of environmental monitoring as a technique to identify isotopic enrichment activities

    International Nuclear Information System (INIS)

    Buchmann, Jose Henrique

    2000-01-01

    The use of environmental monitoring as a technique to identify activities related to the nuclear fuel cycle has been proposed, by international organizations, as an additional measure to the safeguards agreements in force. The elements specific to each kind of nuclear activity, or nuclear signatures, inserted into the ecosystem by several transfer paths, can be intercepted with varying ability by different living organisms. Depending on the kind of signature of interest, the identification and quantification of anthropogenic material require the choice of adequate biological indicators and, mainly, the use of sophisticated techniques associated with elaborate sample treatments. This work demonstrates the technical viability of using pine needles as bioindicators of nuclear signatures associated with uranium enrichment activities. Additionally, it proposes the use of a technique now widely diffused in the scientific community, High Resolution Inductively Coupled Plasma Mass Spectrometry (HR-ICP-MS), to identify the signature corresponding to that kind of activity in the ecosystem. The work also describes a methodology recently adopted in analytical chemistry, based on metrological concepts of uncertainty estimation, which is used to calculate the uncertainties associated with the measurement results. Nitric acid solutions with a concentration of 0.3 mol kg^-1, used to wash pine needles sampled near facilities that manipulate enriched uranium and containing only 0.1 μg kg^-1 of uranium, exhibit a 235U:238U isotopic abundance ratio of 0.0092±0.0002, while solutions originating from samples collected at places located more than 200 km from activities related to the nuclear fuel cycle exhibit a value of 0.0074±0.0002 for this abundance ratio. Similar results obtained for samples collected at different places confirm the presence of anthropogenic uranium and demonstrate the viability of the technique.
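
    As a worked check on the figures quoted above, the two ratios can be compared through the standard score of their difference (a minimal sketch; the uncertainty treatment in the thesis itself is more elaborate).

```python
def z_score(a, ua, b, ub):
    """Standard score of the difference between two independent results,
    combining their standard uncertainties in quadrature."""
    return (a - b) / (ua**2 + ub**2) ** 0.5

# 235U:238U ratios from the record: near-facility vs. remote samples
z = z_score(0.0092, 0.0002, 0.0074, 0.0002)
print(f"z = {z:.1f}")  # ~6.4 standard uncertainties apart: a clear enrichment signature
```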

  8. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based sources, and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques
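
    The record does not reproduce it, but the quantitative basis of NAA is the standard activation equation relating the induced activity to the neutron flux and the irradiation schedule:

```latex
A = \frac{N_A\, m\, \theta\, \sigma\, \varphi}{M}
    \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_d}
```

    where A is the induced activity, N_A Avogadro's number, m the mass of the element, θ the isotopic abundance of the target nuclide, σ the activation cross section, φ the neutron flux, M the molar mass, λ the decay constant of the product nuclide, and t_i and t_d the irradiation and decay times.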

  9. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    Proton microbeams are a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental applications that have been carried out at JAERI Takasaki. (author)

  10. On-line diagnostic techniques for air-operated control valves based on time series analysis

    International Nuclear Information System (INIS)

    Ito, Kenji; Matsuoka, Yoshinori; Minamikawa, Shigeru; Komatsu, Yasuki; Satoh, Takeshi.

    1996-01-01

    The objective of this research is to study the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves - valves of a type used in large numbers in PWR plants. Generally, these techniques can detect anomalies caused by failures in their initial stages, when detection is difficult by conventional surveillance of directly measured process parameters. However, the effectiveness of the techniques depends on the system being diagnosed. The difficulties in applying diagnostic techniques to air-operated control valves seem to come from the reduced sensitivity of their response as compared with hydraulic control systems, as well as the need to identify anomalies in low-level signals that fluctuate only slightly but continuously. In this research, simulation tests were performed by setting various kinds of failure modes for a test valve with the same specifications as a valve actually used in the plants. Actual control signals recorded from an operating plant were then used as input signals for the simulation. The results of the tests confirmed the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves. (author)
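
    The record does not specify the time series model used; a common choice for this kind of low-level fluctuating signal is an autoregressive (AR) model fitted to healthy-condition data, with anomalies flagged when the one-step-ahead prediction residuals grow. A minimal sketch with synthetic data and an assumed model order:

```python
import numpy as np

def fit_ar(signal, order=4):
    """Least-squares fit of an AR(order) model: predicts x[t] from x[t-order..t-1]."""
    X = np.column_stack([signal[i:len(signal) - order + i] for i in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def residual_rms(signal, coeffs):
    """RMS of one-step-ahead prediction errors under a previously fitted model."""
    order = len(coeffs)
    X = np.column_stack([signal[i:len(signal) - order + i] for i in range(order)])
    return float(np.sqrt(np.mean((signal[order:] - X @ coeffs) ** 2)))

rng = np.random.default_rng(1)
healthy = rng.normal(0, 0.01, 5000)                   # normal valve control signal
degraded = healthy + 0.02 * np.sin(np.arange(5000))   # slowly developing anomaly

model = fit_ar(healthy)
print(residual_rms(healthy, model), residual_rms(degraded, model))
# a sustained rise in residual RMS against the healthy baseline flags the failure
```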

  11. Improved Multiscale Entropy Technique with Nearest-Neighbor Moving-Average Kernel for Nonlinear and Nonstationary Short-Time Biomedical Signal Analysis

    Directory of Open Access Journals (Sweden)

    S. P. Arunachalam

    2018-01-01

    Full Text Available Analysis of biomedical signals can yield invaluable information for prognosis, diagnosis, therapy evaluation, risk assessment, and disease prevention, but such signals are often recorded as short time series that challenge existing complexity classification algorithms such as Shannon entropy (SE) and other techniques. The purpose of this study was to improve the previously developed multiscale entropy (MSE) technique by incorporating a nearest-neighbor moving-average kernel, which can be used for the analysis of nonlinear and nonstationary short time series of physiological data. The approach was tested for robustness with respect to noise using simulated sinusoidal and ECG waveforms. The feasibility of MSE to discriminate between normal sinus rhythm (NSR) and atrial fibrillation (AF) was tested on a single-lead ECG. In addition, the MSE algorithm was applied to identify pivot points of rotors that were induced in ex vivo isolated rabbit hearts. The improved MSE technique robustly estimated the complexity of the signal compared to that of SE under various noises, discriminated NSR and AF on single-lead ECG, and precisely identified the pivot points of ex vivo rotors by providing better contrast between the rotor core and the peripheral region. The improved MSE technique can provide efficient complexity analysis of a variety of nonlinear and nonstationary short-time biomedical signals.
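
    The paper's exact kernel is not reproduced in the record; the sketch below shows classical MSE with a moving-average coarse-graining substituted for the usual non-overlapping averaging, which approximates the idea described. It is a simplified illustration, not the authors' implementation.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A/B), where B and A count template matches of
    length m and m+1 under Chebyshev tolerance r (self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(t) - 1):
            c += int(np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r))
        return c
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A and B else np.inf

def mse_moving_average(x, max_scale=5):
    """Multiscale entropy with a moving-average kernel replacing the classical
    non-overlapping coarse-graining; tolerance r is fixed from the raw series."""
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std()
    return [sample_entropy(np.convolve(x, np.ones(s) / s, mode='valid'), r=r)
            for s in range(1, max_scale + 1)]

rng = np.random.default_rng(0)
print(mse_moving_average(rng.normal(size=300)))  # white noise: entropy falls with scale
```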

  12. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis; it then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio, while their disadvantages are in completeness and complexity

  13. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as neutron activation analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series, and Rutherford backscattering, as used in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  14. Tensometry technique for X-ray diffraction in applied analysis of welding

    International Nuclear Information System (INIS)

    Turibus, S.N.; Caldas, F.C.M.; Miranda, D.M.; Monine, V.I.; Assis, J.T.

    2010-01-01

    This paper presents the analysis of residual stress introduced by the welding process. As stress in a material can induce damage, it is necessary to have a method to identify this residual stress state. For this, the non-destructive X-ray diffraction technique was used to analyze two plates of A36 steel joined by metal inert gas (MIG) welding. The stress measurements were made by the sin²ψ method in the weld region of the steel plates, including analysis of longitudinal and transverse residual stresses in the fusion zone, the heat affected zone (HAZ), and the base metal. To determine the stress distribution along the depth of the welded material, superficial layers were successively removed by electropolishing. (author)
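
    For reference (standard theory, not reproduced in the record), the sin²ψ method extracts the in-plane stress σ_φ from the slope of the lattice strain versus sin²ψ:

```latex
\varepsilon_{\phi\psi} = \frac{d_{\phi\psi} - d_0}{d_0}
  = \frac{1+\nu}{E}\,\sigma_\phi \sin^2\psi \;-\; \frac{\nu}{E}\left(\sigma_{11} + \sigma_{22}\right)
```

    where d_{φψ} is the lattice spacing measured at tilt angle ψ and azimuth φ, d_0 the stress-free spacing, and E and ν the elastic constants of the diffracting planes; plotting ε against sin²ψ yields σ_φ from the slope.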

  15. Wheeze sound analysis using computer-based techniques: a systematic review.

    Science.gov (United States)

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched, continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction, and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer, and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that (1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, (2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and (3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.

  16. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check the stress analysis technique based on 3D models, making a comparison with the traditional technique in which the model is built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS, working directly on documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (performed at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Each of the three databases is then used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  17. Identifying and quantifying energy savings on fired plant using low cost modelling techniques

    International Nuclear Information System (INIS)

    Tucker, Robert; Ward, John

    2012-01-01

    Research highlights: → Furnace models based on the zone method for radiation calculation are described. → Validated steady-state and transient models have been developed. → We show how these simple models can identify the best options for saving energy. → High-emissivity coatings are predicted to give performance enhancement on a fired heater. → Optimal heat recovery strategies on a steel reheating furnace are predicted. -- Abstract: Combustion in fired heaters, boilers and furnaces often accounts for the major energy consumption of industrial processes. Small improvements in efficiency can result in large reductions in energy consumption, CO2 emissions, and operating costs. This paper describes some useful low-cost modelling techniques based on the zone method to help identify energy saving opportunities on high temperature fuel-fired process plant. The zone method has, for many decades, been successfully applied to small batch furnaces through to large steel-reheating furnaces, glass tanks, boilers and fired heaters on petrochemical plant. Zone models can simulate both steady-state furnace operation and the more complex transient operation typical of a production environment. These models can be used to predict thermal efficiency and performance and, more importantly, to assist in identifying and predicting energy saving opportunities from such measures as: improving air/fuel ratio and temperature controls; improved insulation; use of oxygen or oxygen enrichment; air preheating via flue gas heat recovery; and modification to furnace geometry and hearth loading. There is also increasing interest in the application of refractory coatings for increasing surface radiation in fired plant. All of these techniques can yield savings ranging from a few percent upwards and can deliver rapid financial payback, but their evaluation often requires robust and reliable models in order to increase confidence in making financial investment decisions.
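
    The zone method divides the furnace into gas and surface zones linked by pre-computed total exchange areas; the record gives no equations, but the core radiative interchange between any pair of zones takes the form sketched below (the exchange area and temperatures are illustrative assumptions, not values from the paper).

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_interchange(total_exchange_area_m2, t_hot_k, t_cold_k):
    """Radiative heat flow between two zones, given the total exchange area
    that the zone method tabulates for the zone pair."""
    return total_exchange_area_m2 * SIGMA * (t_hot_k**4 - t_cold_k**4)

# Hypothetical gas zone at 1600 K radiating to a stock surface zone at 1100 K
q = radiative_interchange(2.5, 1600.0, 1100.0)
print(f"Q = {q / 1e3:.0f} kW")
```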

  18. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electronic bombardment of atoms or molecules (gas ion source) and thermal effect (thermoionic source) - are compared, revealing an inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is between 10^5 and 10^6 times greater. This shows that almost the entire sample is consumed not by the measurement itself but by the introduction system of the gas spectrometer. The new analysis technique, referred to as "microfluorination", corrects this anomaly and exploits the advantages of the electron bombardment method of ionization

  19. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  20. Human errors identification using the human factors analysis and classification system technique (HFACS)

    Directory of Open Access Journals (Sweden)

    G. A. Shirali

    2013-12-01

    Results: In this study, 158 accident reports from the Ahvaz steel industry were analyzed by the HFACS technique. This analysis showed that most of the human errors were related to skill-based errors at the first level, to the physical environment at the second level, to inadequate supervision at the third level, and to resource management at the fourth level. Conclusion: Studying and analyzing past events using the HFACS technique can identify the major and root causes of accidents and can be effective in preventing the repetition of such mishaps. It can also be used as a basis for developing strategies to prevent future events in steel industries.

  1. An Effective Performance Analysis of Machine Learning Techniques for Cardiovascular Disease

    Directory of Open Access Journals (Sweden)

    Vinitha DOMINIC

    2015-03-01

    Full Text Available Machine learning techniques help in deriving hidden knowledge from clinical data, which can be of great benefit for society - for example, by reducing the number of clinical trials required for precise diagnosis of a person's disease. Various areas of study are available in the healthcare domain, such as cancer, diabetes, and drugs. This paper focuses on a heart disease dataset and how machine learning techniques can help in understanding the level of risk associated with heart diseases. Initially, the data are preprocessed; analysis is then done in two stages. In the first stage, feature selection techniques are applied to 13 commonly used attributes, and in the second stage, to 75 attributes related to the anatomic structure of the heart, such as the blood vessels and arteries. Finally, the reduced set of features is validated using an exhaustive list of classifiers. In parallel, a study of the anatomy of the heart is carried out using the identified features, and the characteristics of each class are examined. It is observed that the reduced set of features is anatomically relevant. Thus, it can be concluded that applying machine learning techniques to clinical data is beneficial and necessary.
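
    The record names neither the selection method nor the classifiers; the sketch below shows one plausible reading of the two-stage workflow using scikit-learn. The file name heart_disease.csv, the univariate F-test selection, and k=8 are assumptions made for illustration.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("heart_disease.csv")      # assumed: 13 attributes + 'target' column
X, y = df.drop(columns="target"), df["target"]

for name, clf in [("logreg", LogisticRegression(max_iter=1000)),
                  ("forest", RandomForestClassifier(n_estimators=200))]:
    pipe = make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=8),   # stage 1: keep 8 features
                         clf)                            # stage 2: validate classifier
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```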

  2. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  3. Thermoluminescence analysis can identify irradiated ingredient in soy sauce before and after pasteurization

    International Nuclear Information System (INIS)

    Lee, Jeong-Eun; Sanyal, Bhaskar; Akram, Kashif; Jo, Yunhee; Baek, Ji-Yeong; Kwon, Joong-Ho

    2017-01-01

    Thermoluminescence (TL) analysis was conducted to identify small quantities (0.5%, 1%, and 1.5%) of γ-ray- or electron-beam-irradiated garlic powder in a soy sauce after commercial pasteurization. The sauce samples with γ-ray- and electron-beam-irradiated (0, 1, or 10 kGy) garlic powder showed detectable TL glow curves, characterized by a radiation-induced maximum in the temperature range of 180–225 °C. The successful identification of soy sauces with an irradiation history was dependent on both the mixing ratio of the irradiated ingredient and the irradiation dose. Post-irradiation pasteurization (85 °C, 30 min) caused no considerable changes in TL glow shape or intensity. Interlaboratory tests demonstrated that the shape and intensity of the first TL glow curve (TL1) could be a better detection marker than the TL ratio (TL1/TL2). - Highlights: • Thermoluminescence (TL) characteristics were studied to identify an irradiated ingredient in soy sauce. • TL emission was found to be dependent on irradiation doses and blending ratios of the ingredients. • The TL technique was found to be successful in detecting irradiation status even after pasteurization. • An inter-laboratory trial gave a clear verdict on the irradiation detection potential of the TL technique.
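
    As a minimal sketch of the screening logic (synthetic Gaussian glow curves, not the paper's data; the 0.1 glow-ratio threshold is the one commonly cited from the EN 1788 standard, not taken from this record):

```python
import numpy as np

def tl_peak_in_window(temps_c, intensities, lo=180.0, hi=225.0):
    """Maximum TL intensity within the radiation-specific 180-225 degC window."""
    mask = (temps_c >= lo) & (temps_c <= hi)
    return float(np.max(intensities[mask])) if np.any(mask) else 0.0

# Synthetic glow curves: an irradiated sample peaks near 200 degC
temps = np.linspace(50, 350, 601)
tl1 = 1000 * np.exp(-((temps - 200) / 20) ** 2)   # first glow (natural signal)
tl2 = 400 * np.exp(-((temps - 200) / 20) ** 2)    # second glow (after re-irradiation)

ratio = tl_peak_in_window(temps, tl1) / tl_peak_in_window(temps, tl2)
print(f"TL1/TL2 = {ratio:.2f}")   # >0.1 is a common screening threshold (EN 1788)
```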

  4. Techniques for the thermal/hydraulic analysis of LMFBR check valves

    International Nuclear Information System (INIS)

    Cho, S.M.; Kane, R.S.

    1979-01-01

    A thermal/hydraulic analysis of the check valves in liquid sodium service for LMFBR plants is required to provide temperature data for thermal stress analysis of the valves for specified transient conditions. Because of the complex three-dimensional flow pattern within the valve, the heat transfer analysis techniques for less complicated shapes could not be used. This paper discusses the thermal analysis techniques used to assure that the valve stress analysis is conservative. These techniques include a method for evaluating the recirculating flow patterns and for selecting appropriately conservative heat transfer correlations in various regions of the valve

  5. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using EBSD and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM.

  6. To what extent can behaviour change techniques be identified within an adaptable implementation package for primary care? A prospective directed content analysis.

    Science.gov (United States)

    Glidewell, Liz; Willis, Thomas A; Petty, Duncan; Lawton, Rebecca; McEachan, Rosemary R C; Ingleson, Emma; Heudtlass, Peter; Davies, Andrew; Jamieson, Tony; Hunter, Cheryl; Hartley, Suzanne; Gray-Burrows, Kara; Clamp, Susan; Carder, Paul; Alderson, Sarah; Farrin, Amanda J; Foy, Robbie

    2018-02-17

    Interpreting evaluations of complex interventions can be difficult without sufficient description of key intervention content. We aimed to develop an implementation package for primary care which could be delivered using typically available resources and could be adapted to target determinants of behaviour for each of four quality indicators: diabetes control, blood pressure control, anticoagulation for atrial fibrillation and risky prescribing. We describe the development and prospective verification of behaviour change techniques (BCTs) embedded within the adaptable implementation packages. We used an overlapping, multi-staged process. We identified evidence-based candidate delivery mechanisms - mainly audit and feedback, educational outreach and computerised prompts and reminders. We drew upon interviews with primary care professionals using the Theoretical Domains Framework to explore likely determinants of adherence to quality indicators. We linked determinants to candidate BCTs. With input from stakeholder panels, we prioritised likely determinants and intervention content prior to piloting the implementation packages. Our content analysis assessed the extent to which embedded BCTs could be identified within the packages and compared them across the delivery mechanisms and four quality indicators. Each implementation package included at least 27 out of 30 potentially applicable BCTs, representing 15 of 16 BCT categories. Whilst 23 BCTs were shared across all four implementation packages (e.g. BCTs relating to feedback and comparing behaviour), some BCTs were unique to certain delivery mechanisms (e.g. 'graded tasks' and 'problem solving' for educational outreach). BCTs addressing the determinants 'environmental context' and 'social and professional roles' (e.g. 'restructuring the social and physical environment' and 'adding objects to the environment') were indicator specific. We found it challenging to operationalise BCTs targeting 'environmental context'.

  7. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high 214Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
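
    A minimal sketch of the PCA-plus-k-means variant described above, on synthetic three-channel data (the cluster count of four follows the record; all data values are invented for illustration):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
background = rng.normal([2.0, 2.5, 8.0], 0.4, size=(2000, 3))   # typical crustal values
anomaly = rng.normal([2.2, 6.0, 8.5], 0.5, size=(50, 3))        # high second-channel outliers
X = np.vstack([background, anomaly])

Z = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Z)   # decorrelate the radiometric channels
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

for k in range(4):
    print(f"cluster {k}: n = {np.sum(labels == k)}")
```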

  8. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, or cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10)]. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, the results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.
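
    For illustration of the pooling arithmetic behind such OR estimates (invented 2x2 counts, not the trials from this review), a fixed-effect inverse-variance synthesis looks like this:

```python
import math

studies = [  # (events_tx, n_tx, events_ctl, n_ctl) - hypothetical data
    (45, 60, 35, 60),
    (30, 50, 22, 50),
    (55, 70, 40, 70),
]

num = den = 0.0
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d    # Woolf variance of the log odds ratio
    w = 1 / var                            # inverse-variance weight
    num += w * log_or
    den += w

pooled = math.exp(num / den)
se = math.sqrt(1 / den)
lo, hi = math.exp(num / den - 1.96 * se), math.exp(num / den + 1.96 * se)
print(f"pooled OR = {pooled:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```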

  9. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis

    International Nuclear Information System (INIS)

    Gaudio, P; Malizia, A; Gelfusa, M; Poggi, L.A.; Martinelli, E.; Di Natale, C.; Bellecci, C.

    2017-01-01

    Nowadays, Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and diffuse vehicles of contamination in urban and industrial areas. The academic world, together with the industrial and military ones, is working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present, the most common commercial sensors are based on 'point detection' technology, but it is clear that such instruments cannot satisfy the needs of smart cities. The new challenge is developing stand-off systems to continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser system development and has built two demonstrators based on DIAL (Differential Absorption of Light) technology that could be able to identify chemical agents in the atmosphere. In this work the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the absorption database for further atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. The preliminary results on the absorption coefficients of some chemical compounds are shown together with the preliminary PCA analysis. (paper)
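
    The record omits the measurement equation; in standard DIAL practice (textbook form, not quoted from this paper) the number density of the absorber at range R is recovered from the backscattered powers at the on-line and off-line wavelengths:

```latex
N(R) = \frac{1}{2\,\Delta\sigma}\,\frac{d}{dR}\,
       \ln\!\left[\frac{P_{\mathrm{off}}(R)}{P_{\mathrm{on}}(R)}\right]
```

    where P_on and P_off are the returns at wavelengths on and off the absorption line of the target species, and Δσ = σ_on − σ_off is the differential absorption cross section.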

  11. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult its technique. The fouetté at 720° is one of the most difficult types of fouetté: its execution demands highly developed technique throughout the performer's rotation. Performing this element requires not only good physical condition from the dancer but also mastery of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° as performed by the best Chinese dancers. The analysis employed the method of stereoscopic imaging together with theoretical analysis.

  12. Application of the INAA technique for elemental analysis of metallic biomaterials used in dentistry

    International Nuclear Information System (INIS)

    Cincu, Em; Craciun, L.; Manea-Grigore, Ioana; Cazan, I.L.; Manu, V.; Barbos, D.; Cocis, A.

    2009-01-01

    The sensitive nuclear analytical technique of Instrumental Neutron Activation Analysis (INAA) has been applied to several types of metallic biomaterials (Heraenium CE, Ventura Nibon, Wiron 99 and Ducinox), which are currently used for restoration in dental clinics, to study its performance in elemental analysis and identify possible limitations. The investigation was performed by two NAA laboratories and aimed at answering the question of how the biomaterials' compositions influence patients' health over the course of time, taking into account the recommendations of EC Directive 94/27/EC concerning Ni toxicity. (author)

  13. Identifying Importance-Performance Matrix Analysis (IPMA) of ...

    African Journals Online (AJOL)

    Identifying Importance-Performance Matrix Analysis (IPMA) of intellectual capital and Islamic work ethics in Malaysian SMES. ... capital and Islamic work ethics significantly influenced business performance. ... AJOL African Journals Online.

  14. Identifying sources of atmospheric fine particles in Havana City using Positive Matrix Factorization technique

    International Nuclear Information System (INIS)

    Pinnera, I.; Perez, G.; Ramos, M.; Guibert, R.; Aldape, F.; Flores M, J.; Martinez, M.; Molina, E.; Fernandez, A.

    2011-01-01

    In a previous study, a set of samples of fine and coarse airborne particulate matter collected in an urban area of Havana City was analyzed by the Particle-Induced X-ray Emission (PIXE) technique. The concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were consistently determined in both particle sizes. The analytical database provided by PIXE was statistically analyzed in order to determine the local pollution sources. The Positive Matrix Factorization (PMF) technique was applied to the fine particle data in order to identify possible pollution sources. These sources were further verified by enrichment factor (EF) calculation. A general discussion of these results is presented in this work. (Author)
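
    The enrichment factor used for this kind of verification (standard definition, not spelled out in the record) compares an element's concentration ratio to a crustal reference element in the aerosol with the same ratio in average crustal material:

```latex
\mathrm{EF}_X \;=\; \frac{\left(C_X / C_{\mathrm{ref}}\right)_{\mathrm{aerosol}}}
                         {\left(C_X / C_{\mathrm{ref}}\right)_{\mathrm{crust}}}
```

    with Fe or Ti typically serving as the crustal reference; EF values near unity point to a crustal origin, while values well above it (commonly taken as >10) indicate an anthropogenic source.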

  15. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm, with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  16. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  17. Development of image analysis for graphite pore-structure determination using fluorescence techniques

    International Nuclear Information System (INIS)

    Stephen, W.J.; Bowden, E.A.T.; Wickham, A.J.

    1983-03-01

    The use of image analysis to assess the pore structure of graphite has been developed to the point at which it may be considered available for routine use. A definitive pore structure in terms of the geometry-independent 'characteristic pore dimension' is derived from the computer analysis of polished specimens whose open-pore structure has been impregnated with bismuth or a fluorescent epoxy resin, with the very small pores identified separately by mercury porosimetry as in the past. The pore-size distributions obtained from these combined techniques have been used successfully to predict the corrosion rates of nine graphites, of widely differing pore structure, in a variety of gas compositions and, indirectly, to confirm appropriate mean ranges and rate constants for the reaction of the oxidising species in these gas mixtures. The development of the fluorescent-impregnant technique is discussed in detail and its use is justified in preference to 'traditional' methods. Further possible refinements are discussed, including the eventual aim of obtaining a computer prediction of the future oxidation behaviour of the graphite directly from the image analyser. (author)

  18. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  19. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  20. Testing the potential of geochemical techniques for identifying hydrological systems within landslides in partly weathered marls

    Science.gov (United States)

    Bogaard, T. A.; Buma, J. T.; Klawer, C. J. M.

    2004-03-01

    This paper's objective is to determine how useful geochemistry can be in landslide investigations. More specifically, what additional information can be gained by analysing the cation exchange capacity (CEC) and cation composition with respect to the hydrological system of a landslide area in clayey material? Two cores from the Boulc-Mondorès landslide (France) and one core from the Alvera landslide (Italy) were analysed, and the NH4Ac and NaCl laboratory techniques were tested. The geochemical results are compared with the core descriptions and interpreted with respect to their usefulness. Both analysis techniques give identical results for CEC, which are plausible on the basis of the available clay content information. The determination of the exchangeable cations was more difficult, since part of the marls dissolved; with the ammonium-acetate method more of the marls are dissolved than with the sodium-chloride method. The NaCl method is preferred for the determination of the cation fractions at the complex, albeit with the disadvantage that the sodium fraction cannot be determined. To overcome this problem, it is recommended to try other displacement fluids. In the Boulc-Mondorès example, the subsurface information that can be extracted from CEC analyses was presented. In the Boulc-Mondorès cores, deviant intervals of CEC could be identified. These are interpreted as weathered layers (and preferential flow paths) that may develop or have already developed into slip surfaces. The major problem of the CEC analyses was to explain the origin of the differences found in the core samples. Both the Alvera and Boulc-Mondorès examples show transitions in cation composition with depth. It was shown that the exchangeable cation fractions can be useful in locating boundaries between water types, especially the boundary between the superficial, rain-fed hydrological system and the lower, regional groundwater system. This information may be important for landslide investigations.

  1. Identifying plant cell-surface receptors: combining 'classical' techniques with novel methods.

    Science.gov (United States)

    Uebler, Susanne; Dresselhaus, Thomas

    2014-04-01

    Cell-cell communication during development and reproduction in plants depends largely on a few phytohormones and many diverse classes of polymorphic secreted peptides. The peptide ligands are bound at the cell surface of target cells by their membranous interaction partners representing, in most cases, either receptor-like kinases or ion channels. Although knowledge of both the extracellular ligand and its corresponding receptor(s) is necessary to describe the downstream signalling pathway(s), to date only a few ligand-receptor pairs have been identified. Several methods, such as affinity purification and yeast two-hybrid screens, have been used very successfully to elucidate interactions between soluble proteins, but most of these methods cannot be applied to membranous proteins. Experimental obstacles such as low concentration and poor solubility of membrane receptors, as well as instable transient interactions, often hamper the use of these 'classical' approaches. However, over the last few years, a lot of progress has been made to overcome these problems by combining classical techniques with new methodologies. In the present article, we review the most promising recent methods in identifying cell-surface receptor interactions, with an emphasis on success stories outside the field of plant research.

  2. On structural identifiability analysis of the cascaded linear dynamic systems in isotopically non-stationary 13C labelling experiments.

    Science.gov (United States)

    Lin, Weilu; Wang, Zejian; Huang, Mingzhi; Zhuang, Yingping; Zhang, Siliang

    2018-06-01

    The isotopically non-stationary 13C labelling experiment, as an emerging experimental technique, can estimate the intracellular fluxes of a cell culture during an isotopic transient period. However, to the best of our knowledge, the issue of the structural identifiability analysis of non-stationary isotope experiments is not well addressed in the literature. In this work, the local structural identifiability analysis for non-stationary cumomer balance equations is conducted based on the Taylor series approach. The numerical rank of the Jacobian matrices of the finite extended time derivatives of the measured fractions with respect to the free parameters is taken as the criterion. It turns out that a single time point is sufficient to achieve the structural identifiability analysis of the cascaded linear dynamic system of non-stationary isotope experiments. The equivalence between the local structural identifiability of the cascaded linear dynamic systems and the local optimum condition of the nonlinear least squares problem is elucidated in the work. Optimal measurement sets can then be determined for the metabolic network. Two simulated metabolic networks are adopted to demonstrate the utility of the proposed method. Copyright © 2018 Elsevier Inc. All rights reserved.
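
    A hedged sketch of the criterion on a toy two-pool cascade (not one of the paper's networks): stack the Taylor coefficients of the measured output at t = 0 and test the rank of their Jacobian with respect to the free parameters.

```python
import sympy as sp

v1, v2 = sp.symbols('v1 v2', positive=True)
x1, x2 = sp.symbols('x1 x2')

# Toy cascaded linear labelling system (fully labelled input u = 1):
#   dx1/dt = v1*(1 - x1),  dx2/dt = v2*(x1 - x2),  measured output y = x2
f = {x1: v1 * (1 - x1), x2: v2 * (x1 - x2)}

def lie(expr):
    """One time-derivative of expr along the vector field f."""
    return sum(sp.diff(expr, s) * f[s] for s in (x1, x2))

# Successive Taylor coefficients of y at t = 0, with x1(0) = x2(0) = 0
coeffs, expr = [], x2
for _ in range(3):
    expr = lie(expr)
    coeffs.append(expr.subs({x1: 0, x2: 0}))

J = sp.Matrix(coeffs).jacobian(sp.Matrix([v1, v2]))
print('rank =', J.rank())   # rank 2 -> (v1, v2) locally structurally identifiable
```

    For this toy system the Jacobian has full rank whenever v1 ≠ v2, so both fluxes are locally identifiable from a single output measured around one time point, which is the flavour of the paper's result.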

  3. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…

  4. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  5. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  6. A Technique of Software Safety Analysis in the Design Phase for PLC Based Safety-Critical Systems

    International Nuclear Information System (INIS)

    Koo, Seo-Ryong; Kim, Chang-Hwoi

    2017-01-01

    The purpose of safety analysis, which is a method of identifying portions of a system that have the potential for unacceptable hazards, is firstly to encourage design changes that will reduce or eliminate hazards and, secondly, to conduct special analyses and tests that can provide increased confidence in especially vulnerable portions of the system. For the design and implementation phases of PLC based systems, we proposed a technique for software design specification and analysis that enables us to generate software design specifications (SDSs) in nuclear fields. For the safety analysis in the design phase, we used the architecture design blocks of NuFDS to represent the architecture of the software. On the basis of the architecture design specification, we can directly generate the fault tree and then use the fault tree for qualitative analysis. We therefore proposed a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Through the fault tree synthesis proposed in this work, users can use the architecture specification of the NuFDS approach to intuitively compose fault trees that help analyze the safety design features of software.
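
    The NuFDS template itself is not reproduced in the record; as an illustration of what a synthesized fault tree buys you downstream, the sketch below evaluates a small tree through its minimal cut sets under the rare-event approximation (the module names and probabilities are invented).

```python
# Basic-event probabilities for a hypothetical software architecture module
basic = {'input_fault': 1e-3, 'logic_fault': 5e-4,
         'output_fault': 1e-3, 'watchdog_fail': 1e-2}

# TOP = output_fault OR input_fault OR (logic_fault AND watchdog_fail)
cut_sets = [('output_fault',), ('input_fault',),
            ('logic_fault', 'watchdog_fail')]

def cut_set_prob(cs):
    """Probability of a minimal cut set, assuming independent basic events."""
    p = 1.0
    for event in cs:
        p *= basic[event]
    return p

# Rare-event approximation: P(TOP) ~= sum over minimal cut sets
p_top = sum(cut_set_prob(cs) for cs in cut_sets)
print(f"P(TOP) ~= {p_top:.2e}")
```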

  7. Using text-mining techniques in electronic patient records to identify ADRs from medicine use.

    Science.gov (United States)

    Warrer, Pernille; Hansen, Ebba Holme; Juhl-Jensen, Lars; Aagaard, Lise

    2012-05-01

    This literature review included studies that use text-mining techniques in narrative documents stored in electronic patient records (EPRs) to investigate ADRs. We searched PubMed, Embase, Web of Science and International Pharmaceutical Abstracts without restrictions from origin until July 2011. We included empirically based studies on text mining of electronic patient records (EPRs) that focused on detecting ADRs, excluding those that investigated adverse events not related to medicine use. We extracted information on study populations, EPR data sources, frequencies and types of the identified ADRs, medicines associated with ADRs, text-mining algorithms used and their performance. Seven studies, all from the United States, were eligible for inclusion in the review. Studies were published from 2001, the majority between 2009 and 2010. Text-mining techniques varied over time from simple free text searching of outpatient visit notes and inpatient discharge summaries to more advanced techniques involving natural language processing (NLP) of inpatient discharge summaries. Performance appeared to increase with the use of NLP, although many ADRs were still missed. Due to differences in study design and populations, various types of ADRs were identified and thus we could not make comparisons across studies. The review underscores the feasibility and potential of text mining to investigate narrative documents in EPRs for ADRs. However, more empirical studies are needed to evaluate whether text mining of EPRs can be used systematically to collect new information about ADRs. © 2011 The Authors. British Journal of Clinical Pharmacology © 2011 The British Pharmacological Society.

  8. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule data mining technique…

  9. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  10. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
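
    To make the flavor of the proposed domains concrete, here is a minimal sketch computing a few representative variability measures from the statistical and informational domains on a synthetic series; the example data and the particular choice of measures are illustrative, not drawn from the review.

```python
# Representative variability measures sketched with NumPy on a synthetic
# R-R interval series (values are illustrative, not clinical data).
import numpy as np

rr = np.array([0.80, 0.82, 0.79, 0.85, 0.81, 0.83, 0.78, 0.84])  # seconds

sd = rr.std(ddof=1)                          # statistical: standard deviation
cv = sd / rr.mean()                          # statistical: coefficient of variation
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # statistical: successive differences

# informational: Shannon entropy of a coarse histogram of the series
counts, _ = np.histogram(rr, bins=4)
p = counts[counts > 0] / counts.sum()
shannon = -np.sum(p * np.log2(p))

print(f"SD={sd:.4f}s  CV={cv:.3f}  RMSSD={rmssd:.4f}s  H={shannon:.3f} bits")
```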

  11. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis

  12. Cellular signaling identifiability analysis: a case study.

    Science.gov (United States)

    Roper, Ryan T; Pia Saccomani, Maria; Vicini, Paolo

    2010-05-21

    Two primary purposes for mathematical modeling in cell biology are (1) simulation for making predictions of experimental outcomes and (2) parameter estimation for drawing inferences from experimental data about unobserved aspects of biological systems. While the former purpose has become common in the biological sciences, the latter is less common, particularly when studying cellular and subcellular phenomena such as signaling, the focus of the current study. Data are difficult to obtain at this level. Therefore, even models of only modest complexity can contain parameters for which the available data are insufficient for estimation. In the present study, we use a set of published cellular signaling models to address issues related to global parameter identifiability. That is, we address the following question: assuming known time courses for some model variables, which parameters is it theoretically impossible to estimate, even with continuous, noise-free data? Following an introduction to this problem and its relevance, we perform a full identifiability analysis on a set of cellular signaling models using DAISY (Differential Algebra for the Identifiability of SYstems). We use our analysis to bring to light important issues related to parameter identifiability in ordinary differential equation (ODE) models. We contend that this is, as of yet, an under-appreciated issue in biological modeling and, more particularly, cell biology. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  13. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
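
    The computational difference between the two protocols can be sketched in a few lines, under the common convention of splitting the importance-performance grid at the grand means; the attribute names and ratings below are hypothetical, not from the two research projects.

```python
# Side-by-side gap scores (GA: performance minus importance) and
# importance-performance (IP) quadrants for hypothetical service attributes.
import numpy as np

attributes  = ["cleanliness", "signage", "staff helpfulness", "parking"]
importance  = np.array([4.6, 3.8, 4.4, 3.2])   # mean ratings on a 1-5 scale
performance = np.array([4.1, 4.0, 3.6, 3.5])

gaps = performance - importance                 # negative gap = underperformance

# IP analysis: split at the grand means to form the classic four quadrants
imp_cut, perf_cut = importance.mean(), performance.mean()
for name, imp, perf, gap in zip(attributes, importance, performance, gaps):
    quadrant = ("concentrate here" if imp >= imp_cut and perf < perf_cut else
                "keep up the good work" if imp >= imp_cut else
                "possible overkill" if perf >= perf_cut else
                "low priority")
    print(f"{name:18s} gap={gap:+.1f}  IP quadrant: {quadrant}")
```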

  14. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  15. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
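
    The modular idea, solving a Markov submodule and feeding its result back into the combinatorial level, can be sketched as follows. The warm-spare configuration, failure rates and mission time are hypothetical and not taken from the paper; only the overall pattern of the modular solution strategy is illustrated.

```python
# Solving one Markov submodule of a modularized fault tree: a warm-spare
# pair whose spare fails in standby at a reduced rate. States: 0 = both up,
# 1 = one up, 2 = module failed (absorbing). Rates are hypothetical.
import numpy as np
from scipy.linalg import expm

lam, lam_spare = 1e-4, 2e-5          # active and standby failure rates (per hour)
Q = np.array([
    [-(lam + lam_spare), lam + lam_spare, 0.0],
    [0.0,               -lam,             lam],
    [0.0,                0.0,             0.0],
])                                   # generator matrix of the CTMC

t_mission = 10_000.0                 # mission time in hours
p0 = np.array([1.0, 0.0, 0.0])
p_t = p0 @ expm(Q * t_mission)       # transient state probabilities
print(f"P(module failed by t={t_mission:.0f} h) = {p_t[2]:.4e}")
# This scalar probability can then feed the enclosing combinatorial fault
# tree as a basic-event probability, in the spirit of the modular approach.
```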

  16. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
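
    A minimal NumPy sketch of double random phase encoding, and of the kind of decryption-error measurement a key-space sweep relies on, is given below; the test image, masks and perturbation size are arbitrary stand-ins, not the problem instances analyzed in the paper.

```python
# Double random phase encoding (DRPE) sketch with NumPy FFTs: encrypt with
# two random phase masks, then measure the decryption error for a slightly
# wrong key, as one point of a key-space sweep would.
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))                            # stand-in for a test image

phase1 = np.exp(2j * np.pi * rng.random(img.shape))   # input-plane mask
phase2 = np.exp(2j * np.pi * rng.random(img.shape))   # Fourier-plane mask (the key)

encrypted = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

def decrypt(cipher, key):
    return np.abs(np.fft.ifft2(np.fft.fft2(cipher) * np.conj(key)))

good = decrypt(encrypted, phase2)                     # correct key
bad_key = phase2 * np.exp(2j * np.pi * 0.05 * rng.random(img.shape))
bad = decrypt(encrypted, bad_key)                     # perturbed key

mse = lambda a, b: np.mean((a - b) ** 2)
print(f"MSE with correct key:   {mse(img, good):.2e}")
print(f"MSE with perturbed key: {mse(img, bad):.2e}")
```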

  17. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    Full Text Available This article presents a technique for the statistical analysis of a region's investment appeal with respect to foreign direct investment. A definition of the statistical-analysis technique is given, the stages of the analysis are described, and the mathematical-statistical tools are considered.

  18. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized

  19. Vibration impact acoustic emission technique for identification and analysis of defects in carbon steel tubes: Part A Statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-04-15

    Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are manually done by skilled personnel. This paper presents a statistical analysis of high frequency stress wave signals captured from a newly developed noninvasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals have been introduced into the ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube as the reference tube and four steel tubes with a through-hole artificial defect at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of defects in carbon steel tubes using AE signals captured using the non-invasive VIAE technique.
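
    The statistical features named in this abstract are straightforward to compute; the sketch below does so on a synthetic decaying burst, since the actual VIAE waveforms are of course not available here, and the sampling rate and pulse parameters are invented.

```python
# Extracting the statistical AE features named above (r.m.s., energy, crest
# factor) plus peak amplitude and count from a synthetic burst signal.
import numpy as np

fs = 1_000_000                                   # 1 MHz sampling, hypothetical
t = np.arange(0, 2e-3, 1 / fs)
signal = np.exp(-t / 4e-4) * np.sin(2 * np.pi * 150e3 * t)  # decaying AE burst

peak = np.max(np.abs(signal))
rms = np.sqrt(np.mean(signal ** 2))
energy = np.sum(signal ** 2) / fs                # integral of squared amplitude
crest_factor = peak / rms                        # high values flag bursty events

threshold = 0.1 * peak                           # count = threshold crossings
count = np.sum((signal[:-1] < threshold) & (signal[1:] >= threshold))

print(f"peak={peak:.3f}  rms={rms:.4f}  energy={energy:.3e}  "
      f"crest={crest_factor:.2f}  count={count}")
```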

  20. Using text-mining techniques in electronic patient records to identify ADRs from medicine use

    DEFF Research Database (Denmark)

    Warrer, Pernille; Hansen, Ebba Holme; Jensen, Lars Juhl

    2012-01-01

    This literature review included studies that use text-mining techniques in narrative documents stored in electronic patient records (EPRs) to investigate ADRs. We searched PubMed, Embase, Web of Science and International Pharmaceutical Abstracts without restrictions from origin until July 2011. We...... included empirically based studies on text mining of electronic patient records (EPRs) that focused on detecting ADRs, excluding those that investigated adverse events not related to medicine use. We extracted information on study populations, EPR data sources, frequencies and types of the identified ADRs......, medicines associated with ADRs, text-mining algorithms used and their performance. Seven studies, all from the United States, were eligible for inclusion in the review. Studies were published from 2001, the majority between 2009 and 2010. Text-mining techniques varied over time from simple free text...

  1. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    Science.gov (United States)

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.
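
    A sketch of the clustering step follows, assuming the open-source RDKit toolkit (which the abstract does not name): Morgan fingerprints and Tanimoto-distance Butina clustering group hits into candidate series. The SMILES strings are arbitrary illustrations, and real hit lists would be orders of magnitude larger.

```python
# Organizing hits into series with RDKit (assumed available): Morgan
# fingerprints -> Tanimoto distances -> Butina clustering.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from rdkit.ML.Cluster import Butina

smiles = ["c1ccccc1O", "c1ccccc1N", "CCOC(=O)c1ccccc1", "CCN(CC)CC", "CCCCN"]
mols = [Chem.MolFromSmiles(s) for s in smiles]
fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]

# condensed lower-triangle distance list, as Butina.ClusterData expects
dists = []
for i in range(1, len(fps)):
    for j in range(i):
        dists.append(1.0 - DataStructs.TanimotoSimilarity(fps[i], fps[j]))

clusters = Butina.ClusterData(dists, len(fps), distThresh=0.6, isDistData=True)
for k, members in enumerate(clusters):
    print(f"series {k}: {[smiles[i] for i in members]}")
```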

  2. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signature is described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items: meat, dairy, fruits...

  3. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  4. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, which may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process. We call them the 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  5. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, which may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process. We call them the 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  6. The palisade cartilage tympanoplasty technique: a systematic review and meta-analysis.

    Science.gov (United States)

    Jeffery, Caroline C; Shillington, Cameron; Andrews, Colin; Ho, Allan

    2017-06-17

    Tympanoplasty is a common procedure performed by otolaryngologists. Many types of autologous grafts and variations of technique have been used, with varying results. This is the first systematic review of the literature and meta-analysis aiming to evaluate the effectiveness of a technique that is gaining popularity: palisade cartilage tympanoplasty. PubMed, EMBASE, and Cochrane databases were searched for "palisade", "cartilage", "tympanoplasty", "perforation" and their synonyms. In total, 199 articles reporting results of palisade cartilage tympanoplasty were identified. Five articles satisfied the following inclusion criteria: adult patients, minimum 6 months follow-up, hearing and surgical outcomes reported. Studies with patients undergoing combined mastoidectomy, ossicular chain reconstruction, and/or other middle ear surgery were excluded. Perforation closure, rate of complications, and post-operative pure-tone average change were extracted for pooled analysis. Study failure and complication proportions that were used to generate odds ratios were pooled. Fixed effects and random effects weightings were generated. The resulting pooled odds ratios are reported. Palisade cartilage tympanoplasty has an overall take rate of 96% beyond 6 months and has similar odds of complications compared to temporalis fascia (OR 0.89, 95% CI 0.62, 1.30). The air-bone gap closure is statistically similar to reported results from temporalis fascia tympanoplasty. Cartilage palisade tympanoplasty offers excellent graft take rates and good postoperative hearing outcomes for perforations of various sizes and for both primary and revision cases. This technique has predictable, long-term results with low complication rates, similar to temporalis fascia tympanoplasty.
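
    Fixed-effect inverse-variance pooling on the log odds ratio scale is the standard machinery behind a pooled estimate like the OR of 0.89 reported above; a minimal sketch follows, where the per-study odds ratios and confidence limits are placeholders rather than the five included studies.

```python
# Inverse-variance fixed-effect pooling of study odds ratios on the log
# scale. The per-study values are made-up placeholders.
import numpy as np

odds_ratios = np.array([0.80, 1.10, 0.75, 0.95, 0.90])
ci_upper    = np.array([1.60, 2.00, 1.40, 1.80, 1.70])   # upper 95% limits

log_or = np.log(odds_ratios)
se = (np.log(ci_upper) - log_or) / 1.96     # SE recovered from the 95% CI
w = 1.0 / se ** 2                           # inverse-variance weights

pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f}, {np.exp(hi):.2f})")
```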

  7. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in analysis, preservation, and documentation of art works and artifacts are described with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  8. Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.

    Science.gov (United States)

    Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q

    2018-04-01

    Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.

  9. Identifying target processes for microbial electrosynthesis by elementary mode analysis.

    Science.gov (United States)

    Kracke, Frauke; Krömer, Jens O

    2014-12-30

    Microbial electrosynthesis and electro fermentation are techniques that aim to optimize microbial production of chemicals and fuels by regulating the cellular redox balance via interaction with electrodes. While the concept has been known for decades, major knowledge gaps remain that make it hard to evaluate its biotechnological potential. Here we present an in silico approach to identify beneficial production processes for electro fermentation by elementary mode analysis. Since the fundamentals of electron transport between electrodes and microbes have not been fully uncovered yet, we propose different options and discuss their impact on biomass and product yields. For the first time 20 different valuable products were screened for their potential to show increased yields during anaerobic electrically enhanced fermentation. Surprisingly, we found that an increase in product formation by electrical enhancement is not necessarily dependent on the degree of reduction of the product but rather on the metabolic pathway it is derived from. We present a variety of beneficial processes with product yield increases of up to 36% in reductive and 84% in oxidative fermentations and final theoretical product yields up to 100%. This includes compounds that are already produced at industrial scale such as succinic acid, lysine and diaminopentane as well as potential novel bio-commodities such as isoprene, para-hydroxybenzoic acid and para-aminobenzoic acid. Furthermore, it is shown that the mode of electron transport has a major impact on achievable biomass and product yields. The coupling of electron transport to energy conservation could be identified as crucial for most processes. This study introduces a powerful tool to determine beneficial substrate and product combinations for electro-fermentation. It also highlights that the maximal yield achievable by bio electrochemical techniques depends strongly on the actual electron transport mechanisms. Therefore it is of great importance to
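
    As a toy first step toward elementary modes, the sketch below computes a rational basis of the right nullspace of a three-reaction stoichiometric matrix with SymPy; full elementary-mode enumeration additionally imposes irreversibility constraints and support minimality, and the network here is invented, not taken from the study.

```python
# Right nullspace of a toy stoichiometric matrix S (rows = metabolites,
# columns = reactions): every nullspace vector v satisfies S v = 0, i.e.
# it is a steady-state flux distribution, the raw material of elementary
# mode analysis. Network is hypothetical: r1: A->B, r2: B->C, r3: A->C.
from sympy import Matrix

S = Matrix([
    [-1,  0, -1],   # A
    [ 1, -1,  0],   # B
    [ 0,  1,  1],   # C
])

for v in S.nullspace():
    print(v.T)      # e.g. [1, 1, -1]: run r1+r2 forward, r3 in reverse
```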

  10. Rapid nuclear forensics analysis via laser based microphotonic techniques coupled with chemometrics

    International Nuclear Information System (INIS)

    Bhatta, B.; Kalambuka, H.A.; Dehayem-Kamadjeu, A.

    2017-01-01

    Nuclear forensics (NF) is an important tool for analysis and attribution of nuclear and radiological materials (NRM) in support of nuclear security. The critical challenge in NF currently is the lack of suitable microanalytical methodologies for direct, rapid and minimally-invasive detection and quantification of NF signatures. Microphotonic techniques can achieve this task particularly when the materials are of limited size and under concealed condition. The purpose of this paper is to demonstrate the combined potential of chemometrics enabled LIBS and laser Raman spectromicroscopy (LRS) for rapid NF analysis and attribution. Using LIBS, uranium lines at 385.464 nm, 385.957 nm and 386.592 nm were identified as NF signatures in uranium ore surrogates. A multivariate calibration strategy using artificial neural network was developed for quantification of trace uranium. Principal component analysis (PCA) of LIBS spectra achieved source attribution of the ores. LRS studies on UCl3, UO3(NO3)2.6H2O, UO2SO4.3H2O and UO3 in pellet state identified the bands associated with different uranium molecules as varying in the range of (840 to 867) ± 15 cm-1. Using this signature, we have demonstrated spectral imaging of uranium under concealed conditions (author)
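
    A sketch of the two chemometric steps named in this record follows, with scikit-learn assumed as a stand-in for the authors' tools and synthetic Gaussian-peak spectra standing in for LIBS data; nothing below reproduces the study's actual calibration or attribution models.

```python
# Two chemometric steps on synthetic "LIBS" spectra: an MLP regressor as a
# multivariate calibration for trace uranium, and PCA scores as a source-
# attribution view. All data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n, channels = 60, 200
conc = rng.uniform(0, 100, n)                        # ppm U, synthetic
peaks = np.exp(-0.5 * ((np.arange(channels) - 80) / 3) ** 2)
spectra = np.outer(conc, peaks) + rng.normal(0, 0.5, (n, channels))

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
)
model.fit(spectra[:40], conc[:40])                   # multivariate calibration
print("held-out R^2:", model.score(spectra[40:], conc[40:]))

scores = PCA(n_components=2).fit_transform(spectra)  # attribution view
print("first sample PC scores:", scores[0])
```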

  11. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  12. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    48 CFR 15.404-1 - Proposal analysis techniques. Federal Acquisition Regulations System, Federal Acquisition Regulation (2010-10-01). "... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or..."

  13. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions where binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  14. Noise diagnostic: An advanced technique in Cuba

    International Nuclear Information System (INIS)

    Aguilar, O.

    1992-01-01

    This paper examines the main steps in the implementation of the noise analysis technique in our country since 1988. The review identifies two main areas: improvements of Nuclear Power Plant operational surveillance techniques, and non-nuclear industrial applications. Also reported are some of the ongoing research programs, including projects on noise analysis instrumentation development at the Higher Institute for Nuclear Sciences and Technology

  15. Identify the Effective Wells in Determination of Groundwater Depth in Urmia Plain Using Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Sahar Babaei Hessar

    2017-06-01

    Full Text Available Introduction: Groundwater is the most important resource for providing sanitary water for potable and household consumption, so continuous monitoring of the groundwater level plays an important role in water resource management. However, because of the large amount of information involved, evaluation of the water table is a costly and time-consuming process, and in many studies much of the data is not suitable or useful and must be neglected. PCA is a mathematical method that retains the data contributing the largest share of the variance while recognizing less important data, and reduces the original variables to a few components. In this technique, variation factors called principal components are identified by considering the data structure; the variables that have the highest correlation coefficients with the principal components are extracted as a result of identifying the components that create the greatest variance. Materials and Methods: The study region has an area of approximately 962 km2 and is located between 37º 21´ N to 37º 49´ N and 44º 57´ E to 45º 16´ E in the West Azerbaijan province of Iran. The area lies along the mountainous north-west of the country, ends at the Urmia Lake plain, and has vast groundwater resources. However, the water table has recently been reduced considerably because of excessive exploitation resulting from urbanization and increased agricultural and horticultural land use. In the present study, the annual water table datasets from 51 wells monitored by the Ministry of Energy during the statistical period 2002-2011 were used for data analysis. The PCA technique was used to identify the effective wells for determining the groundwater level. In this research, to compute the relative importance of each well, the 10 nearest neighboring wells were identified for each one. The number of wells (p as a general rule must be less or equal to the maximum number of
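
    One plausible reading of the PCA step, sketched with scikit-learn on synthetic data, is to rank wells by their weighted squared loadings on the leading components. The matrix dimensions follow the abstract (ten annual values, 51 wells); everything else below is invented for illustration.

```python
# PCA-based ranking sketch: wells whose depth series load most heavily on
# the leading principal components are flagged as the most informative.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
years, wells = 10, 51                        # 2002-2011 annual levels, 51 wells
trend = np.linspace(0, -3, years)[:, None]   # shared decline of the water table
X = trend + rng.normal(0, 0.3, (years, wells))

pca = PCA(n_components=3).fit(X)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))

# contribution of each well = squared loadings weighted by explained variance
contrib = (pca.components_ ** 2 * pca.explained_variance_ratio_[:, None]).sum(axis=0)
top = np.argsort(contrib)[::-1][:5]
print("five most informative wells (indices):", top)
```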

  16. Data management and data analysis techniques in pharmacoepidemiological studies using a pre-planned multi-database approach : a systematic literature review

    NARCIS (Netherlands)

    Bazelier, Marloes T; Eriksson, Irene; de Vries, Frank; Schmidt, Marjanka K; Raitanen, Jani; Haukka, Jari; Starup-Linde, Jakob; De Bruin, Marie L; Andersen, Morten

    2015-01-01

    PURPOSE: To identify pharmacoepidemiological multi-database studies and to describe data management and data analysis techniques used for combining data. METHODS: Systematic literature searches were conducted in PubMed and Embase complemented by a manual literature search. We included

  17. Biomonitoring of air pollution in Jamaica through trace-element analysis of epiphytic plants using nuclear and related analytical techniques

    International Nuclear Information System (INIS)

    Vutchkov, Mitko

    2001-01-01

    The main goal of the Coordinated Research Project (No:9937/R0), entitled 'Biomonitoring of Air Pollution in Jamaica Through Trace-Element Analysis of Epiphytic Plants Using Nuclear and Related Analytical Techniques', is to identify and validate site-specific epiphytic plants for biomonitoring the atmospheric pollution in Jamaica using nuclear analytical techniques at the International Centre for Environmental and Nuclear Sciences (ICENS). The specific objectives for the second year of the project were: Development of an SOP for sampling epiphytic plants in Jamaica; Sampling design and sample collection; Sample preparation and analysis; Development of an in-house SRM and participation in the NAT-5 inter-laboratory study; Data analysis and interpretation of the results; Development of a work plan for the third year of the project

  18. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  19. Low level radioactivity measurements with phoswich detectors using coincident techniques and digital pulse processing analysis.

    Science.gov (United States)

    de la Fuente, R; de Celis, B; del Canto, V; Lumbreras, J M; de Celis Alonso, B; Martín-Martín, A; Gutierrez-Villanueva, J L

    2008-10-01

    A new system has been developed for the detection of low radioactivity levels of fission products and actinides using coincidence techniques. The device combines a phoswich detector for alpha/beta/gamma-ray recognition with a fast digital card for electronic pulse analysis. The phoswich can be used in a coincident mode by identifying the composed signal produced by the simultaneous detection of alpha/beta particles and X-rays/gamma particles. The technique of coincidences with phoswich detectors was proposed recently to verify the Nuclear Test Ban Treaty (NTBT) which established the necessity of monitoring low levels of gaseous fission products produced by underground nuclear explosions. With the device proposed here it is possible to identify the coincidence events and determine the energy and type of coincident particles. The sensitivity of the system has been improved by employing liquid scintillators and a high resolution low energy germanium detector. In this case it is possible to identify simultaneously by alpha/gamma coincidence transuranic nuclides present in environmental samples without necessity of performing radiochemical separation. The minimum detectable activity was estimated to be 0.01 Bq kg(-1) for 0.1 kg of soil and 1000 min counting.
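
    One common way to recognize the composed pulses described here is charge-comparison pulse shape discrimination, sketched below on synthetic waveforms; the decay times, sampling rate and split point are invented and do not describe the authors' phoswich or digitizer.

```python
# Charge-comparison pulse shape discrimination sketch (NumPy): the ratio of
# the tail integral to the total integral separates a single fast component
# from a composed fast-plus-slow coincidence pulse. All parameters invented.
import numpy as np

fs = 250e6                                     # 250 MS/s digitizer, hypothetical
t = np.arange(0, 2e-6, 1 / fs)

def pulse(tau):
    return np.exp(-t / tau)                    # idealized single-component pulse

fast = pulse(30e-9)                            # fast scintillator component only
composed = pulse(30e-9) + 0.8 * pulse(400e-9)  # coincident fast + slow light

def tail_ratio(p, split=100e-9):
    i = int(split * fs)                        # samples before the split point
    return p[i:].sum() / p.sum()

for name, p in [("fast only", fast), ("coincident", composed)]:
    print(f"{name:11s} tail/total = {tail_ratio(p):.3f}")
```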

  20. Low level radioactivity measurements with phoswich detectors using coincident techniques and digital pulse processing analysis

    International Nuclear Information System (INIS)

    Fuente, R. de la; Celis, B. de; Canto, V. del; Lumbreras, J.M.; Celis, Alonso B. de; Martin-Martin, A.; Gutierrez-Villanueva, J.L.

    2008-01-01

    A new system has been developed for the detection of low radioactivity levels of fission products and actinides using coincidence techniques. The device combines a phoswich detector for α/β/γ-ray recognition with a fast digital card for electronic pulse analysis. The phoswich can be used in a coincident mode by identifying the composed signal produced by the simultaneous detection of α/β particles and X-rays/γ particles. The technique of coincidences with phoswich detectors was proposed recently to verify the Nuclear Test Ban Treaty (NTBT) which established the necessity of monitoring low levels of gaseous fission products produced by underground nuclear explosions. With the device proposed here it is possible to identify the coincidence events and determine the energy and type of coincident particles. The sensitivity of the system has been improved by employing liquid scintillators and a high resolution low energy germanium detector. In this case it is possible to identify simultaneously by α/γ coincidence transuranic nuclides present in environmental samples without necessity of performing radiochemical separation. The minimum detectable activity was estimated to be 0.01 Bq kg -1 for 0.1 kg of soil and 1000 min counting

  1. Neutron activation analysis techniques for identifying elemental status in Alzheimer's disease

    International Nuclear Information System (INIS)

    Ward, N.I.; Mason, J.A.

    1986-01-01

    Brain tissue (hippocampus and cerebral cortex) from Alzheimer's disease and control individuals sampled from Eastern Canada and the United Kingdom were analyzed for Ag, Al, As, B, Br, Ca, Cd, Co, Cr, Cs, Cu, Fe, Hg, I, K, La, Mg, Mn, Mo, Ni, Rb, S, Sb, Sc, Se, Si, Sn, Sr, Ti, V and Zn. Neutron activation analysis (thermal and prompt gamma-ray) methods were used. Very highly significant differences (S**: probability less than 0.005) for both study areas were shown between Alzheimer's disease (AD) and control (C) individuals: AD > C for Al, Br, Ca and S, and AD < C for Se, V and Zn. Aluminium content of brain tissue ranged from 3.605 to 21.738 μg/g d.w. (AD) and 0.379 to 4.768 μg/g d.w. (C). No statistical evidence of aluminium accumulation with age was noted. Possible zinc deficiency (especially for hippocampal tissue) was observed for Alzheimer's disease patients, with zinc ranges of 31.42 to 57.91 μg/g d.w. (AD) and 37.31 to 87.10 μg/g d.w. (C). (author)

  2. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis.

    Science.gov (United States)

    van Genugten, Lenneke; Dusseldorp, Elise; Massey, Emma K; van Empelen, Pepijn

    2017-03-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known about the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary prevention interventions on mental wellbeing in adolescents. Forty interventions were included in the analyses. Techniques were coded into nine categories of SRTs. Meta-analyses were conducted to identify the effectiveness of SRTs, examining three different outcomes: internalising behaviour, externalising behaviour, and self-esteem. Primary interventions had a small-to-medium effect (effect sizes 0.16-0.29) on self-esteem and internalising behaviour. Secondary interventions had a medium-to-large short-term effect (average effect size 0.56) on internalising behaviour and self-esteem. In secondary interventions, interventions including asking for social support (95% confidence interval, CI = 1.11-1.98) had a greater effect on internalising behaviour. Interventions including monitoring and evaluation had a greater effect on self-esteem (95% CI = 0.21-0.57). For primary interventions, there was not a single SRT that was associated with a greater intervention effect on internalising behaviour or self-esteem. No effects were found for externalising behaviours. Self-regulation interventions are moderately effective at improving mental wellbeing among adolescents. Secondary interventions promoting 'asking for social support' and promoting 'monitoring and evaluation' were associated with improved outcomes. More research is needed to identify other SRTs or combinations of SRTs that could improve understanding or optimise mental wellbeing interventions.

  3. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade and now predictions from this type of analysis are important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast, frees the analyst from the routine hardware-oriented fault tree construction, and eliminates logic errors and errors of oversight in this part of the analysis. Automated analysis then affords the analyst a powerful tool to allow his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.
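
    The routine, hardware-oriented part of such an analysis reduces to propagating basic-event probabilities through logic gates; a minimal recursive evaluator is sketched below, with an invented tree and probabilities and the usual independence assumption.

```python
# Minimal fault-tree evaluation: basic-event probabilities propagate through
# AND/OR gates assuming independent events. Tree and numbers are illustrative.
def evaluate(node, p):
    """node is ('basic', name) or (gate, [children]) with gate in {'AND','OR'}."""
    kind = node[0]
    if kind == "basic":
        return p[node[1]]
    probs = [evaluate(child, p) for child in node[1]]
    if kind == "AND":
        out = 1.0
        for q in probs:
            out *= q
        return out
    # OR gate: 1 minus the product of complements
    out = 1.0
    for q in probs:
        out *= 1.0 - q
    return 1.0 - out

p = {"pump_fails": 1e-3, "valve_sticks": 5e-4, "power_loss": 2e-4}
top = ("OR", [("AND", [("basic", "pump_fails"), ("basic", "valve_sticks")]),
              ("basic", "power_loss")])
print(f"P(top event) = {evaluate(top, p):.3e}")
```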

  4. Quantitative mineralogical analysis of sandstones using x-ray diffraction techniques

    International Nuclear Information System (INIS)

    Ward, C.R.; Taylor, J.C.

    1999-01-01

    Full text: X-ray diffraction has long been used as a definitive technique for mineral identification based on measuring the internal atomic or crystal structures present in powdered rocks, soils and other mineral mixtures. Recent developments in data gathering and processing, however, have provided an improved basis for its use as a quantitative tool, determining not only the nature of the minerals but also the relative proportions of the different minerals present. The mineralogy of a series of sandstone samples from the Sydney and Bowen Basins of eastern Australia has been evaluated by X-ray diffraction (XRD) on a quantitative basis using the Australian-developed SIROQUANT data processing technique. Based on Rietveld principles, this technique generates a synthetic X-ray diffractogram by adjusting and combining full-profile patterns of minerals nominated as being present in the sample, and interactively matches the synthetic diffractogram, under operator instructions, to the observed diffractogram of the sample being analysed. The individual mineral patterns may be refined in the process, to allow for variations in crystal structure of individual components or for factors such as preferred orientation in the sample mount. The resulting output provides mass percentages of the different minerals in the mixture, and an estimate of the error associated with each individual percentage determination. The chemical composition of the mineral mixtures indicated by SIROQUANT for each individual sandstone studied was estimated using a spreadsheet routine, and the indicated proportion of each oxide in each sample was compared to the actual chemical analysis of the same sandstone as determined independently by X-ray fluorescence spectrometry. The results show a high level of agreement for all major chemical constituents, indicating consistency between the SIROQUANT XRD data and the whole-rock chemical composition. Supplementary testing with a synthetic corundum spike further

  5. Structural parameter identifiability analysis for dynamic reaction networks

    DEFF Research Database (Denmark)

    Davidescu, Florin Paul; Jørgensen, Sten Bay

    2008-01-01

    This contribution addresses the structural parameter identifiability problem for the typical case of reaction network models, where for a given set of measured variables it is desirable to investigate which parameters may be estimated prior to spending computational effort on the actual estimation. The proposed analysis is performed in two phases. The first phase determines the structurally identifiable reaction rates based on reaction network stoichiometry. The second phase assesses the structural parameter identifiability of the specific kinetic rate expressions using a generating series expansion method based on Lie derivatives. The proposed systematic two-phase methodology is illustrated on a mass action based model for an enzymatically catalyzed reaction pathway network where only a limited set of variables is measured. The methodology clearly pinpoints the structurally identifiable parameters.

  6. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistic based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed

  7. Nonlinear analysis techniques of block masonry walls in nuclear power plants

    International Nuclear Information System (INIS)

    Hamid, A.A.; Harris, H.G.

    1986-01-01

    Concrete masonry walls have been used extensively in nuclear power plants as non-load bearing partitions serving as pipe supports, fire walls, radiation shielding barriers, and similar heavy construction separations. When subjected to earthquake loads, these walls should maintain their structural integrity. However, some of the walls do not meet design requirements based on working stress allowables. Consequently, utilities have used non-linear analysis techniques, such as the arching theory and the energy balance technique, to qualify such walls. This paper presents a critical review of the applicability of non-linear analysis techniques for both unreinforced and reinforced block masonry walls under seismic loading. These techniques are critically assessed in light of the performance of walls from limited available test data. It is concluded that additional test data are needed to justify the use of nonlinear analysis techniques to qualify block walls in nuclear power plants. (orig.)

  8. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  9. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  10. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse circumstances of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries.

  11. Task Oriented Evaluation of Module Extraction Techniques

    Science.gov (United States)

    Palmisano, Ignazio; Tamma, Valentina; Payne, Terry; Doran, Paul

    Ontology Modularization techniques identify coherent and often reusable regions within an ontology. The ability to identify such modules, thus potentially reducing the size or complexity of an ontology for a given task or set of concepts is increasingly important in the Semantic Web as domain ontologies increase in terms of size, complexity and expressivity. To date, many techniques have been developed, but evaluation of the results of these techniques is sketchy and somewhat ad hoc. Theoretical properties of modularization algorithms have only been studied in a small number of cases. This paper presents an empirical analysis of a number of modularization techniques, and the modules they identify over a number of diverse ontologies, by utilizing objective, task-oriented measures to evaluate the fitness of the modules for a number of statistical classification problems.

  12. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  14. Techniques for the identification of corrosion products

    International Nuclear Information System (INIS)

    Ramanathan, L.V.

    1988-12-01

    This paper presents the different techniques that can be used to identify corrosion/oxidation products through determination of either their composition or their structure. Chemical analysis and spectrochemical analysis are commonly used to determine the composition of gross corrosion products. Surface analysis techniques such as electron microprobe, AES, ESCA, SIMS, ISS, neutron activation analysis, etc., can be used not only to detect the concentration of the various elements present, but also to obtain the concentration profiles of these elements through the corrosion products. The structure of corrosion products is normally determined with the aid of either X-ray or electron diffraction techniques. This paper describes the basic principles, typical characteristics, limitations and the types of information that can be obtained from each of the techniques along with some typical examples. (author) [pt

  15. Application of decision tree technique to sensitivity analysis for results of radionuclide migration calculations. Research documents

    International Nuclear Information System (INIS)

    Nakajima, Kunihiko; Makino, Hitoshi

    2005-03-01

    Uncertainties are always present in the parameters used for nuclide migration analysis in a geological disposal system. These uncertainties affect the results of such analyses, e.g., the identification of dominant nuclides. It is very important to identify the parameters with a significant impact on the results, and to investigate their influence, in order to recognize R and D items for the development of the geological disposal system and for understanding system performance. In our study, the decision tree analysis technique was examined as a sensitivity-analysis method for investigating the influence of the parameters and for complementing existing sensitivity analyses. As a result, results obtained from Monte Carlo simulation with parameter uncertainties could be distinguished not only by important parameters but also by their quantitative conditions (e.g., ranges of parameter values). Furthermore, information obtained from the decision tree analysis could be used 1) to categorize the results obtained from the nuclide migration analysis for a given parameter set, and 2) to show the prospective effect of reducing parameter uncertainties on the results. (author)
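
    A sketch of the approach with scikit-learn follows: fit a shallow decision tree to Monte Carlo samples and print its rules, which expose both the influential parameters and the value ranges under which a high-consequence outcome occurs. The parameter names and the toy response model are hypothetical, not the report's migration model.

```python
# Decision-tree sensitivity sketch: classify Monte Carlo samples into
# high-dose vs. other outcomes and print the learned parameter splits.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)
n = 2000
log_kd = rng.uniform(-3, 1, n)          # sorption coefficient (log10), invented
log_flow = rng.uniform(-2, 0, n)        # groundwater flow rate (log10), invented
solubility = rng.uniform(0, 1, n)

dose = 10 ** (log_flow - log_kd) * solubility          # toy response model
high_dose = (dose > np.quantile(dose, 0.9)).astype(int)

X = np.column_stack([log_kd, log_flow, solubility])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, high_dose)
print(export_text(tree, feature_names=["log_kd", "log_flow", "solubility"]))
```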

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  17. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    International Nuclear Information System (INIS)

    Tang, A; Samost, A; Viswanathan, A; Cormack, R; Damato, A

    2015-01-01

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning), and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in

  18. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    Energy Technology Data Exchange (ETDEWEB)

    Tang, A; Samost, A [Massachusetts Institute of Technology, Cambridge, Massachusetts (United States); Viswanathan, A; Cormack, R; Damato, A [Dana-Farber Cancer Institute - Brigham and Women’s Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning), and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in

  19. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting from 1989, a new technique known as wavelet transforms (WT) has been applied successfully for analysis of different types of spectra. WT offers certain advantages over Fourier transforms for analysis of signals. A review of using this technique through different fields of elemental analysis is presented
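
    As a small illustration of the kind of spectral application the review surveys, the sketch below denoises a synthetic peaked spectrum by soft-thresholding wavelet detail coefficients with PyWavelets; the wavelet, decomposition level and threshold rule are conventional choices, not taken from the review.

    import numpy as np
    import pywt

    rng = np.random.default_rng(3)
    ch = np.arange(1024)

    # Synthetic spectrum: two Gaussian peaks on a decaying continuum, plus noise.
    clean = (200 * np.exp(-((ch - 300) ** 2) / 50) +
             120 * np.exp(-((ch - 700) ** 2) / 80) +
             50 * np.exp(-ch / 400))
    noisy = rng.poisson(clean).astype(float)

    coeffs = pywt.wavedec(noisy, "sym8", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise estimate (MAD)
    thr = sigma * np.sqrt(2 * np.log(noisy.size))            # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "sym8")[:clean.size]

    print(f"residual RMS vs clean: {np.sqrt(np.mean((denoised - clean) ** 2)):.2f}")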

  20. Quantitative Analysis of Micro-porosity of Eco-material by Using SEM Technique

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ji-ru; LIU Yuan-zhi; LIU Zu-de

    2004-01-01

    Microstructure of the eco-material combining vegetation recovery with slope protection is important for determining plant-growing properties. Several techniques for analyzing the eco-material microstructure are presented, including the freeze-cut-drying method of preparing samples for scanning electron microscopy (SEM), the SEM image processing technique, and the quantifying analysis method for SEM images. The aggregates and pores in SEM images are identified using different mathematical operators, and their effects are compared. The areas of aggregates and pores are obtained using morphological operators, and the influences of different thresholds in image segmentation are also discussed. The results show that the method in which the variation of the non-maximum grey-level gradient is limited improves the effect of edge detection, since only a weak distinction exists at the edges between aggregates and pores in the image. The determination of the threshold should combine the image characteristics with a filling operation so as to assure the precision of the image analysis, in which contact-segmentation is the simplest and most effective method. The results also show that the pore areas in eco-materials are generally larger than those in the corresponding soils, and the increment grows as the soil fabric becomes finer; these differences are related to the admixture of expansive perlite. The morphological operators provide a new method for the image analysis of other porous material microstructures such as soils and concretes.
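
    A minimal sketch of a thresholding-plus-morphology pipeline of this kind, assuming a greyscale SEM image file (the file name is hypothetical) and using Otsu's method in place of the paper's contact-segmentation:

    import numpy as np
    from skimage import io, filters, morphology, measure

    img = io.imread("sem_sample.png", as_gray=True)  # hypothetical file name

    # Segment dark pores from brighter aggregates; the threshold choice is the
    # sensitive step the paper discusses.
    thresh = filters.threshold_otsu(img)
    pores = img < thresh

    # Morphological clean-up, then per-pore area statistics.
    pores = morphology.remove_small_objects(pores, min_size=20)
    labels = measure.label(pores)
    areas = [r.area for r in measure.regionprops(labels)]

    print(f"pore area fraction: {pores.mean():.3f}")
    print(f"pore count: {len(areas)}, mean pore area: {np.mean(areas):.1f} px")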

  1. Analysis technique for controlling system wavefront error with active/adaptive optics

    Science.gov (United States)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
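
    The core of this kind of WFE control is a linear least-squares fit of actuator commands to the measured error through the influence-function matrix. The sketch below uses random stand-ins for that matrix and the disturbance; it is not SigFit's implementation.

    import numpy as np

    rng = np.random.default_rng(1)
    n_nodes, n_act = 500, 24                 # WFE sample points, actuator count

    A = rng.normal(size=(n_nodes, n_act))    # influence functions: WFE per unit command
    w = rng.normal(size=n_nodes)             # measured surface-error disturbance

    # Solve min ||w + A c|| for the correcting actuator commands c.
    c, *_ = np.linalg.lstsq(A, -w, rcond=None)

    residual = w + A @ c
    print(f"RMS WFE before: {w.std():.3f}, after: {residual.std():.3f}")
    # The residual RMS plays the role of the fitting-error estimate mentioned above.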

  2. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, poles and zeros identification. The preliminary general scheme of digital MCA is discussed, as well as some other important techniques about its engineering design. All these lay the foundation of developing homemade digital nuclear spectrometers. (authors)
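
    As a rough illustration of one algorithm named above, the following sketch implements trapezoidal shaping as the difference of two moving sums; the pulse and shaping parameters are illustrative, and the pole-zero compensation stage, also mentioned in the abstract, is omitted.

    import numpy as np

    def trapezoidal(x, k, m):
        """Trapezoidal shaper: difference of two k-sample moving sums k+m apart.

        k is the rise time and m the flat-top width, in samples. Pole-zero
        compensation for an exponentially decaying pulse is omitted here.
        """
        c = np.concatenate(([0.0], np.cumsum(x)))
        s = c[k:] - c[:-k]               # s[p] = sum of x[p : p+k]
        d = np.zeros_like(s)
        d[k + m:] = s[k + m:] - s[:-(k + m)]
        out = np.zeros_like(x)
        out[k - 1:] = d / k              # align to the last sample of the window
        return out

    # Step-like detector pulse on a flat baseline (long decay time assumed).
    x = np.zeros(2000)
    x[400:] = 1.0
    shaped = trapezoidal(x, k=100, m=50)
    print(f"flat-top amplitude: {shaped.max():.3f}")  # ~1.0, the pulse height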

  3. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments

  4. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method

  5. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses the TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the response of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response were investigated and compared, involving dose values in this interval. These techniques include thermal pre-treatment and different glow curve analysis methods. The results obtained showed the necessity of developing specific software that permits automatic background subtraction from the glow curve of each dosemeter. This software was developed and has been tested. Preliminary results showed that the software increases the response reproducibility. (author)

  6. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  7. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on the optimization technique using PNET (Probabilistic Network Evaluation Technique) method for the highly redundant structures having a large number of collapse modes. This approach makes the best use of the merit of the optimization technique in which the idea of PNET method is used. The analytical process involves the minimization of safety index of the representative mode, subjected to satisfaction of the mechanism condition and of the positive external work. The procedure entails the sequential performance of a series of the NLP (Nonlinear Programming) problems, where the correlation condition as the idea of PNET method pertaining to the representative mode is taken as an additional constraint to the next analysis. Upon succeeding iterations, the final analysis is achieved when a collapse probability at the subsequent mode is extremely less than the value at the 1st mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, the conventional Monte Carlo simulation is also revised by using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  8. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  9. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  10. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, that is, the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data show wide variation depending on characteristics such as plant condition and the individual plant. In the existing motion analysis, applying a single analysis technique to all plant conditions and plants raises an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
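
    A minimal sketch of the Random Forests idea on hypothetical CRDM features; the feature names, data and target are invented for illustration and are not MHI's pipeline.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    n = 1000

    # Hypothetical per-operation features: coil peak current, inflection slope,
    # coolant temperature, and a plant-condition code.
    X = np.column_stack([
        rng.normal(10, 1, n), rng.normal(-0.5, 0.1, n),
        rng.normal(300, 5, n), rng.integers(0, 3, n),
    ])
    # Target: armature closed time (ms) with condition-dependent behaviour.
    y = 40 + 2 * X[:, 1] + 0.05 * X[:, 2] + 3 * X[:, 3] + rng.normal(0, 0.5, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print(f"R^2 on held-out operations: {model.score(X_te, y_te):.3f}")
    print("feature importances:", np.round(model.feature_importances_, 3))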

  11. Technique of sample preparation for analysis of gasoline and lubricating oils by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Avila P, P.

    1990-03-01

    Since the X-ray fluorescence laboratory of the National Institute of Nuclear Research lacked a technique for the analysis of oils, this work set out to develop a sample preparation technique for the analysis of the metals Pb, Cr, Ni, V and Mo in gasolines and oils by X-ray fluorescence spectrometry. The results obtained will be of great utility for the aforementioned laboratory. (Author)

  12. In-cylinder pressure-based direct techniques and time frequency analysis for combustion diagnostics in IC engines

    International Nuclear Information System (INIS)

    D’Ambrosio, S.; Ferrari, A.; Galleani, L.

    2015-01-01

    Highlights: • Direct pressure-based techniques have been applied successfully to spark-ignition engines. • The burned mass fraction of pressure-based techniques has been compared with that of 2- and 3-zone combustion models. • Time frequency analysis has been employed to study complex diesel combustion events. - Abstract: In-cylinder pressure measurement and analysis has historically been a key tool for off-line combustion diagnosis in internal combustion engines, but online applications for real-time condition monitoring and combustion management have recently become popular. The present investigation presents and compares different low computing-cost in-cylinder pressure based methods for the analysis of the main features of combustion, that is, the start of combustion, the end of combustion and the crankshaft angle that corresponds to half of the overall burned mass. The instantaneous pressure in the combustion chamber has been used as an input datum for the described analytical procedures, and it has been measured by means of a standard piezoelectric transducer. Traditional pressure-based techniques have been shown to predict the burned mass fraction time history more accurately in spark ignition engines than in diesel engines. The most suitable pressure-based techniques for both spark ignition and compression ignition engines have been chosen on the basis of the available experimental data. Time-frequency analysis has also been applied to the analysis of diesel combustion, which is richer in events than spark-ignited combustion. Time frequency algorithms for the calculation of the mean instantaneous frequency are computationally efficient, allow the main events of the diesel combustion to be identified, and provide the greatest benefits in the presence of multiple injection events. These algorithms can be optimized and applied to onboard diagnostics tools designed for real-time control, but can also be used as an advanced validation tool for
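
    A minimal sketch of a mean-instantaneous-frequency calculation of the kind described, computed here as the power-weighted mean frequency of a spectrogram of a synthetic signal; the sample rate and frequencies are illustrative.

    import numpy as np
    from scipy.signal import spectrogram

    fs = 100_000                      # sample rate (Hz), illustrative
    t = np.arange(0, 0.05, 1 / fs)

    # Synthetic "combustion" signal: a low-frequency pressure oscillation plus a
    # burst of higher-frequency content standing in for a second injection event.
    x = np.sin(2 * np.pi * 500 * t)
    x += np.where((t > 0.02) & (t < 0.03), np.sin(2 * np.pi * 5000 * t), 0.0)

    f, tt, S = spectrogram(x, fs=fs, nperseg=256, noverlap=192)

    # MIF(t) = sum_f f * S(f, t) / sum_f S(f, t)
    mif = (f[:, None] * S).sum(axis=0) / S.sum(axis=0)
    print(f"MIF range: {mif.min():.0f} Hz to {mif.max():.0f} Hz")
    # Jumps in the MIF flag combustion events such as separate injections.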

  13. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pannek, Kerstin [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, School of Medicine, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); Guzzetta, Andrea [IRCCS Stella Maris, Department of Developmental Neuroscience, Calambrone Pisa (Italy); Colditz, Paul B. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Perinatal Research Centre, Brisbane (Australia); Rose, Stephen E. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); University of Queensland Centre for Clinical Research, Royal Brisbane and Women' s Hospital, Brisbane (Australia)

    2012-10-15

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. (orig.)

  14. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a great security threat, so studying malware behavior is important for identifying and classifying it. Using SSDT hooking, we can obtain malware behavior by running it in a controlled environment and capturing its interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be compared with those of other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.
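
    A minimal sketch of converting a captured event chain into an activity graph and comparing two samples, with invented events; the authors' SSDT-hooking capture stage is not reproduced.

    import networkx as nx

    def activity_graph(events):
        """Build a directed graph linking consecutive (category, action) events."""
        g = nx.DiGraph()
        for a, b in zip(events, events[1:]):
            g.add_edge(a, b)
        return g

    sample_a = [("process", "create"), ("file", "write"), ("registry", "set"),
                ("network", "connect")]
    sample_b = [("process", "create"), ("file", "write"), ("network", "connect")]

    ga, gb = activity_graph(sample_a), activity_graph(sample_b)

    # Jaccard similarity over behaviour edges as a crude family-comparison score.
    ea, eb = set(ga.edges), set(gb.edges)
    print(f"edge similarity: {len(ea & eb) / len(ea | eb):.2f}")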

  15. Protein identification and quantification from riverbank grape, Vitis riparia: Comparing SDS-PAGE and FASP-GPF techniques for shotgun proteomic analysis.

    Science.gov (United States)

    George, Iniga S; Fennell, Anne Y; Haynes, Paul A

    2015-09-01

    Protein sample preparation optimisation is critical for establishing reproducible high throughput proteomic analysis. In this study, two different fractionation sample preparation techniques (in-gel digestion and in-solution digestion) for shotgun proteomics were used to quantitatively compare proteins identified in Vitis riparia leaf samples. The total number of proteins and peptides identified were compared between filter aided sample preparation (FASP) coupled with gas phase fractionation (GPF) and SDS-PAGE methods. There was a 24% increase in the total number of reproducibly identified proteins when FASP-GPF was used. FASP-GPF is more reproducible, less expensive and a better method than SDS-PAGE for shotgun proteomics of grapevine samples as it significantly increases protein identification across biological replicates. Total peptide and protein information from the two fractionation techniques is available in PRIDE with the identifier PXD001399 (http://proteomecentral.proteomexchange.org/dataset/PXD001399). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The use of multi-criteria decision analysis weight elicitation techniques in patients with mild cognitive impairment: a pilot study.

    Science.gov (United States)

    van Til, Janine A; Dolan, James G; Stiggelbout, Anne M; Groothuis, Karin C G M; Ijzerman, Maarten J

    2008-04-01

    To test the applicability of multi-criteria decision analysis preference elicitation techniques in cognitively impaired individuals. A convenience sample of 16 cognitively impaired subjects and 12 healthy controls was asked to participate in a small pilot study. The subjects determined the relative importance of four decision criteria using five different weight elicitation techniques, namely the simple multi-attribute rating technique, the simple multi-attribute rating technique using swing weights, Kepner-Tregoe weighting, the analytic hierarchy process, and conjoint analysis. Conjoint analysis was judged to be the easiest method for weight elicitation in the control group (Z = 10.00; p = 0.04), while no significant differences in difficulty ratings between methods were found in cognitively impaired subjects. Conjoint analysis elicits weights and rankings significantly different from the other methods. Subjectively, cognitively impaired subjects were positive about the use of the weight elicitation techniques; however, it seems the use of swing weights can result in the employment of shortcut strategies. The results of this pilot study suggest that individuals with mild cognitive impairment are willing and able to use multi-criteria elicitation methods to determine criteria weights in a decision context, although no preference for a method was found. The same methodologic and practical issues can be identified in cognitively impaired individuals as in healthy controls, and the choice of method is mostly determined by the decision context.
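
    As an illustration of one of the five techniques compared, the sketch below computes analytic hierarchy process weights as the principal eigenvector of a pairwise-comparison matrix; the judgments are illustrative.

    import numpy as np

    # Saaty-scale pairwise comparisons for four decision criteria (illustrative).
    A = np.array([
        [1.0, 3.0, 5.0, 1.0],
        [1/3, 1.0, 3.0, 1/3],
        [1/5, 1/3, 1.0, 1/5],
        [1.0, 3.0, 5.0, 1.0],
    ])

    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                       # criterion weights

    # Consistency ratio: CI / RI, with RI = 0.9 for n = 4 (Saaty's table).
    n = A.shape[0]
    ci = (vals.real[i] - n) / (n - 1)
    print("weights:", np.round(w, 3), f"CR = {ci / 0.9:.3f}")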

  17. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  18. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conferences on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) are registered in the national strategy of opening up the universities and national research centres at the local, national and international levels. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand, and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of the issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  19. Perceived Effectiveness of Identified Methods and Techniques Teachers Adopt in Prose Literature Lessons in some Secondary Schools in Owerri

    Directory of Open Access Journals (Sweden)

    F. O. Ezeokoli

    2016-07-01

    Full Text Available The study determined the methods adopted by teachers in prose literature-in-English classrooms, the activities of teachers and students, and teachers’ perceived effectiveness of the techniques used. It also examined the objectives of teaching prose literature that teachers should address and the extent to which teachers believe in student-identified difficulties of studying prose literature. The study adopted the descriptive survey research design. A purposive sampling technique was used to select 85 schools in the Owerri metropolis, and in each school all literature teachers of senior secondary I and II were involved. In all, 246 literature teachers participated, out of which 15 were purposively selected for observation. The two instruments were: Teachers’ Questionnaire (r = 0.87) and Classroom Observation Schedule (r = 0.73). Data were analysed using frequency counts and percentages. Results revealed that teachers adopted the lecture (28.4%), reading (10.9%) and discussion (7.3%) methods. Teachers’ activities during the lesson included giving background information, summarizing, dictating notes, reading aloud, explaining and asking questions. The adopted techniques included questioning, oral reading, silent reading and discussion. Teachers perceived questioning as the most effective technique, followed by debating and summarizing. Teachers identified the development of students’ critical faculties and analytical skills, literary appreciation and language skills to be of utmost concern. It was concluded that the methods adopted by teachers are not diverse enough to cater for the needs and backgrounds of students. Keywords: Methods, Techniques, Perceived Effectiveness, Objectives, Literature-in-English

  20. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  1. A New Technique to Identify Arbitrarily Shaped Noise Sources

    Directory of Open Access Journals (Sweden)

    Roberto A. Tenenbaum

    2006-01-01

    Full Text Available Acoustic intensity is one of the available tools for evaluating sound radiation from vibrating bodies. Active intensity may, in some situations, not give a faithful insight into how much energy is in fact carried into the far field. A new parameter was therefore proposed, the supersonic acoustic intensity, which takes into account only the intensity generated by components having a smaller wavenumber than the acoustic one. However, the method is only effective for simple sources, such as plane plates, cylinders and spheres. This work presents a new technique, based on the boundary element method and the singular value decomposition, to compute the supersonic acoustic intensity for arbitrarily shaped sources. The technique is based on the Kirchhoff-Helmholtz equation in a discretized form, leading to a radiation operator that relates the normal velocity on the source's surface mesh to the pressure at grid points located in the field. The singular value decomposition is then applied to the radiation operator, and a cutoff criterion is used to remove non-propagating components. Some numerical examples are presented.
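
    A minimal sketch of the SVD-plus-cutoff step, using a random matrix as a stand-in for the BEM-derived radiation operator and an arbitrary cutoff fraction:

    import numpy as np

    rng = np.random.default_rng(4)
    n_field, n_surf = 64, 128

    H = rng.normal(size=(n_field, n_surf))   # radiation operator (BEM in the paper)
    v = rng.normal(size=n_surf)              # normal velocity on the surface mesh

    U, s, Vt = np.linalg.svd(H, full_matrices=False)

    # Keep only components whose singular value (a proxy for radiation
    # efficiency) exceeds a fraction of the largest one.
    keep = s > 0.3 * s[0]
    H_super = (U[:, keep] * s[keep]) @ Vt[keep]

    p_total = H @ v                          # full field pressure
    p_super = H_super @ v                    # propagating ("supersonic") part
    print(f"retained {keep.sum()} of {s.size} singular components")
    print(f"pressure norm: total {np.linalg.norm(p_total):.2f}, "
          f"supersonic {np.linalg.norm(p_super):.2f}")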

  2. Reactor vital equipment determination techniques

    International Nuclear Information System (INIS)

    Bott, T.F.; Thomas, W.S.

    1983-01-01

    The Reactor Vital Equipment Determination Techniques program at the Los Alamos National Laboratory is discussed. The purpose of the program is to provide the Nuclear Regulatory Commission (NRC) with technical support in identifying vital areas at nuclear power plants using a fault-tree technique. A reexamination of some system modeling assumptions is being performed for the Vital Area Analysis Program. A short description of the vital area analysis and supporting research on modeling assumptions is presented. Perceptions of program modifications based on the research are outlined, and the status of high-priority research topics is discussed

  3. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co{sup III} porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  4. A New ABCD Technique to Analyze Business Models & Concepts

    OpenAIRE

    Aithal P. S.; Shailasri V. T.; Suresh Kumar P. M.

    2015-01-01

    Various techniques are used to analyze individual characteristics or organizational effectiveness, like SWOT analysis, SWOC analysis, PEST analysis etc. These techniques provide an easy and systematic way of identifying various issues affecting a system and provide an opportunity for further development. Whereas these provide a broad-based assessment of individual institutions and systems, they suffer limitations when applied to a business context. The success of any business model depends on ...

  5. Extending existing structural identifiability analysis methods to mixed-effects models.

    Science.gov (United States)

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2018-01-01

    The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.
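
    A minimal sketch of the Taylor series (exhaustive summary) approach for a toy non-mixed-effects one-compartment model, x' = -k*x, y = x/V, x(0) = d with d a known dose; the extension to mixed-effects models via the statistical moments of functions of random variables, which is the paper's contribution, is not reproduced here.

    import sympy as sp

    k, V, d, t = sp.symbols("k V d t", positive=True)

    x = d * sp.exp(-k * t)           # closed-form state trajectory
    y = x / V                        # observation function

    # Exhaustive summary: Taylor coefficients of y at t = 0.
    summary = [sp.simplify(sp.diff(y, t, n).subs(t, 0)) for n in range(3)]
    print(summary)                   # [d/V, -d*k/V, d*k**2/V]

    # Uniqueness check: do (k2, V2) reproducing the summary force k2 = k, V2 = V?
    k2, V2 = sp.symbols("k2 V2", positive=True)
    eqs = [sp.Eq(c, c.subs({k: k2, V: V2})) for c in summary]
    print(sp.solve(eqs, [k2, V2]))   # unique solution -> structurally identifiable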

  6. Use of Atomic and Nuclear Techniques in Elemental and Isotopic Analysis

    International Nuclear Information System (INIS)

    2008-01-01

    This book is divided into four chapters, presented by six of the best Arab specialists, who have used atomic and nuclear techniques for a long time and recognized their importance and capabilities in scientific research. Atomic and nuclear techniques are very successful in the field of analysis because they are often the only way to carry out an analysis with the required accuracy, and they are the cheapest at the same time. A number of these techniques were collected in this book on the basis of their accuracy and how widely they are used in the analysis of material components, especially when the elements of interest are present in insignificant proportions, as in the case of toxicology, archaeology, nutrition, medicine and other applications.

  7. Topology based data analysis identifies a subgroup of breast cancers with a unique mutational profile and excellent survival.

    Science.gov (United States)

    Nicolau, Monica; Levine, Arnold J; Carlsson, Gunnar

    2011-04-26

    High-throughput biological data, whether generated as sequencing, transcriptional microarrays, proteomic, or other means, continues to require analytic methods that address its high dimensional aspects. Because the computational part of data analysis ultimately identifies shape characteristics in the organization of data sets, the mathematics of shape recognition in high dimensions continues to be a crucial part of data analysis. This article introduces a method that extracts information from high-throughput microarray data and, by using topology, provides greater depth of information than current analytic techniques. The method, termed Progression Analysis of Disease (PAD), first identifies robust aspects of cluster analysis, then goes deeper to find a multitude of biologically meaningful shape characteristics in these data. Additionally, because PAD incorporates a visualization tool, it provides a simple picture or graph that can be used to further explore these data. Although PAD can be applied to a wide range of high-throughput data types, it is used here as an example to analyze breast cancer transcriptional data. This identified a unique subgroup of Estrogen Receptor-positive (ER(+)) breast cancers that express high levels of c-MYB and low levels of innate inflammatory genes. These patients exhibit 100% survival and no metastasis. No supervised step beyond distinction between tumor and healthy patients was used to identify this subtype. The group has a clear and distinct, statistically significant molecular signature, it highlights coherent biology but is invisible to cluster methods, and does not fit into the accepted classification of Luminal A/B, Normal-like subtypes of ER(+) breast cancers. We denote the group as c-MYB(+) breast cancer.
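
    PAD builds on topological ideas related to the Mapper construction. The self-contained sketch below illustrates that general idea (a lens function, an overlapping cover of its range, clustering within each cover set, and edges between clusters that share points) on synthetic data; it is not the authors' PAD implementation.

    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(5)
    X = np.vstack([rng.normal(0, 1, (100, 10)), rng.normal(4, 1, (60, 10))])

    lens = X @ rng.normal(size=10)            # simple 1-D projection as the lens
    edges = np.linspace(lens.min(), lens.max(), 8)
    width = (edges[1] - edges[0]) * 1.5       # 50% overlap between intervals

    nodes, members = [], []
    for lo in edges[:-1]:
        idx = np.where((lens >= lo) & (lens <= lo + width))[0]
        if idx.size < 3:
            continue
        labels = DBSCAN(eps=5.0, min_samples=3).fit_predict(X[idx])
        for lab in set(labels) - {-1}:
            members.append(set(idx[labels == lab]))
            nodes.append(len(nodes))

    # Edges join clusters from neighbouring intervals that share data points.
    graph = [(i, j) for i in nodes for j in nodes
             if i < j and members[i] & members[j]]
    print(f"{len(nodes)} nodes, {len(graph)} edges in the Mapper graph")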

  8. Classification Technique for Ultrasonic Weld Inspection Signals using a Neural Network based on 2-Dimensional Fourier Transform and Principal Component Analysis

    International Nuclear Information System (INIS)

    Kim, Jae Joon

    2004-01-01

    Neural network-based signal classification systems are increasingly used in the analysis of large volumes of data obtained in NDE applications. Ultrasonic inspection methods, on the other hand, are commonly used in the nondestructive evaluation of welds to detect flaws. An important characteristic of ultrasonic inspection is the ability to identify the type of discontinuity that gives rise to a peculiar signal. Standard techniques rely on differences in individual A-scans to classify the signals. This paper proposes an ultrasonic signal classification technique based on the information lying in neighboring signals. The approach uses a 2-dimensional Fourier transform and principal component analysis to generate a reduced-dimensional feature vector for classification. Results of applying the technique to data obtained from the inspection of actual steel welds are presented
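
    A minimal sketch of the described feature pipeline: 2-D FFT magnitudes of neighbouring A-scans (a B-scan patch) reduced with PCA, followed by a simple nearest-neighbour classifier standing in for the paper's neural network. The patch data and labels are synthetic stand-ins for real weld inspections.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(6)

    def patches(n, defect):
        """Synthetic 16-A-scan x 64-sample patches; defects add a coherent echo."""
        p = rng.normal(0, 0.3, (n, 16, 64))
        if defect:
            p[:, :, 30:34] += 1.0
        return p

    X_raw = np.vstack([patches(80, False), patches(80, True)])
    y = np.array([0] * 80 + [1] * 80)

    # 2-D Fourier magnitude captures patterns spanning neighbouring signals.
    feats = np.abs(np.fft.fft2(X_raw)).reshape(len(X_raw), -1)

    feats_red = PCA(n_components=10).fit_transform(feats)
    clf = KNeighborsClassifier(n_neighbors=3).fit(feats_red[::2], y[::2])
    print(f"held-out accuracy: {clf.score(feats_red[1::2], y[1::2]):.2f}")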

  9. Condition monitoring and signature analysis techniques as applied to Madras Atomic Power Station (MAPS) [Paper No.: VIA - 1

    International Nuclear Information System (INIS)

    Rangarajan, V.; Suryanarayana, L.

    1981-01-01

    The technique of vibration signature analysis for identifying machine troubles in their early stages is explained. The advantage is that timely corrective action can be planned to avoid breakdowns and unplanned shutdowns. At the Madras Atomic Power Station (MAPS), this technique is applied to regularly monitor the vibrations of equipment and thus serves as a tool for the corrective maintenance of equipment. Case studies of the application of this technique to main boiler feed pumps, moderator pump motors, a centrifugal chiller, ventilation system fans, thermal shield ventilation fans, filtered water pumps, emergency process sea water pumps, and antifriction bearings of MAPS are presented. Condition monitoring during commissioning and subsequent operation could indicate defects. Corrective actions which were taken are described. (M.G.B.)
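
    A minimal sketch of spectrum-based signature monitoring: compare the power spectral density of a new vibration record with a baseline in a band around a known defect frequency. The signal content, sample rate and frequencies are invented for illustration.

    import numpy as np
    from scipy.signal import welch

    fs = 5000                            # sample rate (Hz), illustrative
    t = np.arange(0, 2, 1 / fs)
    rng = np.random.default_rng(7)

    def record(bearing_fault=0.0):
        """1x running-speed vibration plus an optional bearing-defect tone."""
        x = 1.0 * np.sin(2 * np.pi * 25 * t)             # 1500 rpm shaft
        x += bearing_fault * np.sin(2 * np.pi * 157 * t)  # defect frequency
        return x + rng.normal(0, 0.2, t.size)

    f, baseline = welch(record(), fs=fs, nperseg=4096)
    f, current = welch(record(bearing_fault=0.4), fs=fs, nperseg=4096)

    band = (f > 150) & (f < 165)         # window around the defect frequency
    ratio = current[band].max() / baseline[band].max()
    print(f"defect-band level rose by a factor of {ratio:.1f}")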

  10. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  11. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the most widespread techniques in the field of Takagi-Sugeno fuzzy systems, as well as the most relevant results on polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models, via Taylor series and sum-of-squares techniques (semidefinite programming), in order to obtain stability guarantees...

  12. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  13. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  14. Assessment of trabecular bone changes around endosseous implants using image analysis techniques: A preliminary study

    International Nuclear Information System (INIS)

    Zuki, Mervet El; Omami, Galal; Horner, Keith

    2014-01-01

    The objective of this study was to assess the trabecular bone changes that occurred around functional endosseous dental implants by means of radiographic image analysis techniques. Immediate preoperative and postoperative periapical radiographs of de-identified implant patients at the University Dental Hospital of Manchester were retrieved, screened for specific inclusion criteria, digitized, and quantified for structural elements of the trabecular bone around the endosseous implants, by using image analysis techniques. Data were analyzed using SPSS version 11.5. P values of less than 0.05 were considered statistically significant. A total of 12 implants from 11 patients were selected for the study, and 26 regions of interest were obtained. There was a significant increase in the bone area in terms of the mean distance between nodes (p=0.006) and a significant decrease in the marrow area in terms of the bone area (p=0.006) and the length of marrow spaces (p=0.032). It appeared that the bone around the implant underwent remodeling that resulted in a net increase in bone after implant placement.

  15. Assessment of trabecular bone changes around endosseous implants using image analysis techniques: A preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Zuki, Mervet El [Dept. of Oral Medicine and Radiology, Benghazi University College of Dentistry, Benghazi (Libya); Omami, Galal [Oral Diagnosis and Polyclinics, Faculty of Dentistry, The University of Hong Kong (Hong Kong); Horner, Keith [Dept. of Oral Radiology, University Dental Hospital of Manchester, Manchester (United Kingdom)

    2014-06-15

    The objective of this study was to assess the trabecular bone changes that occurred around functional endosseous dental implants by means of radiographic image analysis techniques. Immediate preoperative and postoperative periapical radiographs of de-identified implant patients at the University Dental Hospital of Manchester were retrieved, screened for specific inclusion criteria, digitized, and quantified for structural elements of the trabecular bone around the endosseous implants, by using image analysis techniques. Data were analyzed using SPSS version 11.5. P values of less than 0.05 were considered statistically significant. A total of 12 implants from 11 patients were selected for the study, and 26 regions of interest were obtained. There was a significant increase in the bone area in terms of the mean distance between nodes (p=0.006) and a significant decrease in the marrow area in terms of the bone area (p=0.006) and the length of marrow spaces (p=0.032). It appeared that the bone around the implant underwent remodeling that resulted in a net increase in bone after implant placement.

  16. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  17. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  18. Application of nuclear analysis techniques in ancient Chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and carry information on both provenance and age. Analyzing ancient ceramics with modern analytical methods is the scientific foundation of the study of Chinese porcelain. According to the properties of nuclear analysis techniques, their functions and applications in this field are discussed. (authors)

  19. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. The assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether or not the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test set. The observed activity of the carboquinone derivatives proved to be normally distributed in every cluster. The presence of training and test compounds in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, sustain a proper assignment of compounds to the training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
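
    As an illustration of the clustering check described in this record, a minimal Python sketch follows; the descriptor matrix, the train/test labels, and the cluster count are hypothetical stand-ins for the study's carboquinone data, not a reproduction of it.

      # Check whether a train/test split is "proper": every K-means cluster
      # should contain compounds from both sets.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      X = rng.normal(size=(30, 5))                 # placeholder: activity + 4 descriptors
      split = np.where(rng.random(30) < 0.7, "train", "test")

      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
      for k in range(3):
          present = sorted(set(split[labels == k]))
          print(f"cluster {k}: {present}")         # a proper split shows both sets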

  20. Nominal group technique: a brainstorming tool for identifying areas to improve pain management in hospitalized patients.

    Science.gov (United States)

    Peña, Adolfo; Estrada, Carlos A; Soniat, Debbie; Taylor, Benjamin; Burton, Michael

    2012-01-01

    Pain management in hospitalized patients remains a priority area for improvement; effective strategies for consensus development are needed to prioritize interventions. To identify challenges, barriers, and perspectives of healthcare providers in managing pain among hospitalized patients. Qualitative and quantitative group consensus using a brainstorming technique for quality improvement - the nominal group technique (NGT). One medical, one medical-surgical, and one surgical hospital unit at a large academic medical center. Nurses, resident physicians, patient care technicians, and unit clerks. Responses and rankings for the NGT question: "What causes uncontrolled pain in your unit?" Twenty-seven health workers generated a total of 94 ideas. The ideas perceived as contributing to suboptimal pain control were grouped as system factors (timeliness, n = 18 ideas; communication, n = 11; pain assessment, n = 8), human factors (knowledge and experience, n = 16; provider bias, n = 8; patient factors, n = 19), and the interface of system and human factors (standardization, n = 14). Knowledge, timeliness, provider bias, and patient factors were the top-ranked themes. Knowledge and timeliness are considered the main priorities in improving pain control. NGT is an efficient tool for identifying general and context-specific priority areas for quality improvement; teams of healthcare providers should consider using NGT to address their own challenges and barriers. Copyright © 2011 Society of Hospital Medicine.

  1. Application of energy dispersive x-ray techniques for water analysis

    International Nuclear Information System (INIS)

    Funtua, I. I.

    2000-07-01

    Energy dispersive x-ray fluorescence (EDXRF) is a class of emission spectroscopic techniques that depends upon the emission of characteristic x-rays following excitation of the atomic electron energy levels by tube or isotopic-source x-rays. The technique has found a wide range of applications, including the determination of the chemical elements of water and water pollutants. Three EDXRF systems, the isotopic source, the secondary target and the total reflection (TXRF) systems, are available at the Centre for Energy Research and Training. These systems have been applied to the analysis of sediments, suspensions, ground water, river water and rainwater. The isotopic-source system is based on 55Fe, 109Cd and 241Am excitations, while the secondary-target and total-reflection systems utilize a Mo x-ray tube. Sample preparation requirements for water analysis range from physical and chemical pre-concentration steps to direct analysis, and elements from Al to U can be determined with these systems. The EDXRF techniques, TXRF in particular with its multielement capability, low detection limits and possibility of direct analysis of water, have a competitive edge over the traditional methods of atomic absorption and flame photometry.

  2. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  3. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design-stages, but the requirements on performance analysis differ from stage to stage. In an early design-stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design-stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  4. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    Science.gov (United States)

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full-spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full-spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved to depend on cultivation time. Both whole-spectrum-based normalization techniques together with the full-spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available
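
    The normalization-then-PCA workflow described in this record can be sketched briefly in Python; total-ion-current scaling is used as one plausible whole-spectrum normalization, since the abstract does not spell out the two techniques, and the intensity matrix below is a synthetic placeholder.

      # Whole-spectrum normalization followed by PCA on full-spectrum MS data.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      spectra = rng.random((12, 2000))             # placeholder: 12 spectra x 2000 bins

      tic = spectra.sum(axis=1, keepdims=True)     # total ion current per spectrum
      normalized = spectra / tic                   # whole-spectrum normalization

      scores = PCA(n_components=2).fit_transform(normalized)
      print(scores)                                # score plot separates metabolic patterns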

  5. Identifying functions for ex-core neutron noise analysis

    International Nuclear Information System (INIS)

    Avila, J.M.; Oliveira, J.C.

    1987-01-01

    A method of performing the phase analysis of signals arising from neutron detectors placed in the periphery of a pressurized water reactor is proposed. It consists of the definition of several identifying functions, based on the phases of cross power spectral densities corresponding to four ex-core neutron detectors. Each of these functions enhances the appearance of different sources of noise. The method, applied to the ex-core neutron fluctuation analysis of a French PWR, proved to be very useful as it allows quick recognition of various patterns in the power spectral densities. (orig.)
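
    A minimal sketch of the underlying phase computation follows, assuming two synthetic detector signals; scipy's csd estimates the cross power spectral density whose phase the identifying functions are built from.

      # Phase of the cross power spectral density between two detector signals.
      import numpy as np
      from scipy.signal import csd

      fs = 100.0                                   # sampling frequency, Hz
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(2)
      x = np.sin(2 * np.pi * 1.0 * t) + 0.5 * rng.normal(size=t.size)
      y = np.sin(2 * np.pi * 1.0 * t + np.pi / 4) + 0.5 * rng.normal(size=t.size)

      f, Pxy = csd(x, y, fs=fs, nperseg=1024)
      phase = np.angle(Pxy)                        # radians; feeds the identifying functions
      print(phase[np.argmin(np.abs(f - 1.0))])     # ~ pi/4 at the shared 1 Hz component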

  6. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  7. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The possibilities of the MENT have been demonstrated on the example of the doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT and only 0.1% for the Fourier algorithm.
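
    The principle behind MENT can be illustrated with a toy maximum entropy problem, solved here by finding the Lagrange multiplier as the root of a single nonlinear equation; the support and mean constraint are invented for illustration and are far simpler than the doublet-resolution problem of the record.

      # Maximum entropy distribution on {1..6} with a prescribed mean:
      # p_i ~ exp(-lambda * x_i), with lambda fixed by the mean constraint.
      import numpy as np
      from scipy.optimize import brentq

      x = np.arange(1, 7)
      target_mean = 4.5                            # placeholder constraint

      def mean_for(lam):
          w = np.exp(-lam * x)
          return (x * w).sum() / w.sum()

      lam = brentq(lambda l: mean_for(l) - target_mean, -5.0, 5.0)
      p = np.exp(-lam * x)
      p /= p.sum()
      print(p, (x * p).sum())                      # maximum-entropy p, mean ~= 4.5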

  8. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  9. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  10. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and the clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in the analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
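
    A condensed Python sketch of this unsupervised workflow follows; K-means plus a silhouette score stands in for the fuzzy c-means and clustering-index steps of the record, and the profile matrix is synthetic.

      # Dimension reduction and cluster-count selection for current profiles.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans
      from sklearn.metrics import silhouette_score

      rng = np.random.default_rng(3)
      profiles = rng.normal(size=(200, 40))        # placeholder: 200 profiles x 40 depth bins

      reduced = PCA(n_components=3).fit_transform(profiles)
      for k in range(2, 6):                        # a validity index picks the best k
          labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(reduced)
          print(k, silhouette_score(reduced, labels))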

  11. Managing Electrochemical Noise Data by Exception Application of an On Line EN Data Analysis Technique to Data From a High Level Nuclear Waste Tank

    International Nuclear Information System (INIS)

    EDGEMON, G.L.

    2003-01-01

    Electrochemical noise has been used at the Hanford Site for a number of years to monitor, in real time, for pitting corrosion and stress corrosion cracking (SCC) mechanisms in high-level nuclear waste tanks. Currently the monitoring technique has been implemented on only three of the 177 underground storage tanks on the site. Widespread implementation of the technique has been held back for a number of reasons, including issues around managing the large volume of data associated with electrochemical noise and the complexity of data analysis. Expert review of raw current and potential measurements is the primary form of data analysis currently used at the Hanford Site. This paper demonstrates the application of an on-line data filtering and analysis technique that could allow data from field applications of electrochemical noise to be managed by exception, transforming electrochemical noise data into a process parameter and focusing data analysis efforts on the important data. Results of the analysis demonstrate a data compression rate of 95%; that is, only 5% of the data would require expert analysis if such a technique were implemented. It is also demonstrated that this technique is capable of identifying key periods where localized corrosion activity is apparent.
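
    The data-by-exception idea can be sketched as a simple variability filter in Python; the threshold rule, window length, and synthetic current record are assumptions for illustration, not the filtering algorithm actually deployed at Hanford.

      # Flag only the windows whose noise spread exceeds a quiet-period baseline.
      import numpy as np

      rng = np.random.default_rng(4)
      current = rng.normal(0.0, 1e-9, size=86400)     # one day of 1 Hz current noise
      current[40000:40600] += 2e-8 * rng.random(600)  # injected corrosion-like burst

      window = 600                                 # 10-minute windows
      n = current.size // window * window
      spread = current[:n].reshape(-1, window).std(axis=1)

      threshold = 3 * np.median(spread)            # baseline set by the quiet windows
      flagged = np.flatnonzero(spread > threshold)
      print(f"{flagged.size} of {spread.size} windows need expert review")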

  12. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Ni Bangfa; Tian Weizhi; Nie Nuiling; Wang Pingsheng

    1999-01-01

    Airborne particulate samples were collected at two sites, one industrial and one residential, in Tianjing city during February and June using a PM-10 sampler and analyzed by NAA techniques. Air pollution in urban and rural areas of Tianjing city was compared using neutron activation analysis and other data analysis techniques. (author)

  13. Identifying desertification risk areas using fuzzy membership and geospatial technique - A case study, Kota District, Rajasthan

    Science.gov (United States)

    Dasgupta, Arunima; Sastry, K. L. N.; Dhinwa, P. S.; Rathore, V. S.; Nathawat, M. S.

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research intends to identify areas under risk of desertification, along with the severity of the risk in terms of the degradation of natural parameters. An integrated model combining fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, including five specific natural parameters, namely slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a certain class was derived using the normal probability density function of that class. Thus, if a single class of a single parameter has mean μ and standard deviation σ, the values falling beyond μ + 2σ and μ - 2σ do not represent that class, but a transitional zone between two subsequent classes. These are the most important areas in terms of degradation, as they have the lowest probability of belonging to a certain class and hence the highest probability of being extended into the next class or narrowed into the previous one. Consequently, these are the values that can be most easily altered under exogenic influences, and hence they are identified as risk areas. The overall desertification risk is derived by incorporating the different risk severities of each parameter using the fuzzy rule-based inference system in a GIS environment. Multicriteria-based geo-statistics are applied to locate the areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures are accelerating land deterioration, coupled with natural erosive forces. The four major sources of desertification in Kota are gully and ravine erosion, inappropriate mining practices, growing urbanization and random deforestation.
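
    The membership step can be written out compactly; the class mean, standard deviation, and sample values below are hypothetical, and the normalized normal PDF stands for the membership function described in the record.

      # Membership of observed values in a class with given mean and sigma;
      # values beyond mu +/- 2*sigma fall in the transitional (risk) zone.
      import numpy as np
      from scipy.stats import norm

      mu, sigma = 0.45, 0.08                       # placeholder class statistics (e.g., NDVI)
      values = np.array([0.44, 0.50, 0.62, 0.28])

      membership = norm.pdf(values, mu, sigma) / norm.pdf(mu, mu, sigma)
      transitional = np.abs(values - mu) > 2 * sigma
      print(membership, transitional)              # low membership marks risk areas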

  14. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
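
    The FDTD update itself is compact; the sketch below propagates a pulse on a single lossless line using the leapfrog form of the telegrapher's equations, with placeholder per-unit-length parameters, omitting the loss and coupling terms of the paper's bundled-SWCNT model.

      # 1D FDTD (leapfrog) update for a lossless transmission line.
      import numpy as np

      nz, dx = 200, 1e-3                           # 200 cells of 1 mm
      L_pul, C_pul = 2.5e-7, 1.0e-10               # H/m and F/m (placeholders)
      dt = dx * np.sqrt(L_pul * C_pul)             # Courant stability limit

      V = np.zeros(nz + 1)                         # node voltages
      I = np.zeros(nz)                             # segment currents

      for n in range(1000):
          V[0] = 1.0 if n * dt < 50e-12 else 0.0   # hypothetical input pulse
          V[1:-1] -= (dt / (C_pul * dx)) * (I[1:] - I[:-1])
          I -= (dt / (L_pul * dx)) * (V[1:] - V[:-1])

      print(V.max())                               # probe the propagated waveform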

  15. Structural identifiability of systems biology models: a critical comparison of methods.

    Directory of Open Access Journals (Sweden)

    Oana-Teodora Chis

    Full Text Available Analysing the properties of a biological system through in silico experimentation requires a satisfactory mathematical representation of the system including accurate values of the model parameters. Fortunately, modern experimental techniques allow obtaining time-series data of appropriate quality which may then be used to estimate unknown parameters. However, in many cases, a subset of those parameters may not be uniquely estimated, independently of the experimental data available or the numerical techniques used for estimation. This lack of identifiability is related to the structure of the model, i.e. the system dynamics plus the observation function. Despite the interest in knowing a priori whether there is any chance of uniquely estimating all model unknown parameters, the structural identifiability analysis for general non-linear dynamic models is still an open question. There is no method amenable to every model, thus at some point we have to face the selection of one of the possibilities. This work presents a critical comparison of the currently available techniques. To this end, we perform the structural identifiability analysis of a collection of biological models. The results reveal that the generating series approach, in combination with identifiability tableaus, offers the most advantageous compromise among range of applicability, computational complexity and information provided.

  16. Colour and shape analysis techniques for weed detection in cereal fields

    DEFF Research Database (Denmark)

    Pérez, A.J; López, F; Benlloch, J.V.

    2000-01-01

    Information on weed distribution within the field is necessary to implement spatially variable herbicide application. This paper deals with the development of near-ground image capture and processing techniques in order to detect broad-leaved weeds in cereal crops under actual field conditions. The proposed methods use colour information to discriminate between vegetation and background, whilst shape analysis techniques are applied to distinguish between crop and weeds. The determination of crop row position helps to reduce the number of objects to which shape analysis techniques are applied. The performance of the algorithms was assessed by comparing the results with a human classification, providing an acceptable success rate. The study has shown that despite the difficulties in accurately determining the number of seedlings (as in visual surveys), it is feasible to use image processing techniques for this purpose.
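
    As one plausible rendering of the colour step, the excess-green index is a common choice for separating vegetation from soil; the image below is synthetic and the threshold is illustrative, since the record does not give the exact discriminant used.

      # Excess-green (ExG = 2g - r - b) vegetation/background segmentation.
      import numpy as np

      rng = np.random.default_rng(5)
      image = rng.random((480, 640, 3))            # placeholder RGB field image in [0, 1]

      chrom = image / (image.sum(axis=2, keepdims=True) + 1e-9)  # chromaticity coords
      exg = 2 * chrom[..., 1] - chrom[..., 0] - chrom[..., 2]
      vegetation = exg > 0.05                      # simple fixed threshold

      # Shape analysis of the connected vegetation components (area, elongation,
      # distance from detected crop rows) would then separate crop from weeds.
      print(vegetation.mean())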

  17. Identifying and prioritizing the tools/techniques of knowledge management based on the Asian Productivity Organization Model (APO) to use in hospitals.

    Science.gov (United States)

    Khajouei, Hamid; Khajouei, Reza

    2017-12-01

    Appropriate knowledge, correct information, and relevant data are vital in medical diagnosis and treatment systems. Knowledge Management (KM), through its tools/techniques, provides a pertinent framework for decision-making in healthcare systems. The objective of this study was to identify and prioritize the KM tools/techniques that apply to the hospital setting. This is a descriptive-survey study. Data were collected using a researcher-made questionnaire that was developed based on experts' opinions to select the appropriate tools/techniques from the 26 tools/techniques of the Asian Productivity Organization (APO) model. Questions were categorized into the five steps of KM (identifying, creating, storing, sharing, and applying knowledge) according to this model. The study population consisted of middle and senior managers of hospitals and managing directors of the Vice-Chancellor for Curative Affairs at Kerman University of Medical Sciences in Kerman, Iran. The data were analyzed in SPSS v.19 using the one-sample t-test. Twelve of the 26 tools/techniques of the APO model were identified as applicable in hospitals. "Knowledge café" and "APO knowledge management assessment tool", with respective means of 4.23 and 3.7, were the most and the least applicable tools in the knowledge identification step. "Mentor-mentee scheme" and "voice and Voice over Internet Protocol (VOIP)", with respective means of 4.20 and 3.52, were the most and the least applicable tools/techniques in the knowledge creation step. "Knowledge café" and "voice and VOIP", with respective means of 3.85 and 3.42, were the most and the least applicable tools/techniques in the knowledge storage step. "Peer assist" and "voice and VOIP", with respective means of 4.14 and 3.38, were the most and the least applicable tools/techniques in the knowledge sharing step. Finally, "knowledge worker competency plan" and "knowledge portal", with respective means of 4.38 and 3.85, were the most and the least applicable tools/techniques

  18. Techniques of lumbar-sacral spine fusion in spondylosis: systematic literature review and meta-analysis of randomized clinical trials.

    Science.gov (United States)

    Umeta, Ricardo S G; Avanzi, Osmar

    2011-07-01

    Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which technique of fusion is best suited to treat each distinct spinal disease or group of diseases. To study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis. Systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcomes, were selected. To identify which technique, if any, presents the best clinical, functional, and radiographic outcome. Systematic literature review and meta-analysis based on scientific articles published and indexed in the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected, with a total of 1,136 patients. The meta-analysis showed that patients who underwent interbody fusion presented a significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients submitted to fusion using the posterolateral approach had a significantly shorter operative time (p=.007) and fewer perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and the posterolateral approach. Both techniques were comparable in final outcome, but the former presented better rates of fusion and the latter fewer complications. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure events derivation and analysis for generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical model is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failure of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can then be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, chapter 15 accident analysis of generic SAR, and the reported NPP I and C software failure events. The case study of this research includes: (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derivation from the actual happening of non-ABWR digital I and C software failure events, which were reported to LER of USNRC or IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  20. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique

    International Nuclear Information System (INIS)

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program
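
    The core event-tree arithmetic that CETAT automates is simple to state: each path probability is the product of its branch probabilities. The Python sketch below enumerates the paths of a toy three-task tree with hypothetical operator success probabilities.

      # Enumerate all paths of a small event tree and their probabilities.
      from itertools import product

      p_success = {"detect": 0.95, "grasp": 0.90, "retract": 0.98}  # placeholders

      for outcome in product([True, False], repeat=len(p_success)):
          p = 1.0
          for (task, p_ok), ok in zip(p_success.items(), outcome):
              p *= p_ok if ok else (1.0 - p_ok)
          print(outcome, round(p, 6))              # path probabilities sum to 1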

  1. Elemental analysis of some Egyptian medicinal plants using INAA and FAAS techniques

    International Nuclear Information System (INIS)

    Walley El-Dine, N.; Sroor, A.; Hammed, S.S.; El-Shershaby, A.; Alsamed, M.A

    2009-01-01

    Thirteen Egyptian medicinal plants used for the treatment and cure of various diseases have been elementally analyzed by instrumental neutron activation analysis (INAA) and flame atomic absorption spectrometry (FAAS). The pneumatic rabbit transfer system (PRTS) of the 100 kW Budapest Research Reactor (BRR) was used for short-time irradiation, 300 s, with a thermal neutron flux of 2.4 × 10¹² n/(cm²·s). Long-time irradiation, 4 hours, was performed at the second Egyptian research reactor (Et-Rr-2) with a thermal neutron flux of 5.6 × 10¹³ n/(cm²·s). Gamma-ray spectra were measured by an HPGe detection system. The concentrations of fifteen elements, namely Sc, Cr, Fe, Co, Zn, Rb, Mo, Sb, La, Ce, Nd, Sm, Yb, Hf and Pa, have been determined by long irradiation, and some of them were also determined by the FAAS technique. Fourteen elements, Na, Mg, Al, Cd, Cl, K, Ca, Ti, V, Mn, Ni, Sr, Pb and Cu, have been identified by short irradiation and the FAAS technique. The precision and accuracy of the method were evaluated using the standard reference material NIST SRM-1571. Comparison of the data obtained shows agreement between the concentrations of elements determined by the two techniques. The importance of these elements in relation to human health and nutrition is discussed. (author)

  2. Nuclear techniques for on-line analysis in the mineral and energy industries

    International Nuclear Information System (INIS)

    Sowerby, B.D.; Watt, J.S.

    1994-01-01

    Nuclear techniques are the basis of many on-line analysis systems which are now widely used in the mineral and energy industries. Some of the systems developed by the CSIRO depend entirely on nuclear techniques; others use a combination of nuclear techniques and microwave, capacitance, or ultrasonic techniques. The continuous analysis and rapid response of these CSIRO systems have led to improved control of mining, processing and blending operations, with increased productivity valued at A$50 million per year to Australia, and $90 million per year worldwide. This paper reviews developments in nuclear on-line analysis systems by the On-Line Analysis Group in CSIRO at Lucas Heights. Commercialised systems based on this work analyse mineral and coal slurries and determine the ash and moisture contents of coal and coke on conveyors. This paper also reviews two on-line nuclear analysis systems recently developed and licensed to industry, firstly for the determination of the mass flow rates of oil/water/gas mixtures in pipelines, and secondly for the determination of the moisture, specific energy, ash and fouling index of low-rank coals. 8 refs., 3 tabs., 4 figs

  3. Quality-assurance techniques used with automated analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Killian, E.W.; Koeppen, L.D.; Femec, D.A.

    1994-01-01

    In the course of developing gamma-ray spectrum analysis algorithms for use by the Radiation Measurements Laboratory at the Idaho National Engineering Laboratory (INEL), several techniques have been developed that enhance and verify the quality of the analytical results. The use of these quality-assurance techniques is critical when gamma-ray analysis results from low-level environmental samples are used in risk assessment or site restoration and cleanup decisions. This paper describes four of the quality-assurance techniques that are in routine use at the laboratory. They are used for all types of samples, from reactor effluents to environmental samples. The techniques include: (1) the use of precision pulsers (with subsequent removal) to validate the correct operation of the spectrometer electronics for each and every spectrum acquired, (2) the use of naturally occurring and cosmically induced radionuclides in samples to help verify that the data acquisition and analysis were performed properly, (3) the use of an ambient background correction technique that involves superimposing ("mapping") sample photopeak fitting parameters onto multiple background spectra for accurate and more consistent quantification of the background activities, (4) the use of interactive, computer-driven graphics to review the automated locating and fitting of photopeaks and to allow for manual fitting of photopeaks

  4. Identifying inaccuracy of MS Project using system analysis

    Science.gov (United States)

    Fachrurrazi; Husin, Saiful; Malahayati, Nurul; Irzaidi

    2018-05-01

    The problem encountered in the project owner's financial accounting reports is the difference between the total project costs computed by MS Project and those computed according to the Indonesian Standard (SNI, Standard Indonesia Standard / Cost Estimating Standard Book of Indonesia). It is one of the problems of MS Project concerning its cost accuracy, and as a result cost data cannot be used in an integrated way for all project components. This study focuses on finding the causes of the inaccuracy of MS Project. The operational aims of this study are: (i) identifying the cost analysis procedures of both the current method (SNI) and MS Project; (ii) identifying the cost bias in each element of the cost analysis procedure; and (iii) analysing the cost differences (cost bias) in each element to identify the cause of the inaccuracy of MS Project with respect to SNI. The method in this study is a comparative system analysis of MS Project and SNI. The results are: (i) the Work of Resources element of MS Project is limited to two decimal digits, which leads to its inaccuracy; the Work of Resources (referred to as effort) in MS Project represents the multiplication of the Quantities of Activities by the Requirements of Resources in SNI; (ii) MS Project and SNI differ in costing method: SNI uses Quantity-Based Costing (QBC), whereas MS Project uses Time-Based Costing (TBC). Based on this research, we recommend that contractors who use SNI make an adjustment to the Work of Resources in MS Project (with a correction index) so that it can be used in an integrated way with the project owner's financial accounting system. Further research will be conducted to improve MS Project as an integrated tool for all project participants.

  5. Factors influencing patient compliance with therapeutic regimens in chronic heart failure: A critical incident technique analysis.

    Science.gov (United States)

    Strömberg, A; Broström, A; Dahlström, U; Fridlund, B

    1999-01-01

    The aim of this study was to identify factors influencing compliance with prescribed treatment in patients with chronic heart failure. A qualitative design with a critical incident technique was used. Incidents were collected through interviews with 25 patients with heart failure strategically selected from a primary health care clinic, a medical ward, and a specialist clinic. Two hundred sixty critical incidents were identified in the interviews and 2 main areas emerged in the analysis: inward factors and outward factors. The inward factors described how compliance was influenced by the personality of the patient, the disease, and the treatment. The outward factors described how compliance was influenced by social activities, social relationships, and health care professionals. By identifying the inward and outward factors influencing patients with chronic heart failure, health care professionals can assess whether intervention is needed to increase compliance.

  6. Biokinematic structure of techniques wrestlers during pre-basic training

    OpenAIRE

    S.V. Sinіgovets

    2013-01-01

    The theoretical aspects of freestyle wrestling are considered, and the structural elements of technique during pre-basic training are investigated experimentally. The study involved 28 young fighters. Video computer analysis of techniques was carried out. Biomechanical characteristics were identified, and the kinematic structure of the temporal and spatial-temporal characteristics of the basic techniques was defined. The variability of the individual phases of the basic techniques is shown. The structural dynamics of the resulting velocities of the individ...

  7. Developing and Evaluating the HRM Technique for Identifying Cytochrome P450 2D6 Polymorphisms.

    Science.gov (United States)

    Lu, Hsiu-Chin; Chang, Ya-Sian; Chang, Chun-Chi; Lin, Ching-Hsiung; Chang, Jan-Gowth

    2015-05-01

    Cytochrome P450 2D6 is one of the important enzymes involved in the metabolism of many widely used drugs. Genetic polymorphisms of CYP2D6 can affect its activity. Therefore, an efficient method for identifying CYP2D6 polymorphisms is clinically important. We developed a high-resolution melting (HRM) analysis to investigate CYP2D6 polymorphisms. Genomic DNA was extracted from peripheral blood samples from 71 healthy individuals. All nine exons of the CYP2D6 gene were sequenced before screening by HRM analysis. This method can detect the most common genotypes (*1, *2, *4, *10, *14, *21, *39, and *41) of CYP2D6 in Chinese individuals. All samples were successfully genotyped. The four most common mutant CYP2D6 alleles (*1, *2, *10, and *41) can be genotyped. The single nucleotide polymorphism (SNP) frequencies of 100C > T (rs1065852), 1039C > T (rs1081003), 1661G > C (rs1058164), 2663G > A (rs28371722), 2850C > T (rs16947), 2988G > A (rs28371725), 3181A > G, and 4180G > C (rs1135840) were 58%, 61%, 73%, 1%, 13%, 3%, 1%, and 73%, respectively. We identified 100% of all heterozygotes without any errors. The two homozygous genotypes (1661G > C and 4180G > C) can be distinguished by mixing with a sample of known genotype to generate an artificial heterozygote for HRM analysis. Therefore, all samples could be identified using our HRM method, and the results of HRM analysis are identical to those obtained by sequencing. Our method achieved 100% sensitivity, specificity, positive predictive value, and negative predictive value. HRM analysis is a non-gel resolution method that is faster and less expensive than direct sequencing. Our study shows that it is an efficient tool for typing CYP2D6 polymorphisms. © 2014 Wiley Periodicals, Inc.

  8. Combinations of techniques that effectively change health behavior : evidence from meta-Cart analysis

    NARCIS (Netherlands)

    Dusseldorp, E.; Buuren, S. van; Genugten, L. van; Verheijden, M.W.; Empelen, P. van

    2014-01-01

    Objective: Many health-promoting interventions combine multiple behavior change techniques (BCTs) to maximize effectiveness. Although, in theory, BCTs can amplify each other, the available meta-analyses have not been able to identify specific combinations of techniques that provide synergistic effects.

  9. Analysis of rocks involving the x-ray diffraction, infrared and thermal gravimetric techniques

    International Nuclear Information System (INIS)

    Ikram, M.; Rauf, M.A.; Munir, N.

    1998-01-01

    Chemical analyses of rocks and minerals are usually obtained by a number of analytical techniques. The purpose of the present work is to investigate the chemical composition of the rock samples and also to determine how closely the results obtained by different instrumental methods are related. Chemical tests were performed before using the instrumental techniques in order to determine the nature of these rocks. The chemical analysis indicated mainly the presence of carbonates and hence the carbonate nature of these rocks. The x-ray diffraction, infrared spectroscopy and thermal gravimetric analysis techniques were used for the determination of the chemical composition of these samples. The results obtained using these techniques have shown a great deal of similarity. (author)

  10. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainties affecting performance assessments, as well as their propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, whose values may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses, and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean in comparisons of the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
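
    A bare-bones Monte Carlo propagation of this kind is easy to sketch; the dose model and the log-uniform input ranges below are invented for illustration and merely mimic parameters spanning several orders of magnitude.

      # Sample uncertain inputs, propagate through a toy dose model, and
      # summarize with the output statistics discussed above.
      import numpy as np

      rng = np.random.default_rng(6)
      n = 10_000
      k = 10 ** rng.uniform(-3, 0, n)              # placeholder sorption coefficient
      q = 10 ** rng.uniform(-2, 1, n)              # placeholder flow rate

      dose = 1e-4 * q / k                          # hypothetical dose model

      print("mean            :", dose.mean())
      print("median          :", np.median(dose))
      print("90th percentile :", np.percentile(dose, 90))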

  11. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks, and eliminating weaknesses.

  12. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks, and eliminating weaknesses.

  13. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies that meet reactor operating requirements, e.g., thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor, at both rated and uprated power conditions.

  14. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting recent advances in the field of Big Data analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data analysis and to recent techniques and environments for Big Data analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting parallel, grid, and cloud computing environments.

  15. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  16. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, the shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. The motion is continuous, so that in its execution the transition from phase to phase is not noticed. In the previously described phases of the O'Brian (back-facing) shot put technique, a large distance, emptiness and disconnection appear between the initial position phase and the phase of overtaking the device, which, in the training methods and technique training in primary and secondary education, as well as for students and athletes beginning in shot put, represents a major problem regarding connecting, training and technique advancement. Therefore, this work is aimed at facilitating the methods of training the shot put technique by extending the analysis from four to six phases, which have been described and cover the complete O'Brian technique.

  17. Detection of delta-ferrite to sigma transformation using metallographic techniques involving ferromagnetic colloid, color etching, and microprobe analysis

    International Nuclear Information System (INIS)

    Gray, R.J.; Sikka, V.K.; King, R.T.

    1976-01-01

    The mechanical properties of ferrite-containing austenitic stainless steel base metal and weldments are usually adversely affected by prolonged exposure to temperatures in the 482 to 900 0 C (900 to 1652 0 F) range. One cause of the property alteration is related to the transformation of relatively ductile delta-ferrite to less ductile sigma-phase. Attempts to identify sigma and delta-ferrite phases by color staining techniques alone are well documented; however, the results are often questionable due to the difficulty in maintaining consistent color identifications. This investigation is concerned with the microstructural responses of the ferromagnetic delta-ferrite phase and the paramagnetic sigma-phase to a ferromagnetic iron colloid in a magnetic field. Such positive or negative responses of the two phases to the colloid offer a more definitive identification. With this technique, the identification of small amounts of these phases in the microstructure is limited only by the highest magnification and resolution of the optical microscope. The procedure is substantiated in this metallographic study with microprobe analysis and color metallography. Several examples of the correlative use of these three techniques in identifying varying amounts of delta-ferrite → sigma transformation are presented

  18. Duct-to-mucosa versus dunking techniques of pancreaticojejunostomy after pancreaticoduodenectomy: Do we need more trials? A systematic review and meta-analysis with trial sequential analysis.

    Science.gov (United States)

    Kilambi, Ragini; Singh, Anand Narayan

    2018-03-25

    Pancreaticojejunostomy (PJ) is the most widely used reconstruction technique after pancreaticoduodenectomy. Despite several randomized trials, the ideal technique of pancreaticojejunostomy remains debatable. We planned a meta-analysis of randomized trials comparing the two most common techniques of PJ (duct-to-mucosa and dunking) to identify the best available evidence in the current literature. We searched the Pubmed/Medline, Web of Science, Science Citation Index, Google Scholar and Cochrane Central Register of Controlled Trials electronic databases up to October 2017 for all English-language randomized trials comparing the two approaches. Statistical analysis was performed using Review Manager (RevMan), Version 5.3 (Copenhagen: The Nordic Cochrane Center, The Cochrane Collaboration, 2014), and results were expressed as odds ratios for dichotomous and mean differences for continuous variables. A p-value ≤ 0.05 was considered significant. Trial sequential analysis was performed using TSA version 0.9.5.5 (Copenhagen: The Copenhagen Trial Unit, Center for Clinical Intervention Research, 2016). A total of 8 trials were included, with a total of 1043 patients (DTM: 518; dunking: 525). There was no significant difference between the two groups in terms of the overall as well as the clinically relevant POPF rate. Similarly, both groups were comparable for the secondary outcomes. Trial sequential analysis revealed that the required information size had been crossed without achieving a clinically significant difference for overall POPF; and though the required information size had not been achieved for CR-POPF, the current data have already crossed the futility line for CR-POPF with a 10% risk difference, 80% power and 5% α error. This meta-analysis found no significant difference between the two techniques in terms of overall and CR-POPF rates. Further, the existing evidence is sufficient to conclude a lack of difference, and further trials are unlikely to result in any change in the
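
    The pooled odds ratio at the heart of such a meta-analysis can be computed by inverse-variance weighting of per-trial log odds ratios, as sketched below; the 2x2 counts are placeholders, not data from the included trials, and the exact method used by RevMan may differ.

      # Fixed-effect (inverse-variance) pooling of odds ratios.
      import math

      trials = [(12, 60, 15, 58), (8, 70, 14, 72), (10, 55, 9, 50)]  # (e1, n1, e2, n2)

      num = den = 0.0
      for e1, n1, e2, n2 in trials:
          a, b, c, d = e1, n1 - e1, e2, n2 - e2    # cells of the 2x2 table
          log_or = math.log((a * d) / (b * c))
          w = 1.0 / (1 / a + 1 / b + 1 / c + 1 / d)  # inverse of the log-OR variance
          num += w * log_or
          den += w

      print("pooled OR:", round(math.exp(num / den), 3))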

  19. Technique for finding and identifying filters that cut off OTDR lights in front of ONU from a central office

    Science.gov (United States)

    Takaya, Masaaki; Honda, Hiroyasu; Narita, Yoshihiro; Yamamoto, Fumihiko; Arakawa, Koji

    2006-04-01

    We report on a newly developed in-service measurement technique that can be used from a central office to find and identify any filter in front of an ONU on an optical fiber access network. Using this system, in-service tests can be performed because the test lights are modulated at a high frequency. Moreover, by using the equipment we developed, this confirmation operation can be performed continuously and automatically with existing automatic fiber testing systems. The developed technique is effective for constructing a fiber line testing system with an optical time domain reflectometer.

  20. Identifying influential individuals on intensive care units: using cluster analysis to explore culture.

    Science.gov (United States)

    Fong, Allan; Clark, Lindsey; Cheng, Tianyi; Franklin, Ella; Fernandez, Nicole; Ratwani, Raj; Parker, Sarah Henrickson

    2017-07-01

    The objective of this paper is to identify attribute patterns of influential individuals in intensive care units using unsupervised cluster analysis. Despite the acknowledgement that culture of an organisation is critical to improving patient safety, specific methods to shift culture have not been explicitly identified. A social network analysis survey was conducted and an unsupervised cluster analysis was used. A total of 100 surveys were gathered. Unsupervised cluster analysis was used to group individuals with similar dimensions highlighting three general genres of influencers: well-rounded, knowledge and relational. Culture is created locally by individual influencers. Cluster analysis is an effective way to identify common characteristics among members of an intensive care unit team that are noted as highly influential by their peers. To change culture, identifying and then integrating the influencers in intervention development and dissemination may create more sustainable and effective culture change. Additional studies are ongoing to test the effectiveness of utilising these influencers to disseminate patient safety interventions. This study offers an approach that can be helpful in both identifying and understanding influential team members and may be an important aspect of developing methods to change organisational culture. © 2017 John Wiley & Sons Ltd.

  1. NREL Analysis Identifies Where Commercial Customers Might Benefit from Battery Energy Storage

    Science.gov (United States)

    August 24, 2017. After upfront costs, batteries may reduce operating costs for customers paying demand charges. Commercial electricity customers who are

  2. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H S; Kim, J H; Lee, J C; Choi, Y R; Moon, S S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems that distinguish them from other systems and that are important to the analysis. 3. A survey of the nuclear environmental factors that affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  3. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems that distinguish them from other systems and that are important to the analysis. 3. A survey of the nuclear environmental factors that affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
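
    As a small illustration of the first surveyed technique, the sketch below evaluates the top-event probability of a toy fault tree from its minimal cut sets, both by the rare-event approximation and by inclusion-exclusion. The component names and failure probabilities are hypothetical, not from the report.

        from itertools import combinations

        # Hypothetical basic-event failure probabilities for a robot subsystem.
        p = {"motor": 1e-3, "encoder": 5e-4, "controller": 2e-4, "cable": 1e-4}
        cut_sets = [{"motor"}, {"encoder", "controller"}, {"controller", "cable"}]

        def cut_prob(events):
            prob = 1.0
            for e in events:
                prob *= p[e]
            return prob

        # Rare-event approximation: sum of minimal cut-set probabilities.
        approx = sum(cut_prob(cs) for cs in cut_sets)

        # Exact value via inclusion-exclusion over unions of cut sets.
        exact = 0.0
        for k in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, k):
                exact += (-1) ** (k + 1) * cut_prob(set().union(*combo))

        print(f"top event: approx {approx:.3e}, inclusion-exclusion {exact:.3e}")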

  4. Exome Sequencing and Linkage Analysis Identified Novel Candidate Genes in Recessive Intellectual Disability Associated with Ataxia.

    Science.gov (United States)

    Jazayeri, Roshanak; Hu, Hao; Fattahi, Zohreh; Musante, Luciana; Abedini, Seyedeh Sedigheh; Hosseini, Masoumeh; Wienker, Thomas F; Ropers, Hans Hilger; Najmabadi, Hossein; Kahrizi, Kimia

    2015-10-01

    Intellectual disability (ID) is a neuro-developmental disorder which causes considerable socio-economic problems. Some individuals with ID are also affected by ataxia, and the condition involves different mutations affecting several genes. We used whole exome sequencing (WES) in combination with homozygosity mapping (HM) to identify the genetic defects in five consanguineous families from our cohort, each with two affected children showing ID and ataxia as major clinical symptoms. We identified three novel candidate genes, RIPPLY1, MRPL10, and SNX14, and a new mutation in the known gene SURF1. All are autosomal genes, except RIPPLY1, which is located on the X chromosome. Two are housekeeping genes, implicated in transcription and translation regulation and intracellular trafficking, and two encode mitochondrial proteins. The pathogenicity of these variants was evaluated by mutation classification, bioinformatic methods, review of medical and biological relevance, co-segregation studies in the particular family, and a normal population study. Linkage analysis and exome sequencing of a small number of affected family members is a powerful technique which can be used to narrow down the number of candidate genes in heterogeneous disorders such as ID, and may even identify the responsible gene(s).

  5. Latent cluster analysis of ALS phenotypes identifies prognostically differing groups.

    Directory of Open Access Journals (Sweden)

    Jeban Ganesalingam

    2009-09-01

    Full Text Available Amyotrophic lateral sclerosis (ALS) is a degenerative disease predominantly affecting motor neurons and manifesting as several different phenotypes. Whether these phenotypes correspond to different underlying disease processes is unknown. We used latent cluster analysis to identify groupings of clinical variables in an objective and unbiased way to improve phenotyping for clinical and research purposes. Latent class cluster analysis was applied to a large database consisting of 1467 records of people with ALS, using discrete variables which can be readily determined at the first clinic appointment. The model was tested for clinical relevance by survival analysis of the phenotypic groupings using the Kaplan-Meier method. The best model generated five distinct phenotypic classes that strongly predicted survival (p<0.0001). Eight variables were used for the latent class analysis, but a good estimate of the classification could be obtained using just two variables: site of first symptoms (bulbar or limb) and time from symptom onset to diagnosis (p<0.00001). The five phenotypic classes identified using latent cluster analysis can predict prognosis. They could be used to stratify patients recruited into clinical trials and to generate more homogeneous disease groups for genetic, proteomic and risk factor research.
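
    The survival comparison underlying such a study can be illustrated with a bare-bones Kaplan-Meier estimator; the Python sketch below uses invented survival times for two hypothetical phenotype classes, not the study's data.

        import numpy as np

        def kaplan_meier(time, event):
            """Kaplan-Meier survival curve; event=1 for death, 0 for censoring."""
            time, event = np.asarray(time, float), np.asarray(event, int)
            surv, s = {}, 1.0
            for t in np.unique(time[event == 1]):
                at_risk = np.sum(time >= t)         # still under observation at t
                deaths = np.sum((time == t) & (event == 1))
                s *= 1.0 - deaths / at_risk
                surv[float(t)] = s
            return surv

        # Illustrative survival times in months for two phenotype classes.
        class_a = kaplan_meier([10, 14, 20, 22, 30], [1, 1, 0, 1, 1])
        class_b = kaplan_meier([25, 30, 41, 55, 60], [1, 0, 1, 1, 0])
        print(class_a, class_b, sep="\n")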

  6. An overview of the RCA/IAEA activities in the Australasian region using nuclear analysis techniques for monitoring air pollution

    International Nuclear Information System (INIS)

    Markwitz, Andreas

    2005-01-01

    The International Atomic Energy Agency (IAEA), via the Regional Co-operative Agreement (RCA), has identified air particulate matter pollution as a major transboundary environmental issue in the Australasian region. Sixteen countries in the region, spanning from Pakistan to the Philippines and from China to New Zealand, are participating in the regional programme RAS/7/013, 'Improved information of urban air quality management in the RCA region', which started in 1997. New Zealand is the lead country for this programme, in which nuclear analytical techniques, such as particle induced X-ray emission (PIXE), neutron activation analysis (NAA) and X-ray fluorescence spectrometry (XRF), are used to measure key elements in the PM2.5-0 (fine) and PM10-2.5 (coarse) filter fractions from GENT stacked samplers collected twice weekly. Major sources of air particulate matter pollution are identified using statistical source apportionment techniques. To identify transboundary air particulate matter pollution events, the data are collated in a large database. Additionally, the data are used by end-users of the participating countries in the programme. An overview is presented. (author)

  7. Determination of the archaeological origin of ceramic fragments characterized by neutron activation analysis, by means of the application of multivariable statistical analysis techniques

    International Nuclear Information System (INIS)

    Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D.; Rodriguez G, N. L.

    2009-01-01

    The elemental composition of archaeological ceramic fragments obtained during the explorations in San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. The samples were irradiated in the TRIGA Mark III research reactor with a neutron flux of 1×10¹³ n·cm⁻²·s⁻¹. The irradiation time was 2 hours. Prior to the acquisition of the gamma-ray spectra, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, and Rb. The statistical treatment of the data, consisting of cluster analysis and principal components analysis, allowed three different origins of the archaeological ceramics to be identified, designated as local, foreign, and regional. (Author)

  8. The application of a shift theorem analysis technique to multipoint measurements

    OpenAIRE

    M. E. Dieckmann; S. C. Chapman

    1999-01-01

    A Fourier domain technique has been proposed previously which, in principle, quantifies the extent to which multipoint in-situ measurements can identify whether or not an observed structure is time stationary in its rest frame. Once a structure, sampled for example by four spacecraft, is shown to be quasi-stationary in its rest frame, the structure's velocity vector can be determined with respect to the sampling spacecraft. We investigate the properties of this technique, wh...

  9. An analysis of endothelial microparticles as a function of cell surface antibodies and centrifugation techniques.

    Science.gov (United States)

    Venable, Adam S; Williams, Randall R; Haviland, David L; McFarlin, Brian K

    2014-04-01

    Chronic vascular disease is partially characterized by the presence of lesions along the vascular endothelial wall. Current FDA-approved clinical techniques lack the ability to measure very early changes in endothelial cell health. When endothelial cells are damaged, they release endothelial microparticles (EMPs) into circulation. Thus, blood EMP concentration may represent a useful cardiovascular disease biomarker. Despite the potential value of EMPs, current flow cytometry techniques may not consistently distinguish EMPs from other small cell particles. The purpose of this study was to use imaging flow cytometry to modify existing methods of identifying EMPs based on cell-surface receptor expression and visual morphology. Platelet-poor plasma (PPP) was isolated using four different techniques, each utilizing a two-step serial centrifugation process. The cell-surface markers used in this study were selected based on those commonly reported in the literature. PPP (100 μL) was labeled with CD31, CD42a, CD45, CD51, CD66b, and CD144 for 30 min in the dark on ice. Based on replicated experiments, EMPs were best identified by cell-surface CD144 expression relative to other commonly reported EMP markers (CD31 and CD51). It is important to note that contaminating LMPs, GMPs, and PMPs were thought to be removed in the preparation of PPP. However, upon analysis of the prepared samples, staining CD31 against CD51 revealed a double-positive population of which less than 1% were EMPs. In contrast, when using CD144 to identify EMPs, ~87% of observed particles were free of contaminating microparticles. Using a counterstain of CD42a, this purity can be improved to over 99%. More research is needed to understand how our improved EMP measurement method can be used in experimental models measuring acute vascular responses or chronic vascular diseases. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Identifying factors affecting optimal management of agricultural water

    Directory of Open Access Journals (Sweden)

    Masoud Samian

    2015-01-01

    In addition to quantitative methodology such as descriptive statistics and factor analysis, a qualitative methodology was employed for dynamic simulation among variables through Vensim software. In this study, the factor analysis technique was used together with the Kaiser-Meyer-Olkin (KMO) and Bartlett tests. From the results, four key elements were identified as factors affecting the optimal management of agricultural water in the Hamedan area. These factors were institutional and legal factors, technical and knowledge factors, economic factors, and social factors.

  11. Meta-analysis of Drosophila circadian microarray studies identifies a novel set of rhythmically expressed genes.

    Directory of Open Access Journals (Sweden)

    Kevin P Keegan

    2007-11-01

    Full Text Available Five independent groups have reported microarray studies that identify dozens of rhythmically expressed genes in the fruit fly Drosophila melanogaster. Limited overlap among the lists of discovered genes makes it difficult to determine which, if any, exhibit truly rhythmic patterns of expression. We reanalyzed data from all five reports and found two sources for the observed discrepancies: the use of different expression pattern detection algorithms and underlying variation among the datasets. To improve upon the methods originally employed, we developed a new analysis that involves compilation of all existing data, application of identical transformation and standardization procedures followed by ANOVA-based statistical prescreening, and three separate classes of post hoc analysis: cross-correlation to various cycling waveforms, autocorrelation, and a previously described fast Fourier transform-based technique. Permutation-based statistical tests were used to derive significance measures for all post hoc tests. We find that application of our method, most notably the ANOVA prescreening procedure, significantly reduces the false discovery rate relative to that observed among the results of the original five reports while maintaining desirable statistical power. We identify a set of 81 cycling transcripts previously found in one or more of the original reports as well as a novel set of 133 transcripts not found in any of the original studies. We introduce a novel analysis method that compensates for variability observed among the original five Drosophila circadian array reports. Based on the statistical fidelity of our meta-analysis results, and the results of our initial validation experiments (quantitative RT-PCR), we predict many of our newly found genes to be bona fide cyclers, and suggest that they may lead to new insights into the pathways through which clock mechanisms regulate behavioral rhythms.
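
    One of the post hoc tests described, a Fourier-based periodicity measure with permutation-derived significance, can be sketched as follows; the expression series, sampling interval, and all parameters are synthetic illustrations.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic expression series: 48 samples taken every 4 h (two days).
        hours = np.arange(48) * 4.0
        series = (1.0 + 0.4 * np.cos(2 * np.pi * hours / 24)
                  + rng.normal(0, 0.2, hours.size))

        def power_at_24h(x, dt=4.0):
            """Spectral power at the circadian (24 h) frequency."""
            freqs = np.fft.rfftfreq(x.size, d=dt)        # cycles per hour
            spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
            return spec[np.argmin(np.abs(freqs - 1.0 / 24.0))]

        obs = power_at_24h(series)
        # Permutation test: shuffling destroys timing but keeps the values.
        null = np.array([power_at_24h(rng.permutation(series))
                         for _ in range(2000)])
        p_value = (1 + np.sum(null >= obs)) / (1 + null.size)
        print(f"24 h power = {obs:.2f}, permutation p = {p_value:.4f}")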

  12. Flash fluorescence with indocyanine green videoangiography to identify the recipient artery for bypass with distal middle cerebral artery aneurysms: operative technique.

    Science.gov (United States)

    Rodríguez-Hernández, Ana; Lawton, Michael T

    2012-06-01

    Distal middle cerebral artery (MCA) aneurysms frequently have nonsaccular morphology that necessitates trapping and bypass. Bypasses can be difficult because efferent arteries lie deep in the opercular cleft and may not be easily identifiable. We introduce the "flash fluorescence" technique, which uses videoangiography with indocyanine green (ICG) dye to identify an appropriate recipient artery on the cortical surface for the bypass, enabling a more superficial and easier anastomosis. Flash fluorescence requires 3 steps: (1) temporary clip occlusion of the involved afferent artery; (2) videoangiography demonstrating fluorescence in uninvolved arteries on the cortical surface; and (3) removal of the temporary clip with flash fluorescence in the involved efferent arteries on the cortical surface, thereby identifying a recipient. Alternatively, temporary clips can occlude uninvolved arteries, and videoangiography will demonstrate initial fluorescence in efferent arteries during temporary occlusion and flash fluorescence in uninvolved arteries during reperfusion. From a consecutive series of 604 MCA aneurysms treated microsurgically, 22 (3.6%) were distal aneurysms and 11 required a bypass. The flash fluorescence technique was used in 3 patients to select the recipient artery for 2 superficial temporal artery-to-MCA bypasses and 1 MCA-MCA bypass. The correct recipient was selected in all cases. The flash fluorescence technique provides quick, reliable localization of an appropriate recipient artery for bypass when revascularization is needed for a distal MCA aneurysm. This technique eliminates the need for extensive dissection of the efferent artery and enables a superficial recipient site that makes the anastomosis safer, faster, and less demanding.

  13. Domain-restricted mutation analysis to identify novel driver events in human cancer

    Directory of Open Access Journals (Sweden)

    Sanket Desai

    2017-10-01

    Full Text Available Analysis of mutational spectra across various cancer types has given valuable insights into tumorigenesis. Different approaches have been used to identify novel drivers from the set of somatic mutations, including methods that use sequence conservation, geometric localization and pathway information. Recent computational methods suggest the use of protein domain information for analysis and understanding of the functional consequences of non-synonymous mutations. Similarly, evidence suggests that recurrence at specific positions in proteins is a robust indicator of functional impact. Building on this, we performed a systematic analysis of TCGA exome-derived somatic mutations across 6089 PFAM domains, and significantly mutated domains were identified using a randomization approach. Multiple alignment of individual domains allowed us to prioritize conserved residues mutated at analogous positions across different proteins in a statistically disciplined manner. In addition to the known frequently mutated genes, this analysis independently identifies the low-frequency Meprin and TRAF-Homology (MATH) domain in the Speckle Type BTB/POZ (SPOP) protein in prostate adenocarcinoma. Results from this analysis will help generate hypotheses about the downstream molecular mechanisms resulting in cancer phenotypes.

  14. A human reliability analysis (HRA) method for identifying and assessing the error of commission (EOC) from a diagnosis failure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Yun; Kang, Dae Il

    2005-01-01

    The study deals with a method for systematically identifying and assessing the EOC events that might be caused by a diagnosis failure or misdiagnosis of the expected events in accident scenarios of nuclear power plants. The method for EOC identification and assessment consists of three steps: analysis of the potential for a diagnosis failure (or misdiagnosis), identification of the EOC events arising from the diagnosis failure, and quantitative assessment of the identified EOC events. As a tool for analysing a diagnosis failure, the MisDiagnosis Tree Analysis (MDTA) technique is proposed, together with a taxonomy of misdiagnosis causes. Guidance on the identification of EOC events and a classification system and data for quantitative assessment are also given. As an application of the proposed method, the identification and assessment of EOCs for the Younggwang 3 and 4 plants and their impact on the plant risk were performed. As a result, six events or event sequences were considered for diagnosis failures and about 20 new Human Failure Events (HFEs) involving EOCs were identified. According to the assessment of the risk impact of the identified HFEs, they increase the CDF by 11.4% of the current CDF value, which corresponds to 10.2% of the new CDF. The small loss of coolant accident (SLOCA) turned out to be the major contributor to the increase of CDF, accounting for a 9.2% increase of the current CDF.

  15. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  16. High Resolution Spatio Temporal Moments Analysis of Solute Migration Captured using Pre-clinical Medical Imaging Techniques

    Science.gov (United States)

    Dogan, M.; Moysey, S. M.; Powell, B. A.; DeVol, T. A.

    2016-12-01

    Advances in medical imaging technologies are continuously expanding the range of applications enabled within the earth sciences. While computed x-ray tomography (CT) scans have traditionally been used for investigating the structure of geologic materials, it is now possible to perform 3D time-lapse imaging of dynamic processes, such as monitoring the infiltration of water into a soil, with sub-millimeter resolution. Likewise, single photon emission computed tomography (SPECT) can provide information on the evolution of solute transport with spatial resolution on the order of a millimeter by tracking the migration of gamma-ray emitting isotopes like 99mTc and 111In. While these imaging techniques are revolutionizing our ability to look within porous media, techniques for the analysis of such rich and large data sets are limited. The spatial and temporal moments of a plume have long been used to provide quantitative measures to describe plume movement in a wide range of settings from the lab to field. Moment analysis can also be used to estimate the hydrologic properties of the porous media. In this research, we investigate the use of moments for analyzing a high resolution 4D SPECT data set collected during a 99mTc transport experiment performed in a heterogeneous column. The 4D nature of the data set makes it amenable to the use of data mining and pattern recognition methods, such as cluster analysis, to identify regions or zones within the data that exhibit abnormal or unexpected behaviors. We then compare anomalous features within the SPECT data to similar features identified within the CT image to relate the flow behavior to pore-scale structures, such as porosity differences and macropores. Such comparisons help to identify whether these features are good predictors of preferential transport. Likewise, we evaluate whether local analysis of moments can be used to infer apparent parameters governing non-conservative transport in a heterogeneous porous media, such
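
    For a gridded concentration field, the spatial moments mentioned above reduce to weighted sums; this sketch computes the zeroth moment (a total-mass proxy), the first moment (centroid), and the second central moments (spread) of a synthetic 3D plume. All values are invented stand-ins for a SPECT frame.

        import numpy as np

        rng = np.random.default_rng(1)

        dx = 1.0                                 # voxel size, mm
        conc = rng.random((40, 40, 80)) * 0.01   # background noise
        conc[18:22, 18:22, 30:50] += 1.0         # synthetic plume core

        coords = np.indices(conc.shape) * dx     # physical voxel coordinates

        m0 = conc.sum()                                       # zeroth moment
        center = (coords * conc).sum(axis=(1, 2, 3)) / m0     # centroid
        spread = (((coords - center[:, None, None, None]) ** 2) * conc
                  ).sum(axis=(1, 2, 3)) / m0                  # variance per axis

        print(f"mass={m0:.1f}, centroid(mm)={center.round(2)}, "
              f"variance(mm^2)={spread.round(2)}")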

  17. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of the conditions of a major spill by pouring crude oil on the cells from perforated cans, and the in-situ bioremediation of the polluted soils using techniques that consisted of the manipulation of different variables within the soil environment. The analysis of soil characteristics after a ...

  18. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radiolabelling of a molecule for two purposes, radioimmunoassay and autoradiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha, beta and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and the production of labelled molecules: gamma emitters (125I, 57Co); beta emitters; production of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel filtration or size-exclusion chromatography, high performance liquid chromatography); quality assessment of the labelled compound (labelling efficiency calculation, immunoreactivity conservation, stability and preservation). (J.S.)

  19. Structural and practical identifiability analysis of S-system.

    Science.gov (United States)

    Zhan, Choujun; Li, Benjamin Yee Shing; Yeung, Lam Fat

    2015-12-01

    In the field of systems biology, biological reaction networks are usually modelled by ordinary differential equations. A sub-class, the S-system representation, is a widely used form of modelling. Existing S-system identification techniques assume that the system itself is always structurally identifiable. However, due to practical limitations, biological reaction networks are often only partially measured. In addition, the captured data only cover a limited trajectory; therefore the data can only be considered a local snapshot of the system responses with respect to the complete set of state trajectories over the entire state space. Hence the estimated model can only reflect partial system dynamics and may not be unique. To improve the identification quality, the structural and practical identifiability of the S-system are studied. The S-system is shown to be identifiable under a set of assumptions. An application to the yeast fermentation pathway was then conducted. Two case studies were chosen, the first based on a larger set of state trajectories and the second on a smaller one. By expanding the dataset to span a relatively larger state space, the uncertainty of the estimated system can be reduced. The results indicated that the initial concentration is related to the practical identifiability.

  20. A review of second law techniques applicable to basic thermal science research

    Science.gov (United States)

    Drost, M. Kevin; Zamorski, Joseph R.

    1988-11-01

    This paper reports the results of a review of second law analysis techniques which can contribute to basic research in the thermal sciences. The review demonstrated that second law analysis has a role in basic thermal science research. Unlike traditional techniques, second law analysis accurately identifies the sources and location of thermodynamic losses. This allows the development of innovative solutions to thermal science problems by directing research to the key technical issues. Two classes of second law techniques were identified as being particularly useful. First, system and component investigations can provide information of the source and nature of irreversibilities on a macroscopic scale. This information will help to identify new research topics and will support the evaluation of current research efforts. Second, the differential approach can provide information on the causes and spatial and temporal distribution of local irreversibilities. This information enhances the understanding of fluid mechanics, thermodynamics, and heat and mass transfer, and may suggest innovative methods for reducing irreversibilities.
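
    As a one-equation illustration of how second law analysis localizes a loss, the sketch below computes the entropy generated by heat transfer across a finite temperature difference and the corresponding lost work via the Gouy-Stodola relation; the numbers are arbitrary.

        # Entropy generation for heat flow Q across a temperature difference,
        # and lost work W_lost = T0 * S_gen (Gouy-Stodola). Illustrative values.
        Q = 1000.0      # W, heat transfer rate
        T_hot = 600.0   # K, source temperature
        T_cold = 400.0  # K, sink temperature
        T0 = 300.0      # K, ambient (dead-state) temperature

        S_gen = Q * (1.0 / T_cold - 1.0 / T_hot)   # W/K, entropy generation rate
        W_lost = T0 * S_gen                        # W of exergy destroyed
        print(f"S_gen = {S_gen:.3f} W/K, lost work = {W_lost:.1f} W")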

  1. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Bihn T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.

  2. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
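
    A minimal sketch of the control-charting idea applied to thermocouple readings: a Shewhart individuals chart whose limits come from the average moving range. The data are synthetic, with a drifting tail imitating a failing sensor; this is not the NDMAS implementation.

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic readings (deg C); the last five points drift upward.
        temps = np.concatenate([rng.normal(1100, 5, 60),
                                rng.normal(1130, 5, 5)])

        baseline = temps[:50]                  # in-control reference window
        mr = np.abs(np.diff(baseline))         # moving ranges of size 2
        sigma = mr.mean() / 1.128              # d2 = 1.128 for n = 2
        center = baseline.mean()
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        flagged = np.where((temps > ucl) | (temps < lcl))[0]
        print(f"limits=({lcl:.1f}, {ucl:.1f}), "
              f"out-of-control at {flagged.tolist()}")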

  3. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures that can be removed without damaging existing structures as the historical masonry. The digital image analysis was done using software specifically designed in Matlab to follow the tracking of the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) have allowed the validation of the procedure elaborated by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  4. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
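
    The merging rule can be caricatured in one dimension: fuse adjacent elements while their signals agree within errors. The toy sketch below is an analogue of that idea only, not the BaTMAn algorithm itself; the data, errors, and threshold k are invented.

        import numpy as np

        values = np.array([5.0, 5.2, 5.1, 9.8, 10.1, 10.0, 4.9])
        errors = np.full_like(values, 0.3)
        k = 3.0                                 # consistency threshold, in sigmas

        segments = [[i] for i in range(values.size)]
        merged = True
        while merged:
            merged = False
            for i in range(len(segments) - 1):
                a, b = segments[i], segments[i + 1]
                sa = errors[a].mean() / np.sqrt(len(a))  # error of segment mean
                sb = errors[b].mean() / np.sqrt(len(b))
                if abs(values[a].mean() - values[b].mean()) < k * np.hypot(sa, sb):
                    segments[i:i + 2] = [a + b]  # fuse consistent neighbours
                    merged = True
                    break

        print([(seg, round(float(values[seg].mean()), 2)) for seg in segments])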

  5. Development of a computerized method for identifying the posteroanterior and lateral views of chest radiographs by use of a template matching technique

    International Nuclear Information System (INIS)

    Arimura, Hidetaka; Katsuragawa, Shigehiko; Li Qiang; Ishida, Takayuki; Doi, Kunio

    2002-01-01

    In picture archiving and communications systems (PACS) or digital archiving systems, the information on the posteroanterior (PA) and lateral views for chest radiographs is often not recorded or is recorded incorrectly. However, it is necessary to identify the PA or lateral view correctly and automatically for quantitative analysis of chest images for computer-aided diagnosis. Our purpose in this study was to develop a computerized method for correctly identifying either PA or lateral views of chest radiographs. Our approach is to examine the similarity of a chest image with templates that represent the average chest images of the PA or lateral view for various types of patients. By use of a template matching technique with nine template images for patients of different size in two steps, correlation values were obtained for determining whether a chest image is either a PA or a lateral view. The templates for PA and lateral views were prepared from 447 PA and 200 lateral chest images. For a validation test, this scheme was applied to 1,000 test images consisting of 500 PA and 500 lateral chest radiographs, which are different from training cases. In the first step, 924 (92.4%) of the cases were correctly identified by comparison of the correlation values obtained with the three templates for medium-size patients. In the second step, the correlation values with the six templates for small and large patients were compared, and all of the remaining unidentifiable cases were identified correctly
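
    The heart of such a scheme is a similarity score between the unknown image and each template; a normalized cross-correlation is the usual choice. In the sketch below, random arrays stand in for the downsampled chest templates, so only the mechanics are illustrated.

        import numpy as np

        def ncc(image, template):
            """Normalized cross-correlation of two equal-size arrays."""
            a = image - image.mean()
            b = template - template.mean()
            return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

        rng = np.random.default_rng(3)
        template_pa = rng.random((64, 64))       # stand-in PA template
        template_lat = rng.random((64, 64))      # stand-in lateral template
        test_image = template_pa + rng.normal(0, 0.1, (64, 64))  # noisy PA case

        scores = {"PA": ncc(test_image, template_pa),
                  "lateral": ncc(test_image, template_lat)}
        print(scores, "->", max(scores, key=scores.get))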

  6. Control charts technique - a tool to data analysis for chemical experiments

    International Nuclear Information System (INIS)

    Yadav, M.B.; Venugopal, V.

    1999-01-01

    A procedure using control charts technique has been developed to analyse data of a chemical experiment which was conducted to assign a value to uranium content in Rb 2 U(SO 4 ) 3 . A value of (34.164 ± 0.031)% has been assigned against (34.167 ± 0.042)% already assigned by analysis of variance (ANOVA) technique. These values do not differ significantly. Merits and demerits of the two techniques have been discussed. (author)

  7. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    Science.gov (United States)

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring - gross body path, splash area, and board tip motion - to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than with the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization. Copyright © 2014 Elsevier B.V. All rights reserved.
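
    A compressed sketch of the pipeline described: PCA over flattened posture sequences, then linear regression of judges' scores on the component weights plus extra features. All arrays are synthetic placeholders, not the USA Diving data, and the feature set is a simplification.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)

        # 40 dives, each flattened to 60 time steps x 10 joint angles.
        X = rng.normal(size=(40, 600))
        scores = rng.uniform(3, 9, size=40)      # judges' scores, 0-10 scale

        weights = PCA(n_components=5).fit_transform(X)  # eigenposture weights
        extras = rng.normal(size=(40, 2))  # stand-ins for splash area, board tip
        features = np.hstack([weights, extras])

        model = LinearRegression().fit(features, scores)
        print("R^2 on training data:", round(model.score(features, scores), 3))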

  8. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  9. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, predicting grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can monitor grasshoppers more exactly, has advantages in measuring the degree of damage and classifying damaged areas, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyperspectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that spectral analysis techniques can be used as a quick and exact tool in monitoring and forecasting grasshopper infestations, and will become an important means in such research for their advantages in determining spatial orientation, information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
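
    The NDVI referred to above is a simple band ratio, (NIR - Red) / (NIR + Red); the sketch below applies it to invented red and near-infrared reflectance grids and flags low-vegetation cells, with the 0.2 threshold chosen arbitrarily for illustration.

        import numpy as np

        red = np.array([[0.10, 0.12, 0.30],
                        [0.08, 0.11, 0.28],
                        [0.09, 0.10, 0.25]])
        nir = np.array([[0.45, 0.50, 0.32],
                        [0.48, 0.47, 0.30],
                        [0.46, 0.44, 0.31]])

        ndvi = (nir - red) / (nir + red)
        # Low NDVI can flag sparse or damaged vegetation, one forecast input.
        print(np.round(ndvi, 2))
        print("low-vegetation cells:", np.argwhere(ndvi < 0.2).tolist())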

  10. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For lead-210 analysis, the procedure of digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested.

  11. Robust Selection Algorithm (RSA) for Multi-Omic Biomarker Discovery; Integration with Functional Network Analysis to Identify miRNA Regulated Pathways in Multiple Cancers.

    Science.gov (United States)

    Sehgal, Vasudha; Seviour, Elena G; Moss, Tyler J; Mills, Gordon B; Azencott, Robert; Ram, Prahlad T

    2015-01-01

    MicroRNAs (miRNAs) play a crucial role in the maintenance of cellular homeostasis by regulating the expression of their target genes. As such, the dysregulation of miRNA expression has been frequently linked to cancer. With rapidly accumulating molecular data linked to patient outcome, the need for identification of robust multi-omic molecular markers is critical in order to provide clinical impact. While previous bioinformatic tools have been developed to identify potential biomarkers in cancer, these methods do not allow for rapid classification of oncogenes versus tumor suppressors taking into account robust differential expression, cutoffs, p-values and non-normality of the data. Here, we propose a methodology, Robust Selection Algorithm (RSA) that addresses these important problems in big data omics analysis. The robustness of the survival analysis is ensured by identification of optimal cutoff values of omics expression, strengthened by p-value computed through intensive random resampling taking into account any non-normality in the data and integration into multi-omic functional networks. Here we have analyzed pan-cancer miRNA patient data to identify functional pathways involved in cancer progression that are associated with selected miRNA identified by RSA. Our approach demonstrates the way in which existing survival analysis techniques can be integrated with a functional network analysis framework to efficiently identify promising biomarkers and novel therapeutic candidates across diseases.
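
    A stripped-down analogue of the cutoff-plus-resampling idea: scan candidate expression cutoffs for the largest survival separation, then repeat the entire search on permuted data so the p-value accounts for the search itself. The data, separation statistic, and cutoff grid are illustrative simplifications of the published RSA.

        import numpy as np

        rng = np.random.default_rng(5)

        expr = rng.normal(0, 1, 80)                         # synthetic expression
        surv = 24 + 10 * (expr < 0) + rng.normal(0, 4, 80)  # low expr lives longer

        def best_cutoff(expr, surv):
            """Return the cutoff maximizing the survival separation."""
            best = (None, 0.0)
            for c in np.quantile(expr, np.linspace(0.2, 0.8, 25)):
                lo, hi = surv[expr <= c], surv[expr > c]
                stat = abs(lo.mean() - hi.mean())
                if stat > best[1]:
                    best = (c, stat)
            return best

        cut, stat = best_cutoff(expr, surv)
        # Null distribution: redo the whole cutoff search on shuffled survival.
        null = [best_cutoff(expr, rng.permutation(surv))[1] for _ in range(500)]
        p = (1 + sum(n >= stat for n in null)) / (1 + len(null))
        print(f"cutoff={cut:.2f}, separation={stat:.1f} months, p={p:.4f}")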

  12. Clinical Characteristics of Exacerbation-Prone Adult Asthmatics Identified by Cluster Analysis.

    Science.gov (United States)

    Kim, Mi Ae; Shin, Seung Woo; Park, Jong Sook; Uh, Soo Taek; Chang, Hun Soo; Bae, Da Jeong; Cho, You Sook; Park, Hae Sim; Yoon, Ho Joo; Choi, Byoung Whui; Kim, Yong Hoon; Park, Choon Sik

    2017-11-01

    Asthma is a heterogeneous disease characterized by various types of airway inflammation and obstruction. Therefore, it is classified into several subphenotypes, such as early-onset atopic, obese non-eosinophilic, benign, and eosinophilic asthma, using cluster analysis. A number of asthmatics frequently experience exacerbation over a long-term follow-up period, but the exacerbation-prone subphenotype has rarely been evaluated by cluster analysis. This prompted us to identify clusters reflecting asthma exacerbation. A uniform cluster analysis method was applied to 259 adult asthmatics who were regularly followed-up for over 1 year using 12 variables, selected on the basis of their contribution to asthma phenotypes. After clustering, clinical profiles and exacerbation rates during follow-up were compared among the clusters. Four subphenotypes were identified: cluster 1 comprised patients with early-onset atopic asthma with preserved lung function; cluster 2, late-onset non-atopic asthma with impaired lung function; cluster 3, early-onset atopic asthma with severely impaired lung function; and cluster 4, late-onset non-atopic asthma with well-preserved lung function. The patients in clusters 2 and 3 were identified as exacerbation-prone asthmatics, showing a higher risk of asthma exacerbation. Two different phenotypes of exacerbation-prone asthma were identified among Korean asthmatics using cluster analysis; both were characterized by impaired lung function, but the age at asthma onset and atopic status were different between the two. Copyright © 2017 The Korean Academy of Asthma, Allergy and Clinical Immunology · The Korean Academy of Pediatric Allergy and Respiratory Disease
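
    A uniform clustering workflow of the kind described can be sketched with standardized variables and k-means; the clinical variables below are synthetic stand-ins, and the study's own algorithm and variable set may differ.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(6)

        # Synthetic stand-ins: onset age, FEV1 %predicted, eosinophils, IgE.
        X = np.column_stack([
            rng.normal(40, 15, 259),
            rng.normal(75, 20, 259),
            rng.lognormal(5, 1, 259),
            rng.lognormal(4, 1.2, 259),
        ])

        Xz = StandardScaler().fit_transform(X)   # comparable variable scales
        km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(Xz)
        print("cluster sizes:", np.bincount(km.labels_).tolist())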

  13. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques and then examines the model order reduction approach and its significance. Traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. Bond graphs are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a technique for generating a genetic design from the tree-structured transfer function obtained from a bond graph. The work combines bond graphs for model representation with genetic programming for exploring the design space; the tree-structured transfer function results from replacing typical bond graph elements with their impedance equivalents, specifying impedance laws for the bond graph multiports. The tree-structured form thus obtained from the bond graph is applied to generating the genetic tree. Application studies will identify key issues important for advancing this approach towards becoming an effective and efficient design tool for synthesizing designs for electrical systems. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function are analyzed with the conventional and Bond Graph methods, and an approach towards model order reduction is then considered. The suggested algorithm and other known modern model order reduction techniques are applied, with different approaches, to an 11th order high pass filter [1]. The model order reduction technique developed in this paper has the smallest reduction errors and, secondly, the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and by the Bond Graph method are compared and

  14. Book Review: Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects

    Directory of Open Access Journals (Sweden)

    Thomas Nash

    2013-06-01

    Full Text Available Shavers, B. (2013). Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Waltham, MA: Elsevier, 290 pages, ISBN-978-1-59749-985-9, US$51.56. Includes bibliographical references and index. Reviewed by Detective Corporal Thomas Nash (tnash@bpdvt.org, Burlington Vermont Police Department, Internet Crime against Children Task Force; Adjunct Instructor, Champlain College, Burlington VT). In this must-read for any aspiring novice cybercrime investigator as well as the seasoned professional computer guru alike, Brett Shavers takes the reader into the ever changing and dynamic world of cybercrime investigation. Shavers, an experienced criminal investigator, lays out the details and intricacies of a computer related crime investigation in a clear and concise manner in his new easy to read publication, Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Shavers takes the reader from start to finish through each step of the investigative process in well organized and easy to follow sections, with real case file examples, to reach the ultimate goal of any investigation: identifying the suspect and proving their guilt in the crime. Do not be fooled by the title. This excellent, easily accessible reference is beneficial to both criminal as well as civil investigations and should be in every investigator's library regardless of their respective criminal or civil investigative responsibilities. (see PDF for full review)

  15. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle induced X-ray emission and synchrotron-produced X-ray fluorescence, are also briefly discussed.

  16. Provenance Establishment of Stingless Bee Honey Using Multi-element Analysis in Combination with Chemometrics Techniques.

    Science.gov (United States)

    Shadan, Aidil Fahmi; Mahat, Naji A; Wan Ibrahim, Wan Aini; Ariffin, Zaiton; Ismail, Dzulkiflee

    2018-01-01

    As consumption of stingless bee honey has been gaining popularity in many countries, including Malaysia, the ability to accurately identify its geographical origin is pertinent for investigating fraudulent activities and protecting consumers. Because a chemical signature can be location-specific, multi-element distribution patterns may prove useful for provenancing such products. Using inductively coupled plasma optical emission spectrometry together with principal component analysis (PCA) and linear discriminant analysis (LDA), the distributions of multiple elements in stingless bee honey collected at four different geographical locations (North, West, East, and South) in Johor, Malaysia, were investigated. While cross-validation using PCA demonstrated an 87.0% correct classification rate, this was improved (96.2%) with the use of LDA, indicating that discrimination was possible for the different geographical regions. Therefore, utilization of multi-element analysis coupled with chemometrics techniques for assigning the provenance of stingless bee honeys for forensic applications is supported. © 2017 American Academy of Forensic Sciences.
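
    The LDA classification step, with the cross-validation used to estimate the correct classification rate, can be sketched as follows; the element concentrations, regional shifts, and resulting accuracy are synthetic illustrations, not the paper's data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)

        # 80 honeys from 4 regions, 5 element concentrations each; each region
        # gets a slightly shifted elemental signature.
        regions = np.repeat([0, 1, 2, 3], 20)
        X = rng.normal(0, 1, (80, 5)) + regions[:, None] * 0.8

        acc = cross_val_score(LinearDiscriminantAnalysis(), X, regions, cv=5)
        print(f"cross-validated classification rate: {acc.mean():.1%}")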

  17. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.
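
    The two item statistics mentioned are straightforward to compute; the sketch below derives item facility and an upper-lower discrimination index from a synthetic 0/1 response matrix and keeps the 'best' items, with all thresholds chosen arbitrarily for illustration.

        import numpy as np

        rng = np.random.default_rng(8)

        # Synthetic responses: 100 examinees x 20 cloze items (1 = correct).
        responses = (rng.random((100, 20))
                     < rng.uniform(0.3, 0.9, 20)).astype(int)
        total = responses.sum(axis=1)

        facility = responses.mean(axis=0)    # proportion answering correctly

        # Discrimination: p(correct) in the top 27% minus the bottom 27%.
        k = int(0.27 * len(total))
        upper = responses[np.argsort(total)[-k:]].mean(axis=0)
        lower = responses[np.argsort(total)[:k]].mean(axis=0)
        discrimination = upper - lower

        keep = np.where((facility > 0.3) & (facility < 0.7)
                        & (discrimination > 0.2))[0]
        print("items retained for the tailored cloze:", keep.tolist())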

  18. Sentiment Analysis in Geo Social Streams by using Machine Learning Techniques

    OpenAIRE

    Twanabasu, Bikesh

    2018-01-01

    Master's thesis, Erasmus Mundus Master's Programme in Geospatial Technologies (2013 syllabus). Code: SIW013. Academic year 2017-2018. Massive amounts of sentiment-rich data are generated on social media in the form of tweets, status updates, blog posts, reviews, etc. Different people and organizations use this user-generated content for decision making. Symbolic techniques (knowledge-based approaches) and machine learning techniques are the two main techniques used for sentiment analysis...

  19. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
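
    The database-matching step can be sketched as plain Bayesian updating: score every entry of the precomputed reactor-physics database against the measured ratios, assuming independent Gaussian measurement errors, and normalize. The database rows, ratio choices and error magnitudes below are invented placeholders, not values from NOVA.

        import numpy as np

        def posterior_over_db(measured, sigma, db_ratios, prior=None):
            """Posterior probability of each database entry (rows = candidate fuel
            parameter sets, columns = isotopic ratios), with Gaussian errors."""
            db = np.asarray(db_ratios, float)
            prior = np.ones(len(db)) / len(db) if prior is None else np.asarray(prior)
            log_like = -0.5 * np.sum(((db - measured) / sigma) ** 2, axis=1)
            post = prior * np.exp(log_like - log_like.max())
            return post / post.sum()

        # Placeholder database: each row holds (Kr-84/Kr-86, Xe-132/Xe-134) for one
        # hypothetical (fuel type, burnup) combination.
        db = np.array([[0.49, 1.60], [0.52, 1.72], [0.55, 1.85]])
        post = posterior_over_db(measured=[0.53, 1.74], sigma=[0.02, 0.05], db_ratios=db)
        print("Posterior over candidate fuel parameter sets:", post.round(3))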

  20. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
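
    As a sketch of the first technique on the list, the snippet below fits a LOESS curve to one sampled input-output scatter (statsmodels assumed; the "model" is a stand-in function) and uses the variance of the fitted trend, relative to the total output variance, as a crude importance indicator in the spirit of the procedures described.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(2)
        x = rng.uniform(0, 1, 300)                                 # sampled model input
        y = np.sin(2 * np.pi * x) ** 2 + rng.normal(0, 0.1, 300)   # model prediction

        smooth = lowess(y, x, frac=0.3, return_sorted=True)        # (x, trend) pairs

        # Crude sensitivity indicator, analogous in spirit to a first-order effect
        explained = np.var(smooth[:, 1]) / np.var(y)
        print(f"Fraction of output variance captured by the LOESS trend: {explained:.2f}")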

  1. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis, and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for observing most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  2. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab

  3. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt); El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams Universty, Cairo (Egypt)

    1997-12-31

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  4. The composite sequential clustering technique for analysis of multispectral scanner data

    Science.gov (United States)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
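
    A minimal sketch of the two-part idea, assuming scikit-learn: a cheap one-pass rule proposes initial clusters (a stand-in for the sequential variance analysis; the distance threshold is an assumed knob), and these seed a generalized K-means refinement.

        import numpy as np
        from sklearn.cluster import KMeans

        def sequential_seeds(X, threshold):
            """One-pass seeding: open a new cluster whenever a sample lies farther
            than `threshold` from every existing center."""
            centers = [X[0]]
            for x in X[1:]:
                if min(np.linalg.norm(x - c) for c in centers) > threshold:
                    centers.append(x)
            return np.array(centers)

        rng = np.random.default_rng(3)
        X = np.vstack([rng.normal(m, 0.3, (100, 4)) for m in (0.0, 2.0, 4.0)])
        seeds = sequential_seeds(X, threshold=1.5)
        km = KMeans(n_clusters=len(seeds), init=seeds, n_init=1).fit(X)
        print(len(seeds), "initial clusters refined by K-means; inertia:",
              round(km.inertia_, 2))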

  5. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues, such as route selection and design, are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis.

  6. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method were post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. Surprisingly, one of the simplest and fastest techniques produced some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
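
    The integral-ratio idea fits in a few lines: integrate the late "tail" of each digitized pulse and divide by the total integral; neutron pulses in BC501A carry a larger slow scintillation component, so their ratio comes out higher. The window lengths and the synthetic two-exponential pulses below are illustrative assumptions, not the paper's parameters.

        import numpy as np

        def integral_ratio(pulse, gap=10, tail_len=80):
            """Tail/total charge ratio for a baseline-subtracted digitized pulse.
            Windows are in samples and purely illustrative."""
            peak = int(np.argmax(pulse))
            total = pulse[peak - 5 : peak + tail_len].sum()
            tail = pulse[peak + gap : peak + tail_len].sum()
            return tail / total

        t = np.arange(200.0)

        def make_pulse(slow_frac, t0=20.0, fast=4.0, slow=40.0):
            """Two-exponential scintillation pulse with a given slow fraction."""
            p = np.zeros_like(t)
            m = t >= t0
            p[m] = ((1 - slow_frac) * np.exp(-(t[m] - t0) / fast)
                    + slow_frac * np.exp(-(t[m] - t0) / slow))
            return p

        print("gamma-like ratio:  ", round(integral_ratio(make_pulse(0.02)), 3))
        print("neutron-like ratio:", round(integral_ratio(make_pulse(0.25)), 3))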

  7. Optimising Regionalisation Techniques: Identifying Centres of Endemism in the Extraordinarily Endemic-Rich Cape Floristic Region

    Science.gov (United States)

    Bradshaw, Peter L.; Colville, Jonathan F.; Linder, H. Peter

    2015-01-01

    We used a very large dataset (>40% of all species) from the endemic-rich Cape Floristic Region (CFR) to explore the impact of different weighting techniques, coefficients to calculate similarity among the cells, and clustering approaches on biogeographical regionalisation. The results were used to revise the biogeographical subdivision of the CFR. We show that weighted data (down-weighting widespread species), similarity calculated using Kulczynski's second measure, and clustering using UPGMA resulted in the optimal classification. This maximized the number of endemic species, the number of centres recognized, and the operational geographic units assigned to centres of endemism (CoEs). We developed a dendrogram branch order cut-off (BOC) method to locate the optimal cut-off points on the dendrogram to define candidate clusters. Dendrograms based on Kulczynski's second measure were combined using consensus, identifying areas of conflict which could be due to biotic element overlap or transitional areas. Post-clustering GIS manipulation substantially enhanced the endemic composition and geographic size of candidate CoEs. Although there was broad spatial congruence with previous phytogeographic studies, our techniques allowed for the recovery of additional phytogeographic detail not previously described for the CFR. PMID:26147438
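
    A minimal sketch of that winning combination, assuming SciPy and presence/absence data (rows = grid cells, columns = species; the random matrix is a placeholder). Kulczynski's second similarity is S = 0.5*(a/(a+b) + a/(a+c)), where a counts shared species and b, c the species unique to each cell; it is converted to a distance and clustered with UPGMA (average linkage).

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        def kulczynski2_distance(X):
            """1 - Kulczynski's second similarity for presence/absence rows."""
            X = np.asarray(X, bool)
            n = len(X)
            D = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    a = np.sum(X[i] & X[j])
                    b = np.sum(X[i] & ~X[j])
                    c = np.sum(~X[i] & X[j])
                    s = 0.5 * (a / (a + b) + a / (a + c)) if a else 0.0
                    D[i, j] = D[j, i] = 1.0 - s
            return D

        rng = np.random.default_rng(4)
        sites = rng.random((30, 120)) < 0.15      # 30 grid cells x 120 species
        Z = linkage(squareform(kulczynski2_distance(sites)), method="average")  # UPGMA
        print("Cluster labels:", fcluster(Z, t=4, criterion="maxclust"))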

  8. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.

  9. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress, it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid, or where departures in material properties caused by manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA-based approach for the evaluation of residual stresses and to provide some examples where promising results have been obtained.

  10. A review of residual stress analysis using thermoelastic techniques

    International Nuclear Information System (INIS)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S; Burguete, R L

    2009-01-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress, it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid, or where departures in material properties caused by manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA-based approach for the evaluation of residual stresses and to provide some examples where promising results have been obtained.
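
    For reference, the linear relationship in question is dT = -K*T0*d(sigma1 + sigma2): the temperature change tracks the change in the first stress invariant, so a static mean (residual) stress cancels out of it, which is why plain TSA cannot read residual stress directly. A minimal sketch under that standard relation; the thermoelastic constant below is an assumed order-of-magnitude value for steel.

        # Linear thermoelastic relation: dT = -K * T0 * d(sigma1 + sigma2),
        # valid for elastic, adiabatic, cyclic loading.
        K_STEEL = 3.5e-12   # 1/Pa, order-of-magnitude value for steel (assumed)
        T0 = 293.0          # ambient absolute temperature, K

        def stress_sum_change(delta_T, K=K_STEEL, T0=T0):
            """Change in (sigma1 + sigma2), in Pa, from a measured dT in kelvin."""
            return -delta_T / (K * T0)

        print(f"dT = -2 mK -> d(sigma1+sigma2) = {stress_sum_change(-0.002) / 1e6:.1f} MPa")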

  11. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); the coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use (dissolved pressurization gases); detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine potential formation of films); corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  12. SU-F-T-248: FMEA Risk Analysis Implementation (AAPM TG-100) in Total Skin Electron Irradiation Technique

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez-Rosello, B; Bautista-Ballesteros, J; Bonaque, J [Hospital La Fe, Valencia, Valencia (Spain); Perez-Calatayud, J [Hospital La Fe, Valencia, Valencia (Spain); Clinica Benidorm, Benidorm, Alicante (Spain); Gonzalez-Sanchis, A; Lopez-Torrecilla, J; Brualla-Gonzalez, L; Garcia-Hernandez, T; Vicedo-Gonzalez, A; Granero, D; Serrano, A; Borderia, B; Solera, C [Hospital General ERESA, Valencia, Valencia (Spain); Rosello, J [Hospital General ERESA, Valencia, Valencia (Spain); Universidad de Valencia, Valencia, Valencia (Spain)

    2016-06-15

    Purpose: Total Skin Electron Irradiation (TSEI) is a radiotherapy treatment which involves irradiating the entire body surface as homogeneously as possible. It is an extensive multi-step technique in which quality management requires a high consumption of resources and fluid communication between the staff involved, both necessary to improve the safety of the treatment. TG-100 proposes a new perspective on quality management in radiotherapy, presenting a systematic method of risk analysis across the global flow of stages through which the patient passes. The purpose of this work has been to apply the TG-100 approach to the TSEI procedure in our institution. Methods: A multidisciplinary team specifically targeting the TSEI procedure was formed; it met regularly and jointly developed the process map (PM), following the TG-100 guidelines of the AAPM. This PM is a visual representation of the temporal flow of steps through which the patient passes from the start until the end of their stay in the radiotherapy service. Results: This is the first stage of the full risk analysis being carried out at the center. The PM provides an overview of the process and facilitates the understanding of the team members who will participate in the subsequent analysis. Currently, the team is implementing the failure modes and effects analysis (FMEA). The failure modes of each of the steps have been identified, and assessors are individually assigning values for severity (S), frequency of occurrence (O) and lack of detectability (D). To our knowledge, this is the first PM made for TSEI. The developed PM can be useful for those centers that intend to implement the TSEI technique. Conclusion: The PM of the TSEI technique has been established as the first stage of the full risk analysis performed at a reference center for this treatment.

  13. A microhistological technique for analysis of food habits of mycophagous rodents.

    Science.gov (United States)

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze-dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  14. A fractal analysis of skin pigmented lesions using the novel tool of the variogram technique

    Energy Technology Data Exchange (ETDEWEB)

    Mastrolonardo, Mario [Department of Medical and Occupational Sciences, Unit of Dermatology, Azienda Ospedaliero-Universitaria 'Ospedali Riuniti' di Foggia (Italy)]. E-mail: mariomastrolonardo@libero.it; Conte, Elio [Department of Medical and Occupational Sciences, Unit of Dermatology, Azienda Ospedaliero-Universitaria 'Ospedali Riuniti' di Foggia (Italy); Department of Pharmacology and Human Physiology, TIRES-Center for Innovative Technology for Signal Detection and Processing, Bari University, 70100 Bari (Italy); Zbilut, Joseph P. [Department of Molecular Biophysics and Physiology, Rush University, Chicago, IL 60612 (United States)

    2006-06-15

    The incidence of cutaneous malignant melanoma is increasing rapidly in the world [Ferlay J, Bray F, Pisani P, et al. GLOBOCAN 2000: Cancer incidence, mortality and prevalence worldwide, Version 1.0. IARC Cancer Base no. 5. Lyon: IARC Press, 2001]. Choosing the right therapy requires a method with high sensitivity and the capability to diagnose the disease at an early stage. We introduce a new diagnostic method based on non-linear methodologies. In detail, we suggest that fractal as well as noise and chaos dynamics are the most important components responsible for the genetic instability of melanocytes. As a consequence, we introduce the new technique of the variogram and of fractal analysis, extended to whole regions of interest of the skin, in order to obtain parameters able to identify the malignant lesion. In a preliminary analysis, satisfactory results are reached.
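
    The empirical (semi)variogram behind the technique is gamma(h) = (1/2N(h)) * sum[z(x+h) - z(x)]^2, the halved mean squared intensity difference between pixels a lag h apart; how fast it grows with h (for instance, its log-log slope) characterizes the roughness, and hence the fractality, of the lesion texture. A minimal one-direction sketch on a placeholder image:

        import numpy as np

        def empirical_variogram(image, max_lag, axis=1):
            """Directional semivariogram of a 2-D intensity image:
            gamma(h) = half the mean squared increment at pixel lag h."""
            img = np.asarray(image, float)
            gammas = []
            for h in range(1, max_lag + 1):
                diff = img[:, h:] - img[:, :-h] if axis == 1 else img[h:, :] - img[:-h, :]
                gammas.append(0.5 * np.mean(diff ** 2))
            return np.array(gammas)

        rng = np.random.default_rng(5)
        lesion = rng.normal(size=(64, 64)).cumsum(axis=1)   # placeholder "image"
        gamma = empirical_variogram(lesion, max_lag=10)
        print("gamma(h) for h = 1..10:", gamma.round(2))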

  15. VLBI FOR GRAVITY PROBE B. IV. A NEW ASTROMETRIC ANALYSIS TECHNIQUE AND A COMPARISON WITH RESULTS FROM OTHER TECHNIQUES

    International Nuclear Information System (INIS)

    Lebach, D. E.; Ratner, M. I.; Shapiro, I. I.; Bartel, N.; Bietenholz, M. F.; Lederman, J. I.; Ransom, R. R.; Campbell, R. M.; Gordon, D.; Lestrade, J.-F.

    2012-01-01

    When very long baseline interferometry (VLBI) observations are used to determine the position or motion of a radio source relative to reference sources nearby on the sky, the astrometric information is usually obtained via (1) phase-referenced maps or (2) parametric model fits to measured fringe phases or multiband delays. In this paper, we describe a 'merged' analysis technique which combines some of the most important advantages of these other two approaches. In particular, our merged technique combines the superior model-correction capabilities of parametric model fits with the ability of phase-referenced maps to yield astrometric measurements of sources that are too weak to be used in parametric model fits. We compare the results from this merged technique with the results from phase-referenced maps and from parametric model fits in the analysis of astrometric VLBI observations of the radio-bright star IM Pegasi (HR 8703) and the radio source B2252+172 nearby on the sky. In these studies we use central-core components of radio sources 3C 454.3 and B2250+194 as our positional references. We obtain astrometric results for IM Peg with our merged technique even when the source is too weak to be used in parametric model fits, and we find that our merged technique yields astrometric results superior to the phase-referenced mapping technique. We used our merged technique to estimate the proper motion and other astrometric parameters of IM Peg in support of the NASA/Stanford Gravity Probe B mission.

  16. Evaluation of Building Projects Using Earned Value Technique ...

    African Journals Online (AJOL)

    This study evaluates building construction projects using the Earned Value Analysis technique, the Experimental Approach, and Value Concept Analysis. The aim was to compare the cost incurred for an identified amount of work done on a project with the cost budgeted for the same work. The results were used to calculate ...

  17. Application of optimal estimation techniques to FFTF decay heat removal analysis

    International Nuclear Information System (INIS)

    Nutt, W.T.; Additon, S.L.; Parziale, E.A.

    1979-01-01

    The verification and adjustment of plant models for decay heat removal analysis using a mix of engineering judgment and formal techniques from control theory are discussed. The formal techniques facilitate dealing with typical test data which are noisy, redundant and do not measure all of the plant model state variables directly. Two pretest examples are presented. 5 refs

  18. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  19. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen, and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  20. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain, and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many active international projects of the research group, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  1. Techniques to extract physical modes in model-independent analysis of rings

    International Nuclear Information System (INIS)

    Wang, C.-X.

    2004-01-01

    A basic goal of Model-Independent Analysis is to extract the physical modes underlying the beam histories collected at a large number of beam position monitors so that beam dynamics and machine properties can be deduced independent of specific machine models. Here we discuss techniques to achieve this goal, especially the Principal Component Analysis and the Independent Component Analysis.
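
    As a sketch of the principal-component step: stack the turn-by-turn readings into a matrix B (turns x BPMs); after removing the mean (closed) orbit, the singular value decomposition of B yields the temporal and spatial modes, and a single betatron oscillation appears as a pair of comparable singular values (sine- and cosine-like modes). The tune, phase advances and noise level below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(6)
        turns, n_bpm = 1024, 40
        phase = 2 * np.pi * 0.31 * np.arange(turns)       # fractional tune 0.31 (assumed)
        bpm_phase = np.linspace(0, 4 * np.pi, n_bpm)      # phase advance along the ring
        B = np.sin(np.add.outer(phase, bpm_phase))        # ideal betatron signal
        B += 0.05 * rng.normal(size=B.shape)              # monitor noise

        B -= B.mean(axis=0)                               # remove the closed orbit
        U, s, Vt = np.linalg.svd(B, full_matrices=False)  # principal components

        print("Leading singular values:", s[:4].round(1))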

  2. Optimizing the design and operation of reactor emergency systems using reliability analysis techniques

    International Nuclear Information System (INIS)

    Snaith, E.R.

    1975-01-01

    Following a reactor trip, various reactor emergency systems, e.g. essential power supplies, emergency core cooling and boiler feed water arrangements, are required to operate with a high degree of reliability. These systems must therefore be critically assessed to confirm their capability of operation and determine their reliability of performance. The use of probability analysis techniques enables the potential operating reliability of the systems to be calculated, and this can then be compared with the overall reliability requirements. However, a system reliability analysis does much more than calculate an overall reliability value for the system. It establishes the reliability of all parts of the system and thus identifies the most sensitive areas of unreliability. This indicates the areas where any required improvements should be made and enables the overall system designs and modes of operation to be optimized to meet the system, and hence the overall reactor, safety criteria. This paper gives specific examples of sensitive areas of unreliability that were identified as a result of a reliability analysis that was carried out on a reactor emergency core cooling system. Details are given of modifications to design and operation that were implemented, with a resulting improvement in the reliability of various reactor sub-systems. The report concludes that an initial calculation of system reliability should represent only the beginning of a continuing process of system assessment. Data on equipment and system performance, particularly in those areas shown to be sensitive in their effect on the overall nuclear power plant reliability, should be collected and processed to give reliability data. These data should then be applied in further probabilistic analyses and the results correlated with the original analysis. This will demonstrate whether the required and originally predicted system reliability is likely to be achieved, in the light of the actual history to date of
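
    The arithmetic behind such an assessment can be illustrated with textbook series/parallel combinations: redundant units multiply failure probabilities, series arrangements multiply reliabilities, and recomputing the system figure with each part made perfect exposes the most sensitive area of unreliability. All numbers below are placeholders, not values from the assessment described.

        from math import prod

        def series(reliabilities):
            """All units must work."""
            return prod(reliabilities)

        def parallel(reliabilities):
            """The block works if at least one redundant unit works."""
            return 1.0 - prod(1.0 - r for r in reliabilities)

        # Illustrative emergency core cooling train (placeholder figures):
        pump_pair = parallel([0.95, 0.95])        # two redundant pumps
        train = series([pump_pair, 0.99, 0.98])   # pumps + valve + heat exchanger
        print(f"Train reliability: {train:.4f}")

        # Which component's improvement would help most?
        for name, r in [("pumps", pump_pair), ("valve", 0.99), ("heat exch.", 0.98)]:
            print(f"{name}: system reliability if perfect = {train / r:.4f}")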

  3. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  4. An Integrated Approach to Change the Outcome Part II: Targeted Neuromuscular Training Techniques to Reduce Identified ACL Injury Risk Factors

    Science.gov (United States)

    Myer, Gregory D.; Ford, Kevin R.; Brent, Jensen L.; Hewett, Timothy E.

    2014-01-01

    Prior reports indicate that female athletes who demonstrate high knee abduction moments (KAMs) during landing are more responsive to neuromuscular training designed to reduce KAM. Identification of female athletes who demonstrate high KAM, which accurately identifies those at risk for noncontact anterior cruciate ligament (ACL) injury, may be ideal for targeted neuromuscular training. Specific neuromuscular training targeted to the underlying biomechanical components that increase KAM may provide the most efficient and effective training strategy to reduce noncontact ACL injury risk. The purpose of the current commentary is to provide an integrative approach to identify and target mechanistic underpinnings to increased ACL injury in female athletes. Specific neuromuscular training techniques will be presented that address individual algorithm components related to high knee load landing patterns. If these integrated techniques are employed on a widespread basis, prevention strategies for noncontact ACL injury among young female athletes may prove both more effective and efficient. PMID:22580980

  5. Comparative study of macrotexture analysis using X-ray diffraction and electron backscattered diffraction techniques

    International Nuclear Information System (INIS)

    Serna, Marilene Morelli

    2002-01-01

    The macrotexture is one of the main characteristics of metallic materials whose physical properties depend on the crystallographic direction. Until the mid-1980s, the analysis of macrotexture was accomplished only by the techniques of X-ray diffraction and neutron diffraction. The possibility of analyzing the macrotexture using the technique of electron backscatter diffraction in the scanning electron microscope, which allows the measured orientation to be correlated with its location in the microstructure, was a very welcome tool in the field of materials engineering. In this work, the theoretical aspects of the two techniques were studied, and both techniques were used for the analysis of the macrotexture of 1050 and 3003 aluminum sheets with intensities, measured through the texture index 'J', from 2.00 to 5.00. The results obtained by the two techniques were reasonably similar, considering that the statistics of the data obtained by the electron backscatter diffraction technique are much poorer than those obtained by X-ray diffraction. (author)

  6. Development of Electronic Nose and Near Infrared Spectroscopy Analysis Techniques to Monitor the Critical Time in SSF Process of Feed Protein

    Directory of Open Access Journals (Sweden)

    Hui Jiang

    2014-10-01

    Full Text Available In order to assure the consistency of the final product quality, fast and effective process monitoring is a growing need in the solid state fermentation (SSF) industry. This work investigated the potential of non-invasive techniques, combined with chemometric methods, to monitor time-related changes that occur during the SSF process of feed protein. Four fermentation trials were monitored by an electronic nose device and a near infrared spectroscopy (NIRS) spectrometer. Firstly, principal component analysis (PCA) and independent component analysis (ICA) were applied to feature extraction and information fusion, respectively. Then, the BP_AdaBoost algorithm was used to develop the fused model for monitoring the critical time in the SSF process of feed protein. Experimental results showed that the identification results of the fusion model are much better than those of the single-technique models in both the training and validation sets, and the complexity of the fusion model was also lower than that of the single-technique models. The overall results demonstrate a high potential for online monitoring of the critical moment in the SSF process by integrating electronic nose and NIRS techniques, and data fusion from multiple techniques could significantly improve the monitoring performance of the SSF process.

  7. The use of a social network analysis technique to investigate the characteristics of crew communications in nuclear power plants-A feasibility study

    International Nuclear Information System (INIS)

    Park, Jinkyun

    2011-01-01

    Effective and reliable communications are very important in securing the safety of human-involved large process control systems, because human operators have to accomplish their tasks in cooperative ways. This means that it is very important to understand the characteristics of crew communications, which can provide useful insights for preventing inappropriate communications. Unfortunately, in the nuclear industry, a systematic framework that can be used to identify the characteristics of crew communications seems to be rare. For this reason, the applicability of the social network analysis (SNA) technique to identifying the characteristics of crew communications was investigated in this study. To this end, the communication data of operating crews working in the main control room (MCR) of nuclear power plants (NPPs) were collected under two kinds of simulated off-normal conditions. Then the communication characteristics of MCR operating crews, which can be represented by the associated SNA metrics, were compared with communication characteristics that are already known from existing studies. As a result, it was found that SNA metrics could be meaningful for explaining the communication characteristics of MCR operating crews. Accordingly, it is expected that the SNA technique can be used as one of the serviceable tools to investigate the characteristics of crew communications in NPPs. - Highlights: → Communications are very important for the safety of complicated socio-technical systems. → A systematic framework to identify communication characteristics seems to be rare. → The feasibility of the social network analysis (SNA) technique was investigated. → It is expected that SNA metrics are meaningful for explaining communication characteristics.
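
    A small sketch of the kind of SNA metrics involved, assuming the networkx library: represent who-spoke-to-whom counts as a weighted directed graph and compute density and centrality measures. The crew roles and utterance counts below are hypothetical, not data from the study.

        import networkx as nx

        # Hypothetical communication counts in one simulated scenario:
        # (speaker, listener, number of utterances)
        edges = [("SRO", "RO", 25), ("RO", "SRO", 18), ("SRO", "TO", 12),
                 ("TO", "SRO", 9), ("RO", "TO", 4), ("EO", "SRO", 6)]

        G = nx.DiGraph()
        G.add_weighted_edges_from(edges)

        print("Network density:", round(nx.density(G), 2))
        print("Out-degree centrality:",
              {n: round(c, 2) for n, c in nx.out_degree_centrality(G).items()})
        print("Betweenness:",
              {n: round(c, 2)
               for n, c in nx.betweenness_centrality(G, weight="weight").items()})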

  8. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
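
    A minimal sketch of the PLS calibration step, assuming scikit-learn: fit a PLSRegression from spectra to a reference composition value and predict held-out samples. The random spectra and SiO2 values are placeholders, not the paper's data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        spectra = rng.random((18, 500))        # 18 rocks x 500 channels (placeholder)
        sio2 = rng.uniform(40, 75, 18)         # wt% SiO2 reference values (placeholder)

        X_tr, X_te, y_tr, y_te = train_test_split(spectra, sio2,
                                                  test_size=5, random_state=0)
        pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
        print("Predicted wt% SiO2 for held-out rocks:",
              pls.predict(X_te).ravel().round(1))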

  9. Techniques and Emerging Trends for State of the Art Equipment Maintenance Systems—A Bibliometric Analysis

    Directory of Open Access Journals (Sweden)

    Burkhard Hoppenstedt

    2018-06-01

    Full Text Available The increasing interconnection of machines in industrial production on the one hand, and the improved capabilities to store, retrieve, and analyze large amounts of data on the other, offer promising perspectives for maintaining production machines. Recently, predictive maintenance has gained increasing attention in the context of equipment maintenance systems. As opposed to other approaches, predictive maintenance relies on machine behavior models, which offer several advantages. In this highly interdisciplinary field, there is a lack of a literature review of relevant research fields and realization techniques. To obtain a comprehensive overview of the state of the art, large data sets of relevant literature need to be considered and, in the best case, automatically partitioned into relevant research fields. A proper methodology to obtain such an overview is the bibliometric analysis method. In the presented work, we apply a bibliometric analysis to the field of equipment maintenance systems. To be more precise, we analyzed clusters of identified literature with the goal of obtaining deeper insight into the related research fields. Moreover, cluster metrics reveal the importance of a single paper, and an investigation of the temporal cluster development indicates the evolution of research topics. In this context, we introduce a new measure to compare results from different time periods in an appropriate way. In turn, among other things, this simplifies the analysis of topics with a vast number of subtopics. Altogether, the obtained results provide a comprehensive overview of established techniques and emerging trends for equipment maintenance systems.

  10. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs

  11. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.

  12. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment have proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  13. Performance evaluation using bootstrapping DEA techniques: Evidence from industry ratio analysis

    OpenAIRE

    Halkos, George; Tzeremes, Nickolaos

    2010-01-01

    In the Data Envelopment Analysis (DEA) context, financial data/ratios have been used in order to produce a unified performance metric. However, several scholars have indicated that the inclusion of financial ratios creates biased efficiency estimates with implications for firms' and industries' performance evaluation. There have been several DEA formulations and techniques dealing with this problem, including sensitivity analysis, Prior-Ratio-Analysis and DEA/output-input ratio analysis ...

  14. Runoff Potentiality of a Watershed through SCS and Functional Data Analysis Technique

    Directory of Open Access Journals (Sweden)

    M. I. Adham

    2014-01-01

    Full Text Available Runoff potentiality of a watershed was assessed based on curve number (CN) identification, soil conservation service (SCS), and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and analyzed with the lowess method for curve smoothing. As runoff data represent a periodic pattern in each watershed, a Fourier series was introduced to fit the smoothed curve of eight watersheds. Seven terms of the Fourier series were introduced for watersheds 5 and 8, while 8 terms were used for the rest of the watersheds for the best fit of the data. Bootstrapped smooth curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8 have monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, and these watersheds would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling.

  15. Runoff Potentiality of a Watershed through SCS and Functional Data Analysis Technique

    Science.gov (United States)

    Adham, M. I.; Shirazi, S. M.; Othman, F.; Rahman, S.; Yusop, Z.; Ismail, Z.

    2014-01-01

    Runoff potentiality of a watershed was assessed based on curve number (CN) identification, soil conservation service (SCS), and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and analyzed with the lowess method for curve smoothing. As runoff data represent a periodic pattern in each watershed, a Fourier series was introduced to fit the smoothed curve of eight watersheds. Seven terms of the Fourier series were introduced for watersheds 5 and 8, while 8 terms were used for the rest of the watersheds for the best fit of the data. Bootstrapped smooth curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8 have monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, and these watersheds would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling. PMID:25152911
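
    The SCS curve-number relation these records build on is Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, with potential maximum retention S = 25400/CN - 254 (in mm) and the usual initial abstraction Ia = 0.2*S. A direct sketch; the curve numbers and the 50 mm storm are illustrative.

        def scs_runoff(p_mm, cn):
            """SCS curve-number runoff depth (mm), with Ia = 0.2 * S."""
            s = 25400.0 / cn - 254.0            # potential maximum retention, mm
            ia = 0.2 * s                        # initial abstraction
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm - ia + s)

        for cn in (65, 80, 95):                 # illustrative curve numbers
            print(f"CN={cn}: Q = {scs_runoff(50.0, cn):.1f} mm from a 50 mm storm")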

  16. Characterisation of air particulate matter in Klang Valley by neutron activation analysis technique

    International Nuclear Information System (INIS)

    Mohd Suhaimi Hamzah; Shamsiah Abd Rahman; Mohd Khalid Matori; Abd Khalik Wood

    2000-01-01

    Air particulate matter is known to affect human health, impair visibility and contribute to climate change. Studying air particulate matter in terms of particle size and chemical content is very important for indicating the quality of air in a sampling area. Information on the concentrations of important constituents in air particles can be used to identify some of the emission sources which contribute to the pollution problem. The data collected may also be used as a basis for designing a strategy to overcome the air pollution problem in the area. The study involved sampling of airborne dust at two stations, one in Bangi and the other in Kuala Lumpur, using Gent Stack Sampler units. Each sampler is capable of collecting air particles smaller than 2.5 micron (PM 2.5) and between 2.5 and 10 micron on two different filters simultaneously. The filters were measured for their mass, elemental carbon and elemental concentrations using analytical equipment and techniques including a reflectometer and neutron activation analysis. The results of the analysis of samples collected in 1997-1998 are discussed. (author)

  17. Application of cluster analysis to geochemical compositional data for identifying ore-related geochemical anomalies

    Science.gov (United States)

    Zhou, Shuguang; Zhou, Kefa; Wang, Jinlin; Yang, Genfang; Wang, Shanshan

    2017-12-01

    Cluster analysis is a well-known technique that is used to analyze various types of data. In this study, cluster analysis is applied to geochemical data that describe 1444 stream sediment samples collected in northwestern Xinjiang with a sample spacing of approximately 2 km. Three algorithms (the hierarchical, k-means, and fuzzy c-means algorithms) and six data transformation methods (the z-score standardization, ZST; the logarithmic transformation, LT; the additive log-ratio transformation, ALT; the centered log-ratio transformation, CLT; the isometric log-ratio transformation, ILT; and no transformation, NT) are compared in terms of their effects on the cluster analysis of the geochemical compositional data. The study shows that, on the one hand, the ZST does not affect the results of column- or variable-based (R-type) cluster analysis, whereas the other methods, including the LT, the ALT, and the CLT, have substantial effects on the results. On the other hand, the results of the row- or observation-based (Q-type) cluster analysis obtained from the geochemical data after applying NT and the ZST are relatively poor. However, we derive some improved results from the geochemical data after applying the CLT, the ILT, the LT, and the ALT. Moreover, the k-means and fuzzy c-means clustering algorithms are more reliable than the hierarchical algorithm when they are used to cluster the geochemical data. We apply cluster analysis to the geochemical data to explore for Au deposits within the study area, and we obtain a good correlation between the results retrieved by combining the CLT or the ILT with the k-means or fuzzy c-means algorithms and the potential zones of Au mineralization. Therefore, we suggest that the combination of the CLT or the ILT with the k-means or fuzzy c-means algorithms is an effective tool to identify potential zones of mineralization from geochemical data.
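
    A minimal sketch of the combination the study favours, assuming scikit-learn: apply the centered log-ratio (CLR) transform, clr(x) = log(x) - mean(log(x)) per sample, then run k-means. The Dirichlet-sampled compositions are placeholders for the 1444 stream sediment analyses.

        import numpy as np
        from sklearn.cluster import KMeans

        def clr(X):
            """Centered log-ratio transform for compositional rows (all parts > 0)."""
            logX = np.log(X)
            return logX - logX.mean(axis=1, keepdims=True)

        rng = np.random.default_rng(8)
        comp = rng.dirichlet(alpha=2.0 * np.ones(10), size=1444)   # placeholder data
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(clr(comp))
        print("Samples per cluster:", np.bincount(labels))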

  18. Analysis and modification of blue sapphires from Rwanda by ion beam techniques

    International Nuclear Information System (INIS)

    Bootkul, D.; Chaiwai, C.; Tippawan, U.; Wanthanachaisaeng, B.; Intarasiri, S.

    2015-01-01

    Highlights: • Ion beam analysis is an effective method for detecting trace elements. • Ion beam treatment is able to improve the optical and color appearance of blue sapphire from Rwanda. • These alternative methods can be extended to the jewelry industry for large-scale application. - Abstract: Blue sapphire belongs to the corundum (Al_2O_3) group. The gems of this group have long been admired for their beauty and thus command high value. In this study, blue sapphires from Rwanda, which recently entered the Thai gemstone industry, were chosen for investigation. On one hand, we applied Particle Induced X-ray Emission (PIXE), a highly sensitive and precise analytical technique that can identify and quantify trace elements, for chemical analysis of the sapphires. We found that the major element of blue sapphire from Rwanda is Al, with trace elements such as Fe, Ti, Cr, Ga and Mg, as are commonly found in normal blue sapphire. On the other hand, we applied low- and medium-energy ion implantation for color improvement of the sapphires. The large amount of energy transferred during cascade collisions appears to have altered the gems' properties: the blue color of the sapphires was clearly intensified after nitrogen ion bombardment, and the gems also gained transparency and luster. UV–Vis–NIR measurements detected the modification of their absorption properties, consistent with the deepening of the blue color. The mechanism of these modifications is postulated and reported. Overall, bombardment with a nitrogen ion beam is a promising technique for quality improvement of blue sapphire from Rwanda.

  19. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. The approach was simplified by using the MathCAD program as the programming tool, which is nevertheless powerful enough to perform the calculations, plotting and file transfer. (Author)
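
    The Abel inversion step can be sketched numerically as follows. This is a generic discretization of the inversion integral f(r) = -(1/pi) * integral_r^R F'(y)/sqrt(y^2 - r^2) dy, not the authors' MathCAD implementation, and the test profile is hypothetical:

      import numpy as np

      def abel_invert(F, y):
          # recover a radial profile f(r) from a line-integrated projection F(y)
          dF = np.gradient(F, y)
          f = np.zeros_like(F)
          for i, r in enumerate(y[:-1]):
              yy = y[i + 1:]              # start above r to avoid the singularity
              f[i] = -np.trapz(dF[i + 1:] / np.sqrt(yy**2 - r**2), yy) / np.pi
          return f

      # check against a known pair: f(r) = 1 - r^2 projects to F(y) = (4/3)(1 - y^2)^1.5
      y = np.linspace(0.0, 1.0, 400)
      F = (4.0 / 3.0) * (1.0 - y**2) ** 1.5
      print(abel_invert(F, y)[:5])        # approximately 1 - r^2 near r = 0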

  20. Analysis of initial changes in the proteins of soybean root tip under flooding stress using gel-free and gel-based proteomic techniques.

    Science.gov (United States)

    Yin, Xiaojian; Sakata, Katsumi; Nanjo, Yohei; Komatsu, Setsuko

    2014-06-25

    Flooding has a severe negative effect on soybean cultivation in the early stages of growth. To obtain a better understanding of the response mechanisms of soybean to flooding stress, initial changes in root tip proteins under flooding were analyzed using two proteomic techniques. Two-day-old soybeans were treated with flooding for 3, 6, 12, and 24 h. The weight of the soybeans increased during the first 3 h of flooding, but root elongation was not observed. Using gel-based and gel-free proteomic techniques, 115 proteins were identified in root tips, of which 9 proteins were commonly detected by both methods. The 71 proteins identified by the gel-free proteomics were analyzed by a hierarchical clustering method based on induction levels during the flooding, and the proteins were divided into 5 clusters. Additional interaction analysis of the proteins revealed that ten proteins belonging to cluster I formed the center of a protein interaction network. mRNA expression analysis of these ten proteins showed that citrate lyase and heat shock protein 70 were down-regulated, whereas calreticulin was up-regulated in the initial phase of flooding. These results suggest that flooding stress induces calcium-related signal transduction in soybean, which might play important roles in the early responses to flooding.

  1. Analysis of fresh fallout from Chinese tests by beta counting technique

    International Nuclear Information System (INIS)

    Mishra, U.C.; Lalit, B.Y.; Shukla, V.K.; Ramachandran, T.V.

    1979-01-01

    The paper describes the beta counting techniques used in the analysis of fresh radioactive fallout samples from nuclear weapon tests. Fresh fallout samples were collected by swiping the exposed portion of the engine covers of commercial aircraft arriving at Bombay from New York after the Chinese tests of September 26, 1976 and September 17, 1977. Activities of short-lived radionuclides such as Ag 111, Sr 89, Mo 99, U 237 and Np 239 were determined using these techniques. The results of this analysis are discussed briefly in relation to the kind of fissile material, the extent of thermonuclear reaction in the weapon and the mode of detonation. (orig.)

  2. Obesogenic family types identified through latent profile analysis.

    Science.gov (United States)

    Martinson, Brian C; VazquezBenitez, Gabriela; Patnode, Carrie D; Hearst, Mary O; Sherwood, Nancy E; Parker, Emily D; Sirard, John; Pasch, Keryn E; Lytle, Leslie

    2011-10-01

    Obesity may cluster in families due to shared physical and social environments. This study aims to identify family typologies of obesity risk based on family environments. Using 2007-2008 data from 706 parent/youth dyads in Minnesota, we applied latent profile analysis and general linear models to evaluate associations between family typologies and body mass index (BMI) of youth and parents. Three typologies described most families, with 18.8% "Unenriched/Obesogenic," 16.9% "Risky Consumer," and 64.3% "Healthy Consumer/Salutogenic." After adjustment for demographic and socioeconomic factors, parent BMI and youth BMI Z-scores were higher in unenriched/obesogenic families (BMI difference = 2.7) than in the healthy consumer/salutogenic typology. In contrast, parent BMI and youth BMI Z-scores in risky consumer families were similar to those in the healthy consumer/salutogenic type. We can identify family types differing in obesity risks, with implications for public health interventions.
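
    Latent profile analysis is commonly approximated with a Gaussian mixture model over continuous indicators. The sketch below (scikit-learn; the family-environment features are random stand-ins, not the study data) selects the number of profiles by BIC:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      X = rng.normal(size=(706, 5))      # 706 dyads x 5 standardized environment measures

      # fit mixtures with 1..5 profiles and keep the one with the lowest BIC
      best = min(
          (GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
           for k in range(1, 6)),
          key=lambda m: m.bic(X),
      )
      profiles = best.predict(X)         # typology assignment for each family
      print(best.n_components, np.bincount(profiles))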

  3. An Analysis of Pre-Infection Detection Techniques for Botnets and other Malware

    OpenAIRE

    Graham, Mark; Winckles, Adrian

    2014-01-01

    Traditional techniques for detecting malware, such as viruses, worms and rootkits, rely on identifying virus-specific signature definitions within network traffic, applications or memory. Because a sample of malware is required to define an attack signature, signature detection has drawbacks when accounting for malware code mutation, has limited use in zero-day protection and is a post-infection technique requiring malware to be present on a device in order to be detected. A malicious bot...

  4. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    Science.gov (United States)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters or the estimation of all of their uncertainty is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the convergence testing method's independence of the SA method, we applied it to two widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an
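
    For reference, the variance-based Sobol' method cited above can be illustrated with the standard pick-and-freeze (Saltelli) estimator of first-order indices. This sketch (plain NumPy, on the Ishigami test function) is not the MVA convergence test itself:

      import numpy as np

      def ishigami(X, a=7.0, b=0.1):
          return np.sin(X[:, 0]) + a * np.sin(X[:, 1]) ** 2 \
                 + b * X[:, 2] ** 4 * np.sin(X[:, 0])

      rng = np.random.default_rng(42)
      N, d = 100_000, 3
      A = rng.uniform(-np.pi, np.pi, (N, d))
      B = rng.uniform(-np.pi, np.pi, (N, d))
      fA, fB = ishigami(A), ishigami(B)
      V = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]           # freeze all factors except the i-th
          S1 = np.mean(fB * (ishigami(ABi) - fA)) / V
          print(f"S{i + 1} = {S1:.3f}") # analytic values: 0.314, 0.442, 0.0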

  5. BIOELECTRICAL IMPEDANCE VECTOR ANALYSIS IDENTIFIES SARCOPENIA IN NURSING HOME RESIDENTS

    Science.gov (United States)

    Loss of muscle mass and water shifts between body compartments are contributing factors to frailty in the elderly. The body composition changes are especially pronounced in institutionalized elderly. We investigated the ability of single-frequency bioelectrical impedance analysis (BIA) to identify b...

  6. Identifying Students’ Misconceptions on Basic Algorithmic Concepts Through Flowchart Analysis

    NARCIS (Netherlands)

    Rahimi, E.; Barendsen, E.; Henze, I.; Dagienė, V.; Hellas, A.

    2017-01-01

    In this paper, a flowchart-based approach to identifying secondary school students’ misconceptions (in a broad sense) on basic algorithm concepts is introduced. This approach uses student-generated flowcharts as the units of analysis and examines them against plan composition and construct-based

  7. A Comparative Analysis of Uranium Ore using Laser Fluorimetric and gamma Spectrometry Techniques

    International Nuclear Information System (INIS)

    Madbouly, M.; Nassef, M. H.; El-Mongy, S.A.; Diab, A.M.

    2009-01-01

    A developed chemical separation method was used for the analysis of uranium in a standard U-ore (IAEA-RGU-1) by the laser fluorimetric technique. The non-destructive gamma assay technique was also applied to verify and compare the uranium content determined by the laser technique. The results of the uranium analysis obtained by laser fluorimetry were found to be in the range of 360 - 420 μg/g, with an average value of 390 μg/g. The bias between the measured and the certified value does not exceed 9.9%. For gamma-ray spectrometric analysis, the measured uranium contents were found to be in the range of 393.8 - 399.4 μg/g, with an average value of 396.3 μg/g. The percentage difference in the case of γ-assay was 1.6%. In general, the methods of analysis used in this study are applicable for a precise determination of uranium. It can be concluded that laser analysis is preferred for the assay of uranium ore because of the small sample weight required and the reduced sample preparation time and analysis cost.

  8. Analysis of thermal fluctuations in the semiscale tests to determine flow transit delay times using a transfer function cross-correlation technique

    International Nuclear Information System (INIS)

    Raptis, A.C.; Popper, G.F.

    1977-08-01

    On April 14, 1976, EG and G performed the Semiscale Blowdown 29-1 experiment to try to establish the feasibility of using a transit time flowmeter (TTF) to measure transient blowdown two-phase flow rates. The recorded signals from that experiment were made available to and analyzed by the Argonne National Laboratory using the transfer function cross-correlation technique. The theoretical background for the transfer function method of analysis and the results of the data analysis are presented. Histograms of transit time during the blowdown are shown and topics for further investigation are identified
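
    The core of the transit-time idea can be sketched as follows: the delay is estimated as the lag that maximizes the cross-correlation between upstream and downstream fluctuation signals. The signals and sampling rate below are synthetic, not the Argonne transfer-function code:

      import numpy as np

      fs = 1000.0                               # assumed sampling rate, Hz
      t = np.arange(0, 2.0, 1.0 / fs)
      rng = np.random.default_rng(7)
      upstream = rng.normal(size=t.size)
      delay_samples = 37                        # true transit delay: 37 ms
      downstream = np.roll(upstream, delay_samples) + 0.5 * rng.normal(size=t.size)

      xcorr = np.correlate(downstream, upstream, mode="full")
      lag = np.argmax(xcorr) - (t.size - 1)     # lag in samples at the correlation peak
      print(f"estimated delay = {lag / fs * 1e3:.1f} ms")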

  9. Learning a novel technique to identify possible melanomas: are Australian general practitioners better than their U.K. colleagues?

    Directory of Open Access Journals (Sweden)

    Watson Tony

    2009-04-01

    Full Text Available Abstract Background Spectrophotometric intracutaneous analysis (SIAscopy™) is a multispectral imaging technique that is used to identify 'suspicious' (i.e. potentially malignant) pigmented skin lesions for further investigation. The MoleMate™ system is a hand-held scanner that captures SIAscopy™ images that are then classified by the clinician using a computerized diagnostic algorithm designed for the primary health care setting. The objectives of this study were to test the effectiveness of a computer program designed to train health care workers to identify the diagnostic features of SIAscopy™ images and to compare the results of a group of Australian and a group of English general practitioners (GPs). Methods Thirty GPs recruited from the Perth (Western Australia) metropolitan area completed the training program at a workshop held in March 2008. The accuracy and speed of their pre- and post-test scores were then compared with those of a group of 18 GPs (including 10 GP registrars) who completed a similar program at two workshops held in Cambridge (U.K.) in March and April, 2007. Results The median test score of the Australian GPs improved from 79.5% to 86.5% (median increase 5.5%). Conclusion Most of the SIAscopy™ features can be learnt to a reasonable degree of accuracy with this brief computer training program. Although the Australian GPs scored higher in the pre-test, both groups had similar levels of accuracy and speed in interpreting the SIAscopy™ features after completing the program. Scores were not affected by previous dermoscopy experience or dermatology training, which suggests that the MoleMate™ system is relatively easy to learn.

  10. Genome-wide meta-analysis identifies new susceptibility loci for migraine.

    Science.gov (United States)

    Anttila, Verneri; Winsvold, Bendik S; Gormley, Padhraig; Kurth, Tobias; Bettella, Francesco; McMahon, George; Kallela, Mikko; Malik, Rainer; de Vries, Boukje; Terwindt, Gisela; Medland, Sarah E; Todt, Unda; McArdle, Wendy L; Quaye, Lydia; Koiranen, Markku; Ikram, M Arfan; Lehtimäki, Terho; Stam, Anine H; Ligthart, Lannie; Wedenoja, Juho; Dunham, Ian; Neale, Benjamin M; Palta, Priit; Hamalainen, Eija; Schürks, Markus; Rose, Lynda M; Buring, Julie E; Ridker, Paul M; Steinberg, Stacy; Stefansson, Hreinn; Jakobsson, Finnbogi; Lawlor, Debbie A; Evans, David M; Ring, Susan M; Färkkilä, Markus; Artto, Ville; Kaunisto, Mari A; Freilinger, Tobias; Schoenen, Jean; Frants, Rune R; Pelzer, Nadine; Weller, Claudia M; Zielman, Ronald; Heath, Andrew C; Madden, Pamela A F; Montgomery, Grant W; Martin, Nicholas G; Borck, Guntram; Göbel, Hartmut; Heinze, Axel; Heinze-Kuhn, Katja; Williams, Frances M K; Hartikainen, Anna-Liisa; Pouta, Anneli; van den Ende, Joyce; Uitterlinden, Andre G; Hofman, Albert; Amin, Najaf; Hottenga, Jouke-Jan; Vink, Jacqueline M; Heikkilä, Kauko; Alexander, Michael; Muller-Myhsok, Bertram; Schreiber, Stefan; Meitinger, Thomas; Wichmann, Heinz Erich; Aromaa, Arpo; Eriksson, Johan G; Traynor, Bryan; Trabzuni, Daniah; Rossin, Elizabeth; Lage, Kasper; Jacobs, Suzanne B R; Gibbs, J Raphael; Birney, Ewan; Kaprio, Jaakko; Penninx, Brenda W; Boomsma, Dorret I; van Duijn, Cornelia; Raitakari, Olli; Jarvelin, Marjo-Riitta; Zwart, John-Anker; Cherkas, Lynn; Strachan, David P; Kubisch, Christian; Ferrari, Michel D; van den Maagdenberg, Arn M J M; Dichgans, Martin; Wessman, Maija; Smith, George Davey; Stefansson, Kari; Daly, Mark J; Nyholt, Dale R; Chasman, Daniel; Palotie, Aarno

    2013-08-01

    Migraine is the most common brain disorder, affecting approximately 14% of the adult population, but its molecular mechanisms are poorly understood. We report the results of a meta-analysis across 29 genome-wide association studies, including a total of 23,285 individuals with migraine (cases) and 95,425 population-matched controls. We identified 12 loci associated with migraine susceptibility (P<5×10(-8)). Five loci are new: near AJAP1 at 1p36, near TSPAN2 at 1p13, within FHL5 at 6q16, within C7orf10 at 7p14 and near MMP16 at 8q21. Three of these loci were identified in disease subgroup analyses. Brain tissue expression quantitative trait locus analysis suggests potential functional candidate genes at four loci: APOA1BP, TBC1D7, FUT9, STAT6 and ATP5B.

  11. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to ensure that the fault tree technique is not used beyond its correct validity range. To this end, a critical review of the mathematical foundations of reliability fault tree analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of an integrated software package (SALP-MP, SALP-NOT, SALP-CAFT codes) for the analysis of a wide class of systems.

  12. Sensitivity analysis techniques applied to a system of hyperbolic conservation laws

    International Nuclear Information System (INIS)

    Weirs, V. Gregory; Kamm, James R.; Swiler, Laura P.; Tarantola, Stefano; Ratto, Marco; Adams, Brian M.; Rider, William J.; Eldred, Michael S.

    2012-01-01

    Sensitivity analysis comprises techniques to quantify the effects of the input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique by which to understand the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol' method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations are available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing sample sizes and under an increasing number of meta-model evaluations. - Highlights: ► Sensitivity analysis techniques for a model shock physics problem are compared. ► The model problem and the sensitivity analysis problem have exact solutions. ► Subtle details of the method for computing sensitivity indices can affect the results.

  13. Short communication: cheminformatics analysis to identify predictors of antiviral drug penetration into the female genital tract.

    Science.gov (United States)

    Thompson, Corbin G; Sedykh, Alexander; Nicol, Melanie R; Muratov, Eugene; Fourches, Denis; Tropsha, Alexander; Kashuba, Angela D M

    2014-11-01

    The exposure of oral antiretroviral (ARV) drugs in the female genital tract (FGT) is variable and almost unpredictable. Identifying an efficient method to find compounds with high tissue penetration would streamline the development of regimens for both HIV preexposure prophylaxis and viral reservoir targeting. Here we describe the cheminformatics investigation of diverse drugs with known FGT penetration using cluster analysis and quantitative structure-activity relationships (QSAR) modeling. A literature search over the 1950-2012 period identified 58 compounds (including 21 ARVs and representing 13 drug classes) associated with their actual concentration data for cervical or vaginal tissue, or cervicovaginal fluid. Cluster analysis revealed significant trends in the penetrative ability for certain chemotypes. QSAR models to predict genital tract concentrations normalized to blood plasma concentrations were developed with two machine learning techniques utilizing drugs' molecular descriptors and pharmacokinetic parameters as inputs. The QSAR model with the highest predictive accuracy had R(2)test=0.47. High volume of distribution, high MRP1 substrate probability, and low MRP4 substrate probability were associated with FGT concentrations ≥1.5-fold plasma concentrations. However, due to the limited FGT data available, prediction performances of all models were low. Despite this limitation, we were able to support our findings by correctly predicting the penetration class of rilpivirine and dolutegravir. With more data to enrich the models, we believe these methods could potentially enhance the current approach of clinical testing.

  14. Characterization of exposure to extremely low frequency magnetic fields using multidimensional analysis techniques.

    Science.gov (United States)

    Verrier, A; Souques, M; Wallet, F

    2005-05-01

    Our lack of knowledge about the biological mechanisms of 50 Hz magnetic fields makes it hard to improve exposure assessment. To provide better information about these exposure measures, we use multidimensional analysis techniques to examine the relations between different exposure metrics for a group of subjects. We used a two-stage Principal Component Analysis (PCA) followed by an ascending hierarchical classification (AHC) to identify a set of measures that would capture the characteristics of the total exposure. This analysis gives an indication of the aspects of the exposure that are important to capture in order to get a complete picture of the magnetic field environment. We calculated 44 metrics of exposure measures from 16 exposed EDF employees and 15 control subjects, a data set containing approximately 20,000 recordings of magnetic field measurements, taken every 30 s for 7 days with an EMDEX II dosimeter. These metrics included parameters used routinely or occasionally and some that were new. To eliminate those that expressed the least variability and that were most highly correlated with one another, we began with an initial Principal Component Analysis (PCA). A second PCA of the remaining 12 metrics enabled us to account for 82.7% of the variance in its first two components: the first component (62.0%) was characterized by central tendency metrics, and the second (20.7%) by dispersion characteristics. We were able to use AHC to divide the entire sample of individuals into four groups according to the axes that emerged from the PCA. Finally, discriminant analysis tested the discriminant power of the variables in the exposed/control classification as well as those from the AHC classification. The first showed that two subjects had been incorrectly classified, while no classification error was observed in the second. This exploratory study underscores the need to improve exposure measures by using at least two dimensions: intensity and dispersion. It also indicates the
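
    A minimal sketch of the PCA-followed-by-AHC workflow, assuming scikit-learn and SciPy and using a random stand-in for the 31-subject by 12-metric matrix, might read:

      import numpy as np
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(3)
      metrics = rng.normal(size=(31, 12))                 # subjects x exposure metrics

      scores = PCA(n_components=2).fit_transform(metrics) # intensity + dispersion axes
      tree = linkage(scores, method="ward")               # ascending hierarchical classification
      groups = fcluster(tree, t=4, criterion="maxclust")  # cut the tree into four groups
      print(np.bincount(groups)[1:])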

  15. Identifying modes of large whispering-gallery mode resonators from the spectrum and emission pattern

    DEFF Research Database (Denmark)

    Schunk, Gerhard; Fuerst, Josef U.; Förtsch, Michael

    2014-01-01

    Identifying the mode numbers in whispering-gallery mode resonators (WGMRs) is important for tailoring them to experimental needs. Here we report on a novel experimental mode analysis technique based on the combination of frequency analysis and far-field imaging for high mode numbers of large WGMR...

  16. Application of the failure modes and effects analysis technique to the emergency cooling system of an experimental nuclear power plant

    International Nuclear Information System (INIS)

    Conceicao Junior, Osmar

    2009-01-01

    This study consists of the application of Failure Modes and Effects Analysis (FMEA), a hazard identification and risk assessment technique, to the Emergency Cooling System (ECS) of an experimental nuclear power plant, which is responsible for mitigating the consequences of an eventual loss of coolant accident in the Pressurized Water Reactor (PWR). The analysis intends to identify possible weaknesses in the design of the system and propose improvements in order to maximize its reliability. To achieve this goal, a detailed study of the system was carried out (through its technical documentation), the corresponding reliability block diagram was obtained, the FMEA was executed and, finally, some suggestions were presented. (author)

  17. ATHEANA: A Technique for Human Error Analysis: An Overview of Its Methodological Basis

    International Nuclear Information System (INIS)

    Wreathall, John; Ramey-Smith, Ann

    1998-01-01

    The U.S. NRC has developed a new human reliability analysis (HRA) method, called A Technique for Human Event Analysis (ATHEANA), to provide a way of modeling the so-called 'errors of commission' - that is, situations in which operators terminate or disable engineered safety features (ESFs) or similar equipment during accident conditions, thereby putting the plant at an increased risk of core damage. In its reviews of operational events, NRC has found that these errors of commission occur with a relatively high frequency (as high as 2 or 3 per year), but are noticeably missing from the scope of most current probabilistic risk assessments (PRAs). This new method was developed through a formalized approach that describes what can occur when operators behave rationally but have inadequate knowledge or poor judgement. In particular, the method is based on models of decision-making and response planning that have been used extensively in the aviation field, and on the analysis of major accidents in both the nuclear and non-nuclear fields. Other papers at this conference present summaries of these event analyses in both the nuclear and non-nuclear fields. This paper presents an overview of ATHEANA and summarizes how the method structures the analysis of operationally significant events, and helps HRA analysts identify and model potentially risk-significant errors of commission in plant PRAs. (authors)

  18. Identifying clinical course patterns in SMS data using cluster analysis

    DEFF Research Database (Denmark)

    Kent, Peter; Kongsted, Alice

    2012-01-01

    ABSTRACT: BACKGROUND: Recently, there has been interest in using the short message service (SMS or text messaging) to gather frequent information on the clinical course of individual patients. One possible role for identifying clinical course patterns is to assist in exploring clinically important ... showed that clinical course patterns can be identified by cluster analysis using all SMS time points as cluster variables. This method is simple, intuitive and does not require a high level of statistical skill. However, there are alternative ways of managing SMS data and many different methods

  19. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.

  20. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Identifying the tundra-forest border in the stomate record: an analysis of lake surface samples from the Yellowknife area, Northwest Territories, Canada

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, B.C.S. [Minnesota Univ., Minneapolis, MN (United States). Limnological Research Center; MacDonald, G.M. [California Univ., Los Angeles, CA (United States). Dept. of Botanical Sciences; Moser, K.A. [McMaster Univ., Hamilton, ON (Canada)

    1996-05-01

    The relationship between conifer stomata and existing vegetation across the tundra, forest-tundra, and closed forest zones in the Yellowknife area of the Northwest Territories was studied. Conifer stomata were identified in surface samples from lakes in the treeline zone, but were absent in samples from tundra lakes. Stomate analysis was recorded and the results were presented in a concentration diagram plotting stomate concentrations according to vegetation zone. Conifer stomate analysis was not able to resolve differences between forest-tundra and closed forest. Nevertheless, it was suggested that stomate analysis will become an important technique supplementing pollen analysis for reconstructing past tree-line changes, since the presence of stomata in lakes makes it possible to separate the tundra from forest-tundra and closed forest. The limited dispersal of conifer stomata permitted a better resolution of tree-line boundaries than did pollen. 13 refs., 3 figs.

  2. Isotope techniques to identify recharge areas of springs for rainwater harvesting in the mountainous region of Gaucher area, Chamoli district, Uttarakhand

    International Nuclear Information System (INIS)

    Shivanna, K.; Tirumalesh, K.; Noble, J.; Joseph, T.B.; Singh, Gursharan; Joshi, A.P.; Khati, V.S.

    2008-01-01

    Environmental isotope techniques have been employed to identify the recharge areas of springs in India, in order to construct artificial recharge structures for rainwater harvesting and groundwater augmentation for their rejuvenation. A model project was taken up for this purpose in the mountainous region of the Gaucher area, Chamoli District, Uttarakhand. The springs in this region are seasonal and are derived from seepage waters flowing through the shallow weathered and fractured zone. The chemistry of high-altitude springs is similar to that of precipitation, whereas water-rock interactions contribute to increased mineralization in low-altitude springs. The stable isotopic variation in precipitation suggests that the altitude effect for the Gaucher area is -0.55‰ for δ18O and -3.8‰ for δ2H per 100 m rise in altitude. Based on local geology, geomorphology, hydrochemistry and isotope information, the possible recharge areas inferred for valleys 1, 2 and 3 are located at altitudes of 1250, 1330 and 1020 m amsl respectively. Water conservation and recharge structures such as subsurface dykes, check bunds and contour trenches were constructed at the identified recharge areas in the respective valleys for controlling the subsurface flow, rainwater harvesting and groundwater augmentation. As a result, during and after the following monsoon, the discharge rates of the springs not only increased significantly but the springs also did not dry up even during the dry period. The study shows that isotope techniques can be used effectively to identify recharge areas of springs in the Himalayan region. It also demonstrates the advantage of isotope techniques over conventional methods. (author)
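
    The quoted altitude effect can be inverted to estimate a spring's recharge elevation from its isotopic composition. In the toy calculation below, only the -0.55‰ per 100 m gradient comes from the abstract; the reference station and spring values are hypothetical:

      # invert the linear delta-18O vs. altitude relation for a recharge elevation
      grad = -0.55 / 100.0             # per-mil delta-18O change per metre of altitude
      ref_alt, ref_d18o = 800.0, -8.0  # hypothetical reference station (m amsl, per mil)
      spring_d18o = -10.5              # hypothetical spring water composition, per mil

      recharge_alt = ref_alt + (spring_d18o - ref_d18o) / grad
      print(f"estimated recharge altitude ~ {recharge_alt:.0f} m amsl")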

  3. Uranium solution mining cost estimating technique: means for rapid comparative analysis of deposits

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    Twelve graphs provide a technique for determining relative cost ranges for uranium solution mining projects. The use of the technique can provide a consistent framework for rapid comparative analysis of various properties of mining situations. The technique is also useful to determine the sensitivities of cost figures to incremental changes in mining factors or deposit characteristics

  4. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    Directory of Open Access Journals (Sweden)

    Peeyush Sahay

    2009-10-01

    Full Text Available Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high sensitivity and high selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc., are commercially available. This review presents an update on the latest developments in laser-based breath analysis.

  5. Analysis and interpretation of dynamic FDG PET oncological studies using data reduction techniques

    Directory of Open Access Journals (Sweden)

    Santos Andres

    2007-10-01

    Full Text Available Abstract Background Dynamic positron emission tomography studies produce a large amount of image data, from which clinically useful parametric information can be extracted using tracer kinetic methods. Data reduction methods can facilitate the initial interpretation and visual analysis of these large image sequences and at the same time can preserve important information and allow for basic feature characterization. Methods We have applied principal component analysis to provide high-contrast parametric image sets of lower dimensions than the original data set, separating structures based on their kinetic characteristics. Our method has the potential to constitute an alternative quantification method, independent of any kinetic model, and is particularly useful when the retrieval of the arterial input function is complicated. In independent component analysis images, structures that have different kinetic characteristics are assigned opposite values, and are readily discriminated. Furthermore, novel similarity mapping techniques are proposed, which can summarize in a single image the temporal properties of the entire image sequence according to a reference region. Results Using our new cubed sum coefficient similarity measure, we have shown that structures with similar time activity curves can be identified, thus facilitating the detection of lesions that are not easily discriminated using the conventional method employing standardized uptake values.
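
    A generic similarity-mapping sketch along these lines compares each voxel's time-activity curve with a reference-region curve. The example below uses Pearson correlation on synthetic data rather than the authors' cubed sum coefficient measure:

      import numpy as np

      rng = np.random.default_rng(5)
      frames = rng.random((20, 64, 64))  # synthetic dynamic sequence: time x rows x cols
      # reference-region time-activity curve (small patch chosen arbitrarily)
      ref_tac = frames[:, 30:34, 30:34].mean(axis=(1, 2))

      F = frames.reshape(20, -1)
      F0 = F - F.mean(axis=0)
      r0 = ref_tac - ref_tac.mean()
      sim = (F0 * r0[:, None]).sum(axis=0) / (
          np.linalg.norm(F0, axis=0) * np.linalg.norm(r0) + 1e-12
      )
      similarity_map = sim.reshape(64, 64)  # one image summarizing the whole sequence
      print(similarity_map[30, 30])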

  6. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  7. Using a behaviour change techniques taxonomy to identify active ingredients within trials of implementation interventions for diabetes care.

    Science.gov (United States)

    Presseau, Justin; Ivers, Noah M; Newham, James J; Knittle, Keegan; Danko, Kristin J; Grimshaw, Jeremy M

    2015-04-23

    Methodological guidelines for intervention reporting emphasise describing intervention content in detail. Despite this, systematic reviews of quality improvement (QI) implementation interventions continue to be limited by a lack of clarity and detail regarding the intervention content being evaluated. We aimed to apply the recently developed Behaviour Change Techniques Taxonomy version 1 (BCTTv1) to trials of implementation interventions for managing diabetes to assess the capacity and utility of this taxonomy for characterising active ingredients. Three psychologists independently coded a random sample of 23 trials of healthcare system, provider- and/or patient-focused implementation interventions from a systematic review that included 142 such studies. Intervention content was coded using the BCTTv1, which describes 93 behaviour change techniques (BCTs) grouped within 16 categories. We supplemented the generic coding instructions within the BCTTv1 with decision rules and examples from this literature. Less than a quarter of possible BCTs within the BCTTv1 were identified. For implementation interventions targeting providers, the most commonly identified BCTs included the following: adding objects to the environment, prompts/cues, instruction on how to perform the behaviour, credible source, goal setting (outcome), feedback on outcome of behaviour, and social support (practical). For implementation interventions also targeting patients, the most commonly identified BCTs included the following: prompts/cues, instruction on how to perform the behaviour, information about health consequences, restructuring the social environment, adding objects to the environment, social support (practical), and goal setting (behaviour). The BCTTv1 mapped well onto implementation interventions directly targeting clinicians and patients and could also be used to examine the impact of system-level interventions on clinician and patient behaviour. The BCTTv1 can be used to characterise

  8. Evaluation of syngas production unit cost of bio-gasification facility using regression analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Yangyang; Parajuli, Prem B.

    2011-08-10

    Evaluation of the economic feasibility of a bio-gasification facility requires an understanding of its unit cost under different production capacities. The objective of this study was to evaluate the unit cost of syngas production at capacities from 60 through 1800 Nm3/h using an economic model with three regression analysis techniques (simple regression, reciprocal regression, and log-log regression). The preliminary result of this study showed that the reciprocal regression analysis technique gave the best-fit curve between per-unit cost and production capacity, with a sum of error squares (SES) lower than 0.001 and a coefficient of determination (R2) of 0.996. The regression analysis techniques determined a minimum unit cost of syngas production for micro-scale bio-gasification facilities of $0.052/Nm3, under the capacity of 2,880 Nm3/h. The results of this study suggest that to reduce cost, facilities should run at a high production capacity. In addition, this technique could contribute a new categorical criterion for evaluating micro-scale bio-gasification facilities from the perspective of economic analysis.
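
    The reciprocal regression form, unit_cost = a + b/capacity, reduces to ordinary least squares on 1/capacity. A small sketch with made-up sample points:

      import numpy as np

      capacity = np.array([60.0, 120, 300, 600, 900, 1200, 1800])            # Nm3/h
      unit_cost = np.array([0.35, 0.21, 0.12, 0.085, 0.072, 0.065, 0.058])   # $/Nm3

      b, a = np.polyfit(1.0 / capacity, unit_cost, deg=1)  # slope b, intercept a
      pred = a + b / capacity
      ss_res = np.sum((unit_cost - pred) ** 2)
      ss_tot = np.sum((unit_cost - unit_cost.mean()) ** 2)
      print(f"a = {a:.3f}, b = {b:.1f}, R^2 = {1 - ss_res / ss_tot:.3f}")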

  9. A Comparative Analysis of Techniques for PAPR Reduction of OFDM Signals

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2014-06-01

    Full Text Available In this paper the problem of high Peak-to-Average Power Ratio (PAPR) in Orthogonal Frequency-Division Multiplexing (OFDM) signals is studied. Besides describing three techniques for PAPR reduction, SeLective Mapping (SLM), Partial Transmit Sequence (PTS) and Interleaving, a detailed analysis of the performance of these techniques for various values of the relevant parameters (number of phase sequences, number of interleavers, number of phase factors, number of subblocks, depending on the applied technique) is carried out. Simulation of these techniques is run in Matlab software. Results are presented in the form of Complementary Cumulative Distribution Function (CCDF) curves for the PAPR of 30000 randomly generated OFDM symbols. Simulations are performed for OFDM signals with 32 and 256 subcarriers, oversampled by a factor of 4. A detailed comparison of these techniques is made based on the Matlab simulation results.
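
    The PAPR and its CCDF can be reproduced in a few lines (shown here in Python rather than the paper's Matlab). The subcarrier count and oversampling factor follow the abstract, while the symbol count is reduced from 30000 to keep memory modest:

      import numpy as np

      rng = np.random.default_rng(9)
      N, L, n_sym = 256, 4, 5_000
      bits = rng.integers(0, 4, (n_sym, N))
      qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))   # unit-energy QPSK symbols

      # oversample by a factor of L via zero-padding in the frequency domain
      spectrum = np.zeros((n_sym, L * N), dtype=complex)
      spectrum[:, :N // 2] = qpsk[:, :N // 2]              # positive-frequency bins
      spectrum[:, -(N // 2):] = qpsk[:, N // 2:]           # negative-frequency bins
      x = np.fft.ifft(spectrum, axis=1)

      p = np.abs(x) ** 2
      papr_db = 10 * np.log10(p.max(axis=1) / p.mean(axis=1))
      for p0 in (6, 8, 10):
          print(f"P(PAPR > {p0} dB) = {np.mean(papr_db > p0):.4f}")  # CCDF samples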

  10. Advanced analysis technique for the evaluation of linear alternators and linear motors

    Science.gov (United States)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  11. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors mentioned. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  12. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors mentioned. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  13. Identifying Pre-Seismic TIR Anomalies: A Long-Term (2004-2015) RST Analysis Over the Turkish Area

    Science.gov (United States)

    Perrone, A.; Tramutoli, V.; Corrado, A.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2017-12-01

    Since the eighties, fluctuations of the Earth's thermally emitted radiation, measured by satellite sensors operating in the thermal infrared (TIR) spectral range (i.e. 10-12 µm), have been associated with the complex process of earthquake preparation. Several theories have been proposed to explain their origin and their space-time evolution. In this paper, the Earth's emitted radiation in the thermal infrared spectral region is considered for its possible correlation with M≥4 earthquakes that occurred in Turkey between 2004 and 2015. The Robust Satellite Technique (RST) and the RETIRA (Robust Estimator of TIR Anomalies) index were used to preliminarily define, and then to identify, Significant Sequences of TIR Anomalies (SSTAs) in 12 years (1 April 2004 - 31 October 2015) of daily TIR images acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite. The performed analysis shows that more than 67% of all identified SSTAs occur in the pre-fixed space-time window around the occurrence time and location of earthquakes (M≥4), with a false-positive rate smaller than 33%. Moreover, Molchan error diagram analysis gave a clear indication that such a correlation is non-random in comparison with the random guess function. Notwithstanding the large number of missed events due to frequent space/time data gaps produced by the presence of clouds over the scene, the achieved results, and particularly the low rate of false positives registered over such a long testing period, seem sufficient (at least) to qualify TIR anomalies (identified by the RST approach and the RETIRA index) among the parameters to be considered in the framework of a multi-parametric approach to time-Dependent Assessment of Seismic Hazard (t-DASH).

  14. Metabolomic analysis using porcine skin: a pilot study of analytical techniques

    OpenAIRE

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-01-01

    Background: Metabolic byproducts serve as indicators of the underlying chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples would serve as a critical foundation for this field but have not been developed. Objectives: We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. ...

  15. BENCHMARKING - PRACTICAL TOOLS IDENTIFY KEY SUCCESS FACTORS

    Directory of Open Access Journals (Sweden)

    Olga Ju. Malinina

    2016-01-01

    Full Text Available The article gives a practical example of the application of benchmarking techniques. The object of study is the fashion store of the company «H&M» (Hennes & Mauritz), located in the shopping center «Gallery», Krasnodar. The purpose of this article is to identify the best ways to develop a fashionable brand clothing store Hennes & Mauritz on the basis of benchmarking techniques. On the basis of the conducted market research, a comparative analysis of the data from different perspectives is carried out. The result of the author’s study is a generalization of the findings and the development of the key success factors that will allow successful trading activities to be planned in the future, based on the best experience of competitors.

  16. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory, so the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that the proposed approach of applying PCA and LDA to normalized datasets improves the performance significantly for face images with large illumination variations.
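
    The first technique, log-domain DCT truncation, can be sketched as follows (SciPy; the cutoff block size is an arbitrary choice, not the paper's tuned value):

      import numpy as np
      from scipy.fft import dctn, idctn

      def dct_illum_normalize(img, cutoff=4):
          # illumination is assumed low frequency; work in the log domain
          logimg = np.log1p(img.astype(float))
          C = dctn(logimg, norm="ortho")
          dc = C[0, 0]
          C[:cutoff, :cutoff] = 0.0      # drop low-frequency (illumination) terms
          C[0, 0] = dc                   # restore overall brightness
          return idctn(C, norm="ortho")  # normalized image, still in the log domain

      face = np.random.default_rng(2).random((64, 64)) * 255  # stand-in for a face image
      print(dct_illum_normalize(face).shape)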

  17. Correlation between histological outcome and surgical cartilage repair technique in the knee: A meta-analysis.

    Science.gov (United States)

    DiBartola, Alex C; Everhart, Joshua S; Magnussen, Robert A; Carey, James L; Brophy, Robert H; Schmitt, Laura C; Flanigan, David C

    2016-06-01

    To compare histological outcomes after microfracture (MF), autologous chondrocyte implantation (ACI), and osteochondral autograft transfer (OATS), a literature review was conducted using PubMed MEDLINE, SCOPUS, Cumulative Index for Nursing and Allied Health Literature (CINAHL), and the Cochrane Collaboration Library. Inclusion criteria were limited to English-language studies reporting International Cartilage Repair Society (ICRS) grading criteria for cartilage analysis after ACI (autologous chondrocyte implantation), MF (microfracture), or OATS (osteochondral autografting) repair techniques. Thirty-three studies investigating 1511 patients were identified. Thirty evaluated ACI or one of its subtypes, six evaluated MF, and seven evaluated OATS. There was no evidence of publication bias (Begg's p=0.48). No statistically significant correlation was found between percent change in clinical outcome and percent of biopsies showing ICRS Excellent scores (R(2)=0.05, p=0.38). Percent change in clinical outcome and percent of biopsies showing only hyaline cartilage were significantly associated (R(2)=0.24, p=0.024). Mean lesion size and histological outcome were not correlated based either on percent ICRS Excellent (R(2)=0.03, p=0.50) or percent hyaline cartilage only (R(2)=0.01, p=0.67). Most common lesion location and histological outcome were likewise not correlated based either on percent ICRS Excellent (R(2)=0.03, p=0.50) or percent hyaline cartilage only (R(2)=0.01, p=0.67). Microfracture has poorer histologic outcomes than other cartilage repair techniques. OATS repairs are primarily comprised of hyaline cartilage, followed closely by cell-based techniques, but no significant difference in cartilage quality was found using ICRS grading criteria among OATS, ACI-C, MACI, and ACI-P. Level of evidence: IV, meta-analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis, examined based on correlation techniques. The usage of mobile phones is clearly almost unavoidable these days, and as such the authors have made a systematic survey, through a well-prepared questionnaire, on making use of mobile phones to the maximum extent. The samples cover various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.

  19. Comparative Analysis and Modification of Imaging Techniques in the Parametric Studies of Control Systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2017-01-01

    Full Text Available The article considers practical application aspects of various imaging techniques for the parametric analysis of control systems. Parametric analysis is interpreted as a multivariate analysis aimed at studying the influence of external and internal parameters of the system on the quality of its functioning, determined by direct and indirect quality criteria. The ultimate goal is to identify regions in the parameter space that provide an appropriate quality of the system. It is noted that visualization is a very important task-supporting aid for a designer making decisions. It is stressed that the problem of parametric studies, in most cases, intersects with the major problem of inconsistency between separate partial criteria, i.e., the problem of multi-criteria optimization (MCO). Therefore, the aim of the article is to solve the joint task of visualization and multi-criteria optimization. The article considers traditional types of visualization in deterministic tasks (3-D graphics, contour plots, gradient fields, Andrews curves, parallel coordinates), as well as methods used in statistics (graph matrices, etc.). Testing these methods on the practical task of studying a double-circuit stabilization system allowed requirements to be formulated for imaging techniques with multiple criteria. A new perspective is provided on using the traditional means to significantly enhance the information capacity of imaging. A generic method of visualization is represented as a three-phase study consistent with the task of finding compromise solutions. The first phase of the research involves identifying the monotony and contra-monotony domains. The second phase is aimed at identifying a region of compromise, or a weak Pareto optimum. The third phase involves the search for a consistent optimum (a strong Pareto optimum).
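
    A minimal sketch of the compromise-region search (phases two and three), assuming numpy; the two-parameter system and the two quality criteria are invented stand-ins, and real use would plot the flagged points with the visualization techniques listed above.

    ```python
    import numpy as np

    def pareto_mask(F):
        """True for rows of the criteria matrix F that no other row dominates."""
        n = F.shape[0]
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            # Rows at least as good everywhere and strictly better somewhere.
            dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
            keep[i] = not dominates_i.any()
        return keep

    # Hypothetical two-parameter grid and two competing criteria (minimized).
    p1, p2 = np.meshgrid(np.linspace(0.1, 2, 40), np.linspace(0.1, 2, 40))
    f1 = (p1 - 1.0) ** 2 + 0.2 * p2        # e.g. an overshoot-like criterion
    f2 = 1.0 / p1 + (p2 - 0.5) ** 2        # e.g. a settling-time-like criterion
    F = np.column_stack([f1.ravel(), f2.ravel()])
    front = pareto_mask(F)                 # points in the compromise region
    print(front.sum(), "Pareto-optimal parameter combinations")
    ```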

  20. Different techniques of multispectral data analysis for vegetation fraction retrieval

    Science.gov (United States)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time, it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates, and dominant wavelength). The objective is to reveal their potential, accuracy, and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and the various spectral estimators.
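
    As a hedged illustration of the first technique, the sketch below solves the two-endmember (soil/vegetation) linear mixture model r = f·e_veg + (1 − f)·e_soil for the canopy cover fraction f by least squares; the band reflectances are invented.

    ```python
    import numpy as np

    def vegetation_fraction(pixel, e_veg, e_soil):
        """Least-squares fraction of the vegetation endmember, clipped to [0, 1]."""
        d = e_veg - e_soil
        f = np.dot(pixel - e_soil, d) / np.dot(d, d)
        return float(np.clip(f, 0.0, 1.0))

    e_soil = np.array([0.10, 0.15, 0.20, 0.25])   # hypothetical band reflectances
    e_veg  = np.array([0.05, 0.08, 0.04, 0.45])   # red-edge jump in the last band
    pixel  = 0.3 * e_veg + 0.7 * e_soil           # synthetic pixel with 30% cover
    print(vegetation_fraction(pixel, e_veg, e_soil))  # ~0.3
    ```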

  1. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainty in the analysis of environmental samples, and it is usually laborious, costly, time-consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedures, and applications of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  2. Current applications of miniaturized chromatographic and electrophoretic techniques in drug analysis.

    Science.gov (United States)

    Aturki, Zeineb; Rocco, Anna; Rocchi, Silvia; Fanali, Salvatore

    2014-12-01

    In the last decade, miniaturized separation techniques have become very popular in pharmaceutical analysis. Miniaturized separation methods are increasingly utilized in all stages of drug discovery as well as in the quality control of pharmaceutical preparations. The great advantages offered by miniaturized analytical techniques, including high separation efficiency and resolution, rapid analysis, and minimal consumption of reagents and samples, make them an attractive alternative to conventional chromatographic methods for drug analysis. The purpose of this review is to give a general overview of the applicability of capillary electrophoresis (CE), capillary electrochromatography (CEC) and micro/capillary/nano-liquid chromatography (micro-LC/CLC/nano-LC) to the analysis of pharmaceutical formulations, active pharmaceutical ingredients (API), drug impurity testing, chiral drug separation, and the determination of drugs and metabolites in biological fluids. The results concerning the use of CEC, micro-LC, CLC, and nano-LC in the period 2009-2013 (and, for CE, from 2012 up to the drafting of this review) are summarized, and some specific examples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    Sanchez Paz, L.A.

    1991-01-01

    The objectives of this work are to quantify the lead content in two types of canned chilli from three trademarks, determining whether it lies within the maximum permissible level (2 ppm); to compare two trademarks sold in both glass-jar and canned presentations, in order to determine the effect of the container on the final lead content; and to make a comparative study of the techniques on the basis of accuracy, linearity, and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry, and X-ray fluorescence. The samples were pretreated by calcination, followed by dissolution of the ashes in acid medium and dilution to a known volume for analysis by atomic absorption and plasma emission. For the analysis by X-ray fluorescence, after solubilizing the ashes, the lead is precipitated with PCDA (ammonium pyrrolidine carbodithioate), then filtered; the filter paper is dried and counted directly. The standards are prepared following the same procedure as the samples, using lead Titrisol solution. For each technique, the recovery percentage is determined by the addition of a sufficient known amount. Calibration curves plotted for each technique show that all three are linear in the established working range. The recovery percentage in the three cases is above ninety-five percent. By means of a variance analysis it was determined that the lead content in the samples does not exceed 2 ppm, and that the lead content in canned chillis is higher than that in glass containers (1.7 and 0.4 ppm, respectively). The X-ray fluorescence results differ from those attained by the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission. (Author)
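
    A minimal numeric sketch of the calibration and recovery arithmetic used to compare the techniques; all concentrations and instrument responses below are invented for illustration.

    ```python
    import numpy as np

    std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # Pb standards, ppm
    signal   = np.array([0.01, 0.26, 0.52, 1.03, 2.05])   # instrument response

    slope, intercept = np.polyfit(std_conc, signal, 1)    # calibration line
    r = np.corrcoef(std_conc, signal)[0, 1]               # linearity check

    def conc(sig):
        """Inverse calibration: signal back to concentration (ppm)."""
        return (sig - intercept) / slope

    # Recovery from a spiked addition of 1.0 ppm (signals are hypothetical).
    sample_sig, spiked_sig, added = 0.40, 0.90, 1.0
    recovery = 100 * (conc(spiked_sig) - conc(sample_sig)) / added
    print(f"r = {r:.4f}, recovery = {recovery:.0f} %")
    ```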

  4. A novel graphical technique for Pinch Analysis applications: Energy Targets and grassroots design

    International Nuclear Information System (INIS)

    Gadalla, Mamdouh A.

    2015-01-01

    Graphical abstract: A new HEN graphical design. - Highlights: • A new graphical technique for heat exchanger network design. • Pinch Analysis principles and design rules are better interpreted. • Graphical guidelines for optimum heat integration. • New temperature-based graphs provide user-interactive features. - Abstract: Pinch Analysis has for decades been a leading tool for energy integration in retrofit and design. This paper presents a new graphical technique, based on Pinch Analysis, for the grassroots design of heat exchanger networks. In the new graph, the temperatures of hot streams are plotted versus those of the cold streams. The temperature–temperature based graph is constructed to include temperatures of hot and cold streams as straight lines: horizontal lines for hot streams, and vertical lines for cold streams. The graph is applied to determine the pinch temperatures and Energy Targets. It is then used to synthesise graphically a complete exchanger network achieving the Energy Targets. Within the new graph, exchangers are represented by inclined straight lines, whose slopes are proportional to the ratio of the streams' heat capacity flowrates. Pinch Analysis principles for design are easily interpreted using this new graphical technique to design a complete exchanger network. Network designs achieved by the new technique can guarantee maximum heat recovery. The new technique can also be employed to simulate basic designs of heat exchanger networks. The strengths of the new tool are that it is simply applied using computers, requires no commercial software, and can be used for academic purposes/engineering education.
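
    The Energy Targets that the new graph reads off can be cross-checked numerically. The sketch below uses the classic problem-table cascade rather than the paper's graphical construction; the stream data and ΔTmin are illustrative.

    ```python
    # Each stream: (supply T, target T, heat capacity flowrate CP in kW/K).
    hot = [(180.0, 60.0, 2.0)]          # hot streams cool down
    cold = [(30.0, 150.0, 1.8)]         # cold streams heat up
    dT_min = 10.0

    # Shift hot streams down and cold streams up by dTmin/2; collect interval edges.
    shifted = [(ts - dT_min / 2, tt - dT_min / 2, cp, 'hot') for ts, tt, cp in hot] + \
              [(ts + dT_min / 2, tt + dT_min / 2, cp, 'cold') for ts, tt, cp in cold]
    edges = sorted({t for s in shifted for t in s[:2]}, reverse=True)

    # Net heat surplus/deficit per shifted-temperature interval, then cascade it down.
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(edges, edges[1:]):
        net = 0.0
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            if bot <= lo and top >= hi:          # stream spans this whole interval
                net += cp * (hi - lo) if kind == 'hot' else -cp * (hi - lo)
        heat += net
        cascade.append(heat)

    q_hot_min = -min(cascade)                    # minimum hot utility (Energy Target)
    q_cold_min = cascade[-1] + q_hot_min         # minimum cold utility
    pinch_shifted_T = edges[cascade.index(min(cascade))]
    print(q_hot_min, q_cold_min, pinch_shifted_T)
    ```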

  5. Analysis and modification of blue sapphires from Rwanda by ion beam techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bootkul, D., E-mail: mo_duangkhae@hotmail.com [Department of General Science - Gems & Jewelry, Faculty of Science, Srinakharinwirot University, Bangkok 10110 (Thailand); Chaiwai, C.; Tippawan, U. [Plasma and Beam Physics Research Facility, Department of Physics and Materials Science, Faculty of Science, Chiang Mai University, Chiang Mai 50200 (Thailand); Wanthanachaisaeng, B. [Gems Enhancement Research Unit, Faculty of Gems, Burapha University, Chanthaburi Campus, Chanthaburi 22170 (Thailand); Intarasiri, S., E-mail: saweat@gmail.com [Science and Technology Research Institute, Chiang Mai University, Chiang Mai 50200 (Thailand)

    2015-12-15

    Highlights: • Ion beam analysis is an effective method for detecting trace elements. • Ion beam treatment is able to improve the optical and color appearance of blue sapphire from Rwanda. • These alternative methods can be extended to the jewelry industry for large-scale application. - Abstract: Blue sapphire belongs to the corundum (Al{sub 2}O{sub 3}) group. The gems of this group have always amazed with their beauty and thus command high value. In this study, blue sapphires from Rwanda, which recently came to the Thai gemstone industry, were chosen for investigation. On one hand, we have applied Particle Induced X-ray Emission (PIXE), a highly sensitive and precise analytical technique that can be used to identify and quantify trace elements, for chemical analysis of the sapphires. We found that the major element of blue sapphires from Rwanda is Al, with trace elements such as Fe, Ti, Cr, Ga and Mg, as are commonly found in normal blue sapphire. On the other hand, we have applied low- and medium-energy ion implantation for color improvement of the sapphires. It seems that the large amount of energy transferred during cascade collisions has altered the gem properties. We have clearly seen that the blue color of the sapphires was intensified after nitrogen ion bombardment. In addition, the gems also gained transparency and luster. The UV–Vis–NIR measurements detected the modification of their absorption properties, implying an increase in the blue color. The mechanism of these modifications is postulated and reported. From any point of view, bombardment with a nitrogen ion beam is a promising technique for quality improvement of blue sapphire from Rwanda.

  6. Behaviour Change Techniques embedded in health and lifestyle apps: coding and analysis.

    Directory of Open Access Journals (Sweden)

    Gaston Antezana

    2015-09-01

    Full Text Available Background There is evidence showing that commercially available health and lifestyle apps can be used as co-adjuvants to clinical interventions and for the prevention of chronic and non-communicable diseases. This can be particularly significant for supporting and improving the wellbeing of young people, given their familiarity with these resources. However, it is important to understand the content and consistency of the Behaviour Change Techniques (BCTs) embedded in the apps to maximise their potential benefits. Objectives This study explores the BCT content of a selected list of health and lifestyle tracking apps in three behavioural dimensions: physical activity, sleep and diet. We identified BCT commonalities within and between categories to detect the most frequently used and arguably more effective techniques in the context of wellbeing and the promotion of health behaviours. Methods Apps were selected by using keywords and by reviewing the “health and fitness” category of GooglePlay (477 apps). The selection criteria included free apps (even if they also offered paid versions) and availability on both GooglePlay and the AppStore. A background review of each app was also completed. Selected apps were classified according to user ratings in GooglePlay (apps with less than 4-star ratings were disregarded). The top ten apps in each category were selected, making a total of 30 for the analysis. Three coders used the apps for two months and were trained to use a comprehensive 93-item taxonomy (BCTv1) to complete the analysis. Results Strong BCT similarities were found across all three categories, suggesting a consistent basic content composition. Out of all 93 BCTs, 8 were identified as being present in at least 50% of the apps. 6 of these BCTs are concentrated in the categories “1. Goals and Planning” and “2. Feedback and Monitoring”. The BCT “Social support (unspecified)” was coded for in 63% of the apps, as it was present through different features in

  7. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Graphical abstract: -- Highlights: •Quantitative image analysis shows potential to monitor activated sludge systems. •Staining techniques increase the potential for detection of operational problems. •Chemometrics combined with quantitative image analysis is valuable for process monitoring. -- Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and the proliferation of specific microorganisms. In fact, the identification of bacterial communities and protozoa by microscopic inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used over the years for the assessment of the properties of aggregates and filamentous bacteria. These procedures are able to provide an ever-growing amount of data on wastewater treatment processes, for which chemometric techniques can be a valuable tool. However, the determination of microbial community properties remains a challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed, highlighting the determination of aggregate structure and filamentous bacteria by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  8. Techniques and methodologies to identify potential generated industries of NORM in Angola Republic and evaluate its impacts

    International Nuclear Information System (INIS)

    Diogo, José Manuel Sucumula

    2017-01-01

    Numerous steps have been taken worldwide to identify and quantify the radiological risks associated with the mining of ores containing Naturally Occurring Radioactive Material (NORM), which often results in unnecessary exposures of individuals and severe environmental damage, with devastating consequences for the health of workers and damage to the economy of many countries due to absent or inadequate regulations. For these and other reasons, the objective of this work was to identify industries with the potential to generate NORM in the Republic of Angola and to estimate their radiological environmental impacts. To achieve this objective, we studied the theoretical aspects and identified the main industrial activities internationally recognized as NORM generators. The Brazilian experience in the regulatory field was taken as a reference for the criteria used to classify NORM-generating industries, the mining methods and their radiological environmental impacts, as well as the main techniques applied to evaluate radionuclide concentrations in a specific environmental matrix and/or a NORM sample. This approach allowed the elaboration of a NORM map for the main provinces of Angola, the establishment of evaluation criteria for implementing a Radiation Protection Plan in the extractive industry, the establishment of measures to control ionizing radiation in mining, and the identification and quantification of radionuclides present in samples of oil lees. However, in order to assess adequately the radiological environmental impact of the NORM industries, it is not enough to identify them; it is important to know the origin, quantify the radioactive material released as liquid and gaseous effluents, identify the main routes of exposure, and examine how this material spreads through the environment until it reaches man. (author)

  9. Network analysis of translocated Takahe populations to identify disease surveillance targets.

    Science.gov (United States)

    Grange, Zoë L; VAN Andel, Mary; French, Nigel P; Gartrell, Brett D

    2014-04-01

    Social network analysis is being increasingly used in epidemiology and disease modeling in humans, domestic animals, and wildlife. We investigated this tool for describing a translocation network (an arrangement that allows movement of animals between geographically isolated locations) used for the conservation of an endangered flightless rail, the Takahe (Porphyrio hochstetteri). We collated records of Takahe translocations within New Zealand and used social network principles to describe the connectivity of the translocation network. That is, networks were constructed and analyzed using adjacency matrices with values based on the tie weights between nodes. Five annual network matrices were created using the Takahe data set; each incremental year included the records of previous years. Weights of movements between connected locations were assigned by the number of Takahe moved. We calculated the number of nodes (i_total) and the number of ties (t_total) between the nodes. To quantify the small-world character of the networks, we compared the real networks to random graphs of equivalent size, weighting, and node strength. Descriptive analysis of the cumulative annual Takahe movement networks involved determination of node-level characteristics, including centrality descriptors of relevance to disease modeling such as weighted measures of in-degree (k_i^in), out-degree (k_i^out), and betweenness (B_i). Key players were assigned according to the highest node measure of k_i^in, k_i^out, and B_i per network. Networks increased in size throughout the time frame considered. The network had some degree of small-world character. The nodes with the highest cumulative tie weights connecting them were the captive breeding center, the Murchison Mountains, and two offshore islands. The key player fluctuated between the captive breeding center and the Murchison Mountains. The cumulative networks identified the captive breeding center every year as the hub of the network until the final
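
    A minimal sketch of the node-level measures, assuming the networkx library; the nodes and tie weights (birds moved) are invented, not the actual Takahe records. Note that networkx interprets edge 'weight' as a distance in shortest-path calculations, so betweenness on tie strengths should be read with that caveat.

    ```python
    import networkx as nx

    G = nx.DiGraph()
    moves = [("CaptiveCentre", "MurchisonMtns", 12),   # hypothetical translocations
             ("CaptiveCentre", "IslandA", 5),
             ("MurchisonMtns", "CaptiveCentre", 7),
             ("IslandA", "IslandB", 3)]
    G.add_weighted_edges_from(moves)

    k_in = dict(G.in_degree(weight="weight"))          # weighted in-degree
    k_out = dict(G.out_degree(weight="weight"))        # weighted out-degree
    betweenness = nx.betweenness_centrality(G, weight="weight")

    # Key players: the node with the highest value of each measure.
    for name, measure in [("k_in", k_in), ("k_out", k_out), ("B", betweenness)]:
        key = max(measure, key=measure.get)
        print(name, key, round(measure[key], 3))
    ```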

  10. Multi-criterion analysis technique in a process of quality management

    OpenAIRE

    A. Gwiazda

    2007-01-01

    Purpose: The aim of this paper is to present a critical analysis of some multi-criteria techniques applied in the area of quality management. It is strongly stated that some solutions in this scientific area are characterized by non-methodological approaches. Design/methodology/approach: The research methodology in the presented work has been based on a theoretical analysis of quality management tools and on empirical research. Findings: The proposals for improvement of the main quality to...

  11. Gene expression meta-analysis identifies metastatic pathways and transcription factors in breast cancer

    International Nuclear Information System (INIS)

    Thomassen, Mads; Tan, Qihua; Kruse, Torben A

    2008-01-01

    Metastasis is believed to progress in several steps involving different pathways, but the determination and understanding of these mechanisms is still fragmentary. Microarray analysis of gene expression patterns in breast tumors has been used to predict outcome in recent studies. Beyond classification of outcome, these global expression patterns may reflect biological mechanisms involved in metastasis of breast cancer. Our purpose has been to investigate pathways and transcription factors involved in metastasis by use of gene expression data sets. We have analyzed 8 publicly available gene expression data sets. A global approach, 'gene set enrichment analysis', as well as an approach focusing on a subset of significantly differentially regulated genes, GenMAPP, has been applied to rank pathway gene sets according to differential regulation in metastasizing tumors compared to non-metastasizing tumors. Meta-analysis has been used to determine the overrepresentation of pathways and transcription factor targets concordantly deregulated in metastasizing breast tumors across several data sets. The major findings are up-regulation of cell cycle pathways and a metabolic shift towards glucose metabolism, reflected in several pathways, in metastasizing tumors. Growth factor pathways seem to play dual roles: EGF and PDGF pathways are decreased, while VEGF and sex-hormone pathways are increased in tumors that metastasize. Furthermore, migration, proteasome, immune system, angiogenesis, DNA repair and several signal transduction pathways are associated with metastasis. Finally, several transcription factors, e.g. E2F, NFY, and YY1, are identified as being involved in metastasis. Pathway meta-analysis identifies many biological mechanisms beyond major characteristics such as proliferation. Transcription factor analysis identifies a number of key factors that support central pathways. Several previously proposed treatment targets are identified, along with several new pathways that may

  12. Nonlinear analysis techniques for use in the assessment of high-level waste tank structures

    International Nuclear Information System (INIS)

    Moore, C.J.; Julyk, L.J.; Fox, G.L.; Dyrness, A.D.

    1991-01-01

    Reinforced concrete in combination with a steel liner has had a wide application to structures containing hazardous material. The buried double-shell waste storage tanks at the US Department of Energy's Hanford Site use this construction method. The generation and potential ignition of combustible gases within the primary tank is postulated to develop beyond-design-basis internal pressure and possible impact loading. The scope of this paper includes the illustration of analysis techniques for the assessment of these beyond-design-basis loadings. The analysis techniques include the coupling of the gas dynamics with the structural response, the treatment of reinforced concrete in regimes of inelastic behavior, and the treatment of geometric nonlinearities. The techniques and software tools presented provide a powerful nonlinear analysis capability for storage tanks

  13. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
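
    A minimal sketch, assuming scikit-learn, of one of the three comparisons: classification accuracy as a function of the number of retained principal components. The feature vectors, labels, and nearest-neighbor classifier are synthetic stand-ins for the paper's object-code features and simple classifier.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 64))               # hypothetical object-code features
    y = (X[:, :4].sum(axis=1) > 0).astype(int)   # synthetic crypto / non-crypto label

    for n_dims in (2, 4, 8, 16, 32):
        Xr = PCA(n_components=n_dims).fit_transform(X)
        acc = cross_val_score(KNeighborsClassifier(), Xr, y, cv=5).mean()
        print(n_dims, round(acc, 3))             # accuracy vs. dimensionality
    ```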

  14. The Systems Approach to Functional Job Analysis. Task Analysis of the Physician's Assistant: Volume I--Task Analysis Methodology and Techniques.

    Science.gov (United States)

    Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.

    Utilizing a systematic sampling technique, the professional activities of small groups of pediatricians, family practitioners, surgeons, obstetricians, and internists were observed for 4 or 5 days by a medical student who checked a prearranged activity sheet every 30 seconds to: (1) identify those tasks and activities an assistant could be trained…

  15. Analysis of Surface Water Pollution in the Kinta River Using Multivariate Technique

    International Nuclear Information System (INIS)

    Hamza Ahmad Isiyaka; Hafizan Juahir

    2015-01-01

    This study aims to investigate the spatial variation in the characteristics of water quality monitoring sites, identify the most significant parameters and the major possible sources of pollution, and apportion the source categories in the Kinta River. 31 parameters collected from eight monitoring sites over eight years (2006-2013) were employed. The eight monitoring stations were spatially grouped into three independent clusters in a dendrogram. A drastic reduction in the number of monitored parameters, from 31 to eight and nine significant parameters (P<0.05) respectively, was achieved using forward stepwise and backward stepwise discriminant analysis (DA). Principal component analysis (PCA) accounted for more than 76 % of the total variance and attributed the sources of pollution to anthropogenic and natural processes. Source apportionment using multiple linear regression combined with principal component scores indicates that 41 % of the total pollution load is from rock weathering and untreated waste water, 26 % from waste discharge, 24 % from surface runoff and 7 % from faecal waste. This study proposes a reduction in the number of monitoring stations and parameters to make the monitoring process cost- and time-effective, and shows that multivariate techniques can provide a simple representation of complex and dynamic water quality characteristics. (author)
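
    A minimal sketch, in the spirit of the combined multiple linear regression / principal component scores apportionment described above, assuming scikit-learn; the water-quality matrix and source labels are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(96, 9))                   # samples x significant parameters
    load = X @ rng.uniform(0.2, 1.0, 9) + rng.normal(scale=0.3, size=96)

    scores = PCA(n_components=3).fit_transform(X)  # candidate pollution sources
    reg = LinearRegression().fit(scores, load)     # regress total load on scores

    contrib = np.abs(reg.coef_ * scores.std(axis=0))
    contrib = 100 * contrib / contrib.sum()        # % of explained load per source
    for i, pct in enumerate(contrib, 1):
        print(f"source PC{i}: {pct:.1f} %")
    ```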

  16. Use of electron spin resonance technique for identifying of irradiated foods

    Energy Technology Data Exchange (ETDEWEB)

    El-Shiemy, S M E

    2008-07-01

    The present investigation was carried out to establish the electron spin resonance (ESR) technique for identifying some irradiated foodstuffs, i.e. dried fruits (fig and raisin), nuts (almond and pistachio) and spices (fennel and thyme). Gamma-ray doses were applied as follows: 0, 1, 3 and 5 kGy for the dried fruits; 0, 2, 4 and 6 kGy for the nuts; and 0, 5, 10 and 15 kGy for the spices. All treatments were stored at room temperature (25{+-}2 degree C) for six months to study the possibility of detecting the irradiation treatment by ESR spectroscopy. The obtained results indicated that the ESR signal intensities of all irradiated samples increased markedly with irradiation dose, as a result of the free radicals generated by gamma irradiation. Thus, all irradiated samples under investigation could be differentiated from unirradiated ones immediately after the irradiation treatment. The decay of the free radicals responsible for the ESR signals during storage at ambient temperature caused a significant decrease in the ESR signal intensities of the irradiated samples. Nevertheless, after six months of ambient storage, detection was still easily possible for irradiated dried fig at doses {>=} 3 kGy and for all irradiated raisin and pistachio (shell) samples. It was also possible for irradiated fennel at doses {>=} 10 kGy and for irradiated thyme at doses {>=} 15 kGy. In contrast, the identification of all irradiated samples of almond (shell as well as edible part) and pistachio (edible part) was impossible after six months of ambient storage.

  17. Use of electron spin resonance technique for identifying of irradiated foods

    International Nuclear Information System (INIS)

    El-Shiemy, S.M.E

    2008-01-01

    The present investigation was carried out to establish the electron spin resonance (ESR) technique for identifying some irradiated foodstuffs, i.e. dried fruits (fig and raisin), nuts (almond and pistachio) and spices (fennel and thyme). Gamma-ray doses were applied as follows: 0, 1, 3 and 5 kGy for the dried fruits; 0, 2, 4 and 6 kGy for the nuts; and 0, 5, 10 and 15 kGy for the spices. All treatments were stored at room temperature (25±2 degree C) for six months to study the possibility of detecting the irradiation treatment by ESR spectroscopy. The obtained results indicated that the ESR signal intensities of all irradiated samples increased markedly with irradiation dose, as a result of the free radicals generated by gamma irradiation. Thus, all irradiated samples under investigation could be differentiated from unirradiated ones immediately after the irradiation treatment. The decay of the free radicals responsible for the ESR signals during storage at ambient temperature caused a significant decrease in the ESR signal intensities of the irradiated samples. Nevertheless, after six months of ambient storage, detection was still easily possible for irradiated dried fig at doses ≥ 3 kGy and for all irradiated raisin and pistachio (shell) samples. It was also possible for irradiated fennel at doses ≥ 10 kGy and for irradiated thyme at doses ≥ 15 kGy. In contrast, the identification of all irradiated samples of almond (shell as well as edible part) and pistachio (edible part) was impossible after six months of ambient storage.

  18. Analysis of corrosion-product transport using nondestructive XRF and MS techniques

    International Nuclear Information System (INIS)

    Sawicka, B.D.; Sawicki, J.A.

    1998-01-01

    This paper describes the application of X-ray fluorescence (XRF) and Moessbauer spectroscopy (MS) techniques to monitor corrosion-product transport (CPT) in water circuits of nuclear reactors. The combination of XRF and MS techniques was applied in studies of CPT crud filters from both primary- and secondary-side water circuits (i.e., radioactive and nonradioactive specimens) of CANDU reactors. The XRF-MS method allows nondestructive analysis of species collected on filters and provides more complete information about corrosion products than commonly used digestive methods of chemical analysis. Recent analyses of CPT specimens from the Darlington Nuclear Generating Station (NGS) primary side and the Bruce B NGS feedwater system are shown as examples. Some characteristics of primary and secondary water circuits are discussed using these new data. (author)

  19. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    Science.gov (United States)

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools on which to base drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. On the market, regardless of the clinical condition under evaluation, many interventions are usually available, and few of them have been compared in head-to-head studies. This scenario precludes conclusions from being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, assumptions and steps for performing the analysis. PMID:28503228
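
    A minimal numeric sketch of the simplest building block of indirect evidence, the Bucher adjusted indirect comparison, in which two treatments are compared through a shared comparator; the effect sizes (log odds ratios) are invented.

    ```python
    import math

    def indirect(d_ab, se_ab, d_cb, se_cb):
        """A-vs-C estimate from A-vs-B and C-vs-B trials sharing comparator B."""
        d_ac = d_ab - d_cb
        se_ac = math.sqrt(se_ab**2 + se_cb**2)   # variances of the two estimates add
        return d_ac, se_ac

    d_ac, se_ac = indirect(d_ab=-0.50, se_ab=0.15, d_cb=-0.20, se_cb=0.18)
    lo, hi = d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac
    print(f"log OR A vs C: {d_ac:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```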

  20. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-01-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of the fluorescently stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was developed to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density was calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.
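
    A minimal sketch of the radial density measurement, assuming scipy and scikit-image and a 2-D slice for brevity; the peak-detection settings stand in for the authors' local bright feature analysis.

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage.feature import peak_local_max

    def radial_feature_density(nucleus_mask, protein_img, n_bins=10):
        # Distance of every in-nucleus pixel from the perimeter, normalized 0..1
        # (0 at the nuclear perimeter, 1 at the center).
        dist = ndimage.distance_transform_edt(nucleus_mask)
        dist_norm = dist / dist.max()

        # Prominent bright features: local maxima of the stained-protein image.
        peaks = peak_local_max(protein_img * nucleus_mask, min_distance=3)

        # Feature density per radial bin, normalized by the bin's pixel area.
        bins = np.linspace(0, 1, n_bins + 1)
        peak_d = dist_norm[peaks[:, 0], peaks[:, 1]]
        counts, _ = np.histogram(peak_d, bins=bins)
        areas, _ = np.histogram(dist_norm[nucleus_mask > 0], bins=bins)
        return counts / np.maximum(areas, 1)       # features per pixel, per bin
    ```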

  1. Depth profile analysis of thin TiOxNy films using standard ion beam analysis techniques and HERDA

    International Nuclear Information System (INIS)

    Markwitz, A.; Dytlewski, N.; Cohen, D.

    1999-01-01

    Ion beam assisted deposition is used to fabricate thin titanium oxynitride films (TiOxNy) at Industrial Research (typical film thickness 100 nm). At the Institute of Geological and Nuclear Sciences, the thin films are analysed using non-destructive standard ion beam analysis (IBA) techniques. High-resolution titanium depth profiles are measured with RBS using 1.5 MeV 4He+ ions. Non-resonant nuclear reaction analysis (NRA) is performed for investigating the amounts of O and N in the deposited films using the reactions 16O(d,p)17O at 920 keV and 14N(d,α)12C at 1.4 MeV. Using a combination of these nuclear techniques, the stoichiometry as well as the thickness of the layers is revealed. However, when oxygen and nitrogen depth profiles are required for investigating stoichiometric changes in the films, additional nuclear analysis techniques such as heavy ion elastic recoil detection (HERDA) have to be applied. With HERDA, depth profiles of N, O, and Ti are measured simultaneously. In this paper comparative IBA measurements of TiOxNy films with different compositions are presented and discussed.

  2. Proteogenomic Analysis Identifies a Novel Human SHANK3 Isoform

    Directory of Open Access Journals (Sweden)

    Fahad Benthani

    2015-05-01

    Full Text Available Mutations of the SHANK3 gene have been associated with autism spectrum disorder. Individuals harboring different SHANK3 mutations display considerable heterogeneity in their cognitive impairment, likely due to the high SHANK3 transcriptional diversity. In this study, we report a novel interaction between the Mutated in colorectal cancer (MCC protein and a newly identified SHANK3 protein isoform in human colon cancer cells and mouse brain tissue. Hence, our proteogenomic analysis identifies a new human long isoform of the key synaptic protein SHANK3 that was not predicted by the human reference genome. Taken together, our findings describe a potential new role for MCC in neurons, a new human SHANK3 long isoform and, importantly, highlight the use of proteomic data towards the re-annotation of GC-rich genomic regions.

  3. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  4. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  5. Cluster Analysis of Clinical Data Identifies Fibromyalgia Subgroups

    Science.gov (United States)

    Docampo, Elisa; Collado, Antonio; Escaramís, Geòrgia; Carbonell, Jordi; Rivera, Javier; Vidal, Javier; Alegre, José

    2013-01-01

    Introduction Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. Material and Methods 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. Results Variables clustered into three independent dimensions: “symptomatology”, “comorbidities” and “clinical scales”. Only the two first dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. Conclusions We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment. PMID:24098674

  6. Cluster analysis of clinical data identifies fibromyalgia subgroups.

    Directory of Open Access Journals (Sweden)

    Elisa Docampo

    Full Text Available INTRODUCTION: Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. MATERIAL AND METHODS: 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. RESULTS: Variables clustered into three independent dimensions: "symptomatology", "comorbidities" and "clinical scales". Only the two first dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. CONCLUSIONS: We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment.

  7. Testing of the derivative method and Kruskal-Wallis technique for sensitivity analysis of SYVAC

    International Nuclear Information System (INIS)

    Prust, J.O.; Edwards, H.H.

    1985-04-01

    The Kruskal-Wallis method of one-way analysis of variance by ranks has proved successful in identifying input parameters which have an important influence on dose. This technique was extended to test for first-order interactions between parameters. In view of a number of practical difficulties and the computing resources required to carry out a large number of runs, this test is not recommended for detecting interactions between parameters. The derivative method of sensitivity analysis examines the partial derivatives of dose with respect to each input parameter at various points across the parameter range. Important input parameters are associated with high derivatives, and the results agreed well with previous sensitivity studies. The derivative values also provided information on the data generation distributions to be used for the input parameters in order to concentrate sampling in the high-dose region of the parameter space and improve the sampling efficiency. Furthermore, the derivative values provided information on parameter interactions, the feasibility of developing a high-dose algorithm, and formed the basis for developing a regression equation. (author)
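
    A minimal sketch of the Kruskal-Wallis screening step, assuming scipy: each sampled input parameter is binned into rank-ordered groups and the dose values are tested for differences across groups. The parameter names and dose model are invented.

    ```python
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(2)
    samples = {"leach_rate": rng.uniform(0, 1, 300),   # hypothetical inputs
               "sorption":   rng.uniform(0, 1, 300)}
    dose = 10 * samples["leach_rate"] ** 2 + rng.normal(scale=0.5, size=300)

    for name, values in samples.items():
        # Five groups of dose values, ordered by the parameter's rank.
        groups = np.array_split(dose[np.argsort(values)], 5)
        h, p = kruskal(*groups)
        print(f"{name}: H = {h:.1f}, p = {p:.3g}")     # small p => influential input
    ```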

  8. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2011-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. Assessing the blade structural integrity is a complex task requiring an initial characterization of whether resonance is possible and then performing a forced response analysis if that condition is met. The standard technique for forced response analysis in rocket engines is to decompose a CFD-generated flow field into its harmonic components, and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. A substantial effort has been made to account for this denser spatial Fourier content in frequency response analysis (described in another paper by the author), but the question still remains whether the frequency response analysis itself is capable of capturing the excitation content sufficiently. Therefore, two studies have been performed comparing frequency response analysis with transient response analysis of bladed disks undergoing this complex flow environment. The first is of a bladed disk with each blade modeled by simple beam elements. Six loading cases were generated by varying a baseline harmonic excitation in different ways based upon cold-flow testing from the Heritage Fuel Air Turbine Test. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD used in the J2X engine program. It was hypothesized that enforcing periodicity in the CFD (inherent in the frequency response technique) would overestimate the
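
    A minimal sketch of the standard practice described above, assuming numpy: decompose a periodic blade load into its Fourier harmonics and evaluate a single-mode frequency response at each engine-order line. The rotor speed, modal parameters, and load history are invented stand-ins for the CFD input.

    ```python
    import numpy as np

    fs, n_samp = 10000.0, 10000                  # sampling rate and record length
    t = np.arange(n_samp) / fs
    f_rot = 100.0                                # rotor speed, Hz (assumed)
    # Synthetic periodic blade load: engine orders 4 and 8.
    load = 1.0 * np.sin(2 * np.pi * 4 * f_rot * t) + \
           0.3 * np.sin(2 * np.pi * 8 * f_rot * t)

    # Harmonic decomposition of the load (amplitudes at multiples of f_rot).
    spec = np.abs(np.fft.rfft(load)) / (n_samp / 2)
    freqs = np.fft.rfftfreq(n_samp, 1 / fs)

    def sdof_response(f_force, amp, f_n=405.0, zeta=0.002):
        """Steady-state amplitude ratio of a single mode under harmonic forcing."""
        r = f_force / f_n
        return amp / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

    for order in (4, 8):                         # problematic engine orders
        f_ex = order * f_rot
        amp = spec[np.argmin(np.abs(freqs - f_ex))]
        print(f"EO{order}: forcing {amp:.2f} -> response {sdof_response(f_ex, amp):.1f}")
    ```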

  9. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    Science.gov (United States)

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their being a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and these are highly dependent on the comprehensive analysis of the chemical components in the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of the analysis of medicinal plants. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications for the comprehensive analysis of medicinal plants. The application of various MS and hyphenated MS techniques to the analysis of medicinal plants, including but not limited to one-dimensional chromatography and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, is reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants, owing to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a great role as the major platform for their further research, in order to obtain insight into both their empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  10. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of pressure and temperature sensors from the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors - such as the location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations - the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify the underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of the time response estimate in noise analysis. (author)

  11. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of pressure and temperature sensors from the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors - such as the location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations - the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify the underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of the time response estimate in noise analysis. (author)
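
    A minimal sketch of the autoregressive step, assuming numpy: fit an AR(p) model to a zero-mean sensor noise record by least squares and read a settling-time estimate off the fitted dynamics. The order p and the 90 % settling criterion are illustrative, and the simulation assumes the fitted model is stable.

    ```python
    import numpy as np

    def fit_ar(x, p=4):
        """Least-squares AR(p) coefficients for a zero-mean noise record."""
        x = x - x.mean()
        # Row t of X holds [x[t-1], ..., x[t-p]]; target is x[t].
        X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
        a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
        return a

    def step_response_time(a, n=200, level=0.9):
        """Samples until the fitted model's step response reaches `level` of its final value."""
        y, hist = [], np.zeros(len(a))
        for _ in range(n):
            y_new = 1.0 + hist @ a               # unit step driving the AR dynamics
            hist = np.roll(hist, 1)
            hist[0] = y_new
            y.append(y_new)
        y = np.array(y)
        return int(np.argmax(y >= level * y[-1]))   # index of first crossing
    ```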

  12. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

This paper presents image processing techniques combined with fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated; a more intelligent diagnostic system is, however, needed to determine flame stability automatically. Flame features at different flow rates are compared and analyzed, and the selected features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
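As a hedged illustration of the PSD step mentioned in this record (not the authors' implementation), the following Python sketch estimates the power spectral density of a synthetic flame luminosity signal using Welch's method; the sampling rate, oscillation frequency and noise level are all assumed for the example.

```python
import numpy as np
from scipy.signal import welch

# Illustrative flame luminosity signal: a 120 Hz oscillation (e.g. a
# thermoacoustic mode) buried in broadband background noise.
fs = 2000.0                      # sampling rate in Hz (assumed)
t = np.arange(0, 5.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.standard_normal(t.size)

# Welch's method averages FFTs of overlapping segments, giving a
# smoother PSD estimate than a single periodogram.
freqs, psd = welch(signal, fs=fs, nperseg=1024)

# A dominant narrow peak in the PSD hints at an oscillation frequency;
# its prominence relative to the background is a crude stability cue.
peak_freq = freqs[np.argmax(psd)]
print(f"dominant frequency: {peak_freq:.1f} Hz")
```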

  13. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...

  14. Displaced spectra techniques as a tool for peak identification in PSD-analysis

    International Nuclear Information System (INIS)

    Pineyro, J.; Behringer, K.

    1987-10-01

Sharp peaks in the power spectral density function can be due to periodic components in the noise signal or to narrowband random contributions. A novel method based on Fourier transform techniques is presented which, within certain limitations, allows the peak type to be identified. (author)
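To make the ambiguity concrete: both a deterministic sinusoid and narrowband random noise produce a sharp PSD peak, which is why a dedicated identification method is needed. The Python sketch below (not the authors' displaced-spectra algorithm, which the abstract does not detail) generates both signal types around an assumed 50 Hz and shows that their PSDs peak at the same frequency.

```python
import numpy as np
from scipy.signal import butter, lfilter, welch

fs = 1000.0
t = np.arange(0, 20.0, 1.0 / fs)
rng = np.random.default_rng(1)

# Case 1: a deterministic periodic component at 50 Hz plus white noise.
periodic = np.sin(2 * np.pi * 50 * t) + rng.standard_normal(t.size)

# Case 2: narrowband random noise, made by bandpass-filtering white
# noise around the same 50 Hz (no deterministic component at all).
b, a = butter(4, [48 / (fs / 2), 52 / (fs / 2)], btype="band")
narrowband = 10 * lfilter(b, a, rng.standard_normal(t.size)) \
    + rng.standard_normal(t.size)

# Both PSDs show a sharp peak near 50 Hz, so the PSD alone cannot
# tell the two peak types apart.
for name, x in [("periodic", periodic), ("narrowband random", narrowband)]:
    f, psd = welch(x, fs=fs, nperseg=2048)
    print(f"{name:>18}: PSD peak at {f[np.argmax(psd)]:.1f} Hz")
```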

  15. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take several requirements into account. The complexity of these ancient materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, owing to the precious nature of artworks, it is also necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In the last two cases, the combination of techniques gave original results on the proportion of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr

  16. Techniques for the quantitative analysis of fission-product noble metals

    International Nuclear Information System (INIS)

    Lautensleger, A.W.; Hara, F.T.

    1982-08-01

Analytical procedures for the determination of ruthenium, rhodium, and palladium in precursor waste, solvent metal, and final glass waste forms have been developed. Two procedures for the analysis of noble metals in the calcine and glass waste forms are described in this report. The first is a fast and simple technique that combines inductively coupled argon plasma atomic emission spectrometry (ICP) and x-ray fluorescence and can only be used on nonradioactive materials. The second procedure is based on a noble metal separation step followed by ICP analysis. This second method is more complicated than the first, but it works on radioactive materials. Also described is a procedure for the ICP analysis of noble metals in the solvent metal matrix. The only solvent metal addressed in this procedure is lead, but with minor changes the procedure could be applied to any of the solvent metals being considered in the Pacific Northwest Laboratory (PNL) extraction process. A brief explanation of atomic spectroscopy and the ICP analytical process, as well as of certain aspects of ICP performance (interelement spectral line interferences and certain matrix effects), is given.

  17. Fault Tree Analysis with Temporal Gates and Model Checking Technique for Qualitative System Safety Analysis

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2010-01-01

Fault tree analysis (FTA) has been one of the most widely used safety analysis techniques in the nuclear industry, but it suffers from several drawbacks: it uses only static gates and hence cannot precisely capture the dynamic behaviors of complex systems; it lacks rigorous semantics; and the reasoning process of checking whether basic events really cause top events is done manually, which is labor-intensive and time-consuming for complex systems. Although several attempts have been made to overcome these problems, they still cannot model absolute (actual) time, because they adopt a relative time concept and can capture only sequential behaviors of the system. In this work, to resolve these problems, FTA and model checking are integrated to provide formal, automated and qualitative assistance to informal and/or quantitative safety analysis. Our approach proposes to build a formal model of the system together with fault trees. We introduce several temporal gates based on timed computation tree logic (TCTL) to capture absolute-time behaviors of the system and to give concrete semantics to fault tree gates, reducing errors during the analysis, and we use model checking to automate the reasoning process of FTA
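For readers unfamiliar with static gates, the toy Python sketch below evaluates a fault tree with only AND/OR logic; the event names and structure are hypothetical. Note that nothing in the evaluation sees when events occur, which is exactly the limitation the temporal gates above are designed to remove.

```python
# A toy static fault tree: TOP = (A AND B) OR C. Static gate logic
# sees only whether basic events occurred, never *when* -- the gap
# that motivates temporal gates. Event names are hypothetical.
from itertools import product

def top_event(a: bool, b: bool, c: bool) -> bool:
    return (a and b) or c

# Exhaustive qualitative check: enumerate basic-event combinations
# and list those that trigger the top event (cut sets can then be
# read off by inspection).
for a, b, c in product([False, True], repeat=3):
    if top_event(a, b, c):
        print(f"TOP occurs for A={a}, B={b}, C={c}")
```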

  18. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

The objective of this project is to investigate existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. A comprehensive survey of current methodologies for communication network reliability is necessary. The major outputs of this study are the design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks
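One classical algorithm family for quantifying network reliability, which a survey of this kind would cover, is exact two-terminal reliability by state enumeration. The sketch below uses a hypothetical five-link network with assumed link availabilities; its cost grows as 2^m in the number of links, which is why efficient algorithms are among the stated goals.

```python
from itertools import product

# Hypothetical network: links between source "s", relays and sink "t",
# each annotated with an assumed availability (probability of being up).
edges = {("s", "a"): 0.99, ("a", "t"): 0.95, ("s", "b"): 0.98,
         ("b", "t"): 0.97, ("a", "b"): 0.90}

def connected(up_edges, src="s", dst="t"):
    # Graph search over the surviving (up) links only.
    frontier, seen = [src], {src}
    while frontier:
        node = frontier.pop()
        for u, v in up_edges:
            for nxt in ((v,) if u == node else (u,) if v == node else ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return dst in seen

# Exact two-terminal reliability: sum the probabilities of all 2^m
# link states in which s and t remain connected. Exponential in the
# number of links, so only viable for small networks.
reliability = 0.0
for state in product([True, False], repeat=len(edges)):
    prob = 1.0
    up = []
    for (edge, p), is_up in zip(edges.items(), state):
        prob *= p if is_up else (1.0 - p)
        if is_up:
            up.append(edge)
    if connected(up):
        reliability += prob
print(f"P(s-t connected) = {reliability:.6f}")
```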

  19. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization could support interaction. We have analyzed service interaction cases in the context of a technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model the situational service interaction. Our contribution regarding technology-mediated service interaction design is twofold: first, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces...

  20. Biokinematic structure of techniques wrestlers during pre-basic training

    Directory of Open Access Journals (Sweden)

    S.V. Sinіgovets

    2013-07-01

Full Text Available Theoretical aspects of freestyle wrestling technique are considered, and the structural elements of wrestling techniques during pre-basic training are investigated experimentally. The study involved 28 young wrestlers. Video-based computer analysis of the techniques was performed. Biomechanical characteristics were identified, and the kinematic structure of the temporal and spatial-temporal characteristics of the basic techniques was defined. Variability of the individual phases of the basic techniques is shown. The structural dynamics of the resulting velocities of individual body bioelements of the wrestlers showed characteristic changes depending on the mode and direction of the motor action. The predominant contribution to the biokinematic structure of the technical actions was found to come from the resulting velocities of the young wrestlers' torsos.

  1. A technique to identify annual growth rings in Eucalyptus grandis using annual measurements of diameter at breast height and gamma ray densitometry

    CSIR Research Space (South Africa)

    Naidoo, Sasha

    2010-06-01

    Full Text Available A technique was developed to identify annual growth rings in E. grandis using a combination of annual measurements of diameter at breast height (DBH) from permanent sample plot (PSP) datasets and bark-pith density profiles. By assessing the pattern...

  2. A comparison of autonomous techniques for multispectral image analysis and classification

    Science.gov (United States)

    Valdiviezo-N., Juan C.; Urcid, Gonzalo; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso

    2012-10-01

Multispectral imaging has given rise to important applications related to the classification and identification of objects in a scene. Because multispectral instruments can be used to estimate the reflectance of materials in the scene, these techniques constitute fundamental tools for materials analysis and quality control. In recent years, a variety of algorithms has been developed to work with multispectral data, whose main purpose has been the correct classification of the objects in the scene. The present study gives a brief review of some classical techniques, as well as a novel one, that have been used for such purposes. The use of principal component analysis and K-means clustering as important classification algorithms is discussed. Moreover, a recent method based on the min-W and max-M lattice auto-associative memories, originally proposed for endmember determination in hyperspectral imagery, is introduced as a classification method. Besides a discussion of their mathematical foundation, we emphasize their main characteristics and the results achieved for two exemplar images composed of objects similar in appearance but spectrally different. The classification results show that the first components computed from principal component analysis can be used to highlight areas with different spectral characteristics. In addition, the use of lattice auto-associative memories provides good results for materials classification even in cases where some spectral similarities appear in the spectral responses.
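A minimal sketch of the PCA-plus-K-means pipeline discussed in this record, assuming a synthetic multispectral cube and scikit-learn; the band count, image size and cluster number are illustrative, and the lattice auto-associative memory method is not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Hypothetical multispectral cube: 100x100 pixels, 8 spectral bands.
rng = np.random.default_rng(2)
cube = rng.random((100, 100, 8))
pixels = cube.reshape(-1, 8)            # one spectrum per row

# Project onto the first principal components to highlight the
# directions of largest spectral variance ...
scores = PCA(n_components=3).fit_transform(pixels)

# ... then group pixels with similar spectra into k material classes.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
class_map = labels.reshape(100, 100)    # per-pixel classification map
print(np.bincount(labels))              # pixels assigned to each class
```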

  3. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS 18 and DEAP 2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996 respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05) (except in 2009). The productivity of the hospitals generally showed an increasing trend; however, the overall average productivity decreased. Moreover, among the several components of total productivity, variation in technological efficiency had the highest impact on the reduction of the overall average productivity.
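For orientation, the input-oriented CCR model underlying such DEA studies solves one small linear program per hospital (decision-making unit). The sketch below, with invented input/output data, uses scipy's linprog; it computes plain technical efficiency only, not the Malmquist decomposition used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 6 hospitals, 2 inputs (beds, staff), 1 output
# (treated patients). Rows are decision-making units (DMUs).
X = np.array([[20, 150], [30, 210], [25, 160],
              [40, 300], [15, 100], [35, 260]], dtype=float)
Y = np.array([[1800], [2400], [2100], [2600], [1300], [2500]], dtype=float)
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_input_efficiency(o):
    """Input-oriented CCR efficiency of DMU o (1.0 = efficient)."""
    # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij <= theta * x_io
    A_in = np.c_[-X[o].reshape(m, 1), X.T]
    # Outputs: sum_j lambda_j * y_rj >= y_ro  (written as <= form)
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (1 + n))
    return res.fun

for o in range(n):
    print(f"hospital {o}: efficiency = {ccr_input_efficiency(o):.3f}")
```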

  4. Identifying irradiated flours by photo-stimulated luminescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Ramli, Ros Anita Ahmad; Yasir, Muhamad Samudi [Faculty of Science and Technology, National University of Malaysia, Bangi, 43000 Kajang, Selangor (Malaysia); Othman, Zainon; Abdullah, Wan Saffiey Wan [Malaysian Nuclear Agency, Bangi 43000 Kajang, Selangor (Malaysia)

    2014-02-12

The photo-stimulated luminescence (PSL) technique was used in this study to detect gamma irradiation treatment of five types of flour (corn, rice, tapioca, wheat and glutinous rice) at four different doses: 0, 0.2, 0.5 and 1 kGy. The signal level was compared with two threshold values (700 and 5000). With the exception of glutinous rice, all irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All control samples produced a negative result, with signals below the lower threshold (700 counts/60 s), suggesting that the samples had not been irradiated. Irradiated glutinous rice samples produced intermediate signals (700 - 5000 counts/60 s), which were subsequently confirmed using calibrated PSL. The PSL signals remained stable after 90 days of storage. The findings of this study will be useful in facilitating control of food irradiation applications in Malaysia.

  5. Identifying irradiated flours by photo-stimulated luminescence technique

    International Nuclear Information System (INIS)

    Ramli, Ros Anita Ahmad; Yasir, Muhamad Samudi; Othman, Zainon; Abdullah, Wan Saffiey Wan

    2014-01-01

The photo-stimulated luminescence (PSL) technique was used in this study to detect gamma irradiation treatment of five types of flour (corn, rice, tapioca, wheat and glutinous rice) at four different doses: 0, 0.2, 0.5 and 1 kGy. The signal level was compared with two threshold values (700 and 5000). With the exception of glutinous rice, all irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All control samples produced a negative result, with signals below the lower threshold (700 counts/60 s), suggesting that the samples had not been irradiated. Irradiated glutinous rice samples produced intermediate signals (700 - 5000 counts/60 s), which were subsequently confirmed using calibrated PSL. The PSL signals remained stable after 90 days of storage. The findings of this study will be useful in facilitating control of food irradiation applications in Malaysia.
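The screening logic quoted in these two records reduces to a three-way threshold test. A minimal Python sketch, with illustrative count values (the calibrated-PSL follow-up for intermediate samples is not modeled):

```python
def screen_psl(counts_per_60s, lower=700, upper=5000):
    """Three-way PSL screening using the thresholds quoted above."""
    if counts_per_60s < lower:
        return "negative (not irradiated)"
    if counts_per_60s > upper:
        return "positive (irradiated)"
    return "intermediate - confirm with calibrated PSL"

# Illustrative readings, not data from the study.
for sample, counts in [("wheat", 45000), ("control rice", 310),
                       ("glutinous rice", 2600)]:
    print(f"{sample}: {screen_psl(counts)}")
```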

  6. Protein functional links in Trypanosoma brucei, identified by gene fusion analysis

    Directory of Open Access Journals (Sweden)

    Trimpalis Philip

    2011-07-01

Full Text Available Abstract Background Domain or gene fusion analysis is a bioinformatics method for detecting gene fusions in one organism by comparing its genome to that of other organisms. The occurrence of gene fusions suggests that the two original genes that participated in the fusion are functionally linked, i.e. their gene products interact either as part of a multi-subunit protein complex, or in a metabolic pathway. Gene fusion analysis has been used to identify protein functional links in prokaryotes as well as in eukaryotic model organisms, such as yeast and Drosophila. Results In this study we have extended this approach to a number of recently sequenced protists, four of which are pathogenic, to identify fusion-linked proteins in Trypanosoma brucei, the causative agent of African sleeping sickness. We have also examined the evolution of the gene fusion events identified, to determine whether they can be attributed to fusion or fission, by looking at the conservation of the fused genes and of the individual component genes across the major eukaryotic and prokaryotic lineages. We find relatively limited occurrence of gene fusions/fissions within the protist lineages examined. Our results point to two trypanosome-specific gene fissions, which have recently been experimentally confirmed, one fusion involving proteins of the same metabolic pathway, and two novel putative functional links between fusion-linked protein pairs. Conclusions This is the first study of protein functional links in T. brucei identified by gene fusion analysis. We have used strict thresholds and only discuss results which are highly likely to be genuine and which either have already been or can be experimentally verified. We discuss the possible impact of the identification of these novel putative protein-protein interactions on the development of new trypanosome therapeutic drugs.

  7. Possibilities to employ noise analysis techniques in controlling nuclear power stations

    International Nuclear Information System (INIS)

    Alfonso Pallares, C.; Iglesias Ferrer, R.; Sarabia Molina, I.

    1998-01-01

This work presents the basic requirements that, in the authors' view, must be met by monitoring systems for operational surveillance based on noise analysis techniques, which in turn can be employed in regulatory control

  8. Analytical methods to identify irradiated food

    International Nuclear Information System (INIS)

    Helle, N.; Schreiber, G.A.; Boegl, K.W.

    1992-01-01

During the last years, three promising techniques for the identification of irradiated food were developed: - Studies of luminescence, mainly thermoluminescence measurements, of food containing mineral impurities like spices, dried vegetables, and fresh fruit and vegetables. This technique can probably also be applied to food with crystalline components like shells or bones. - Gas chromatographic/mass spectrometric investigation of radiation-induced lipid changes. - Electron spin resonance measurements of dried products or of products containing dry components like bones, fish bones, shells or seeds. The thermoluminescence technique has been routinely applied for more than one year by several German food inspection laboratories. The results suggest that there are scarcely any irradiated spices and dried vegetables in the German market. Gas chromatography/mass spectrometry of lipid components and electron spin resonance spectroscopy will be established in routine food inspections in Germany in the next two years. Further possibilities for identifying irradiated food are the analysis of specific changes in amino acids, DNA and carbohydrates. Radiation-induced viscosity changes and changes in electric properties (impedance) may be helpful in identifying at least some irradiated products. Microbiological and biological techniques, e.g. microbial flora shift or embryo development tests in citrus fruit, have also been considered. All activities concerning the development of identification techniques are now coordinated by the European Communities and by the IAEA. (orig.) [de

  9. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

Full Text Available Data mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information which may be essential to an organization; the extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but very few efforts have been made in the field of criminology, and fewer still have compared the information these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity in society. Criminals can also be predicted based on crime data. The main aim of this work is to survey the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.

  10. Terahertz spectral unmixing based method for identifying gastric cancer

    Science.gov (United States)

    Cao, Yuqi; Huang, Pingjie; Li, Xian; Ge, Weiting; Hou, Dibo; Zhang, Guangxin

    2018-02-01

    At present, many researchers are exploring biological tissue inspection using terahertz time-domain spectroscopy (THz-TDS) techniques. In this study, based on a modified hard modeling factor analysis method, terahertz spectral unmixing was applied to investigate the relationships between the absorption spectra in THz-TDS and certain biomarkers of gastric cancer in order to systematically identify gastric cancer. A probability distribution and box plot were used to extract the distinctive peaks that indicate carcinogenesis, and the corresponding weight distributions were used to discriminate the tissue types. The results of this work indicate that terahertz techniques have the potential to detect different levels of cancer, including benign tumors and polyps.

  11. Data Collection and Analysis Techniques for Evaluating the Perceptual Qualities of Auditory Stimuli

    Energy Technology Data Exchange (ETDEWEB)

    Bonebright, T.L.; Caudell, T.P.; Goldsmith, T.E.; Miner, N.E.

    1998-11-17

    This paper describes a general methodological framework for evaluating the perceptual properties of auditory stimuli. The framework provides analysis techniques that can ensure the effective use of sound for a variety of applications including virtual reality and data sonification systems. Specifically, we discuss data collection techniques for the perceptual qualities of single auditory stimuli including identification tasks, context-based ratings, and attribute ratings. In addition, we present methods for comparing auditory stimuli, such as discrimination tasks, similarity ratings, and sorting tasks. Finally, we discuss statistical techniques that focus on the perceptual relations among stimuli, such as Multidimensional Scaling (MDS) and Pathfinder Analysis. These methods are presented as a starting point for an organized and systematic approach for non-experts in perceptual experimental methods, rather than as a complete manual for performing the statistical techniques and data collection methods. It is our hope that this paper will help foster further interdisciplinary collaboration among perceptual researchers, designers, engineers, and others in the development of effective auditory displays.
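As an example of the MDS step mentioned in this record, the Python sketch below embeds five hypothetical auditory stimuli from an invented dissimilarity matrix using scikit-learn; Pathfinder analysis is not shown.

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical dissimilarity matrix for 5 auditory stimuli, e.g.
# averaged pairwise ratings on a 0-10 scale (0 = identical).
D = np.array([[0, 2, 7, 8, 6],
              [2, 0, 6, 7, 5],
              [7, 6, 0, 3, 4],
              [8, 7, 3, 0, 2],
              [6, 5, 4, 2, 0]], dtype=float)

# Embed the stimuli in 2D so that map distances approximate the rated
# dissimilarities; the axes are then interpreted perceptually.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)

print(f"stress: {mds.stress_:.2f}")   # lower = better fit
for i, (x, y) in enumerate(coords):
    print(f"stimulus {i}: ({x:+.2f}, {y:+.2f})")
```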

  12. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  13. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Daele, Timothy, Van; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration ... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring ... identifiability problems. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort.
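A common numerical route to such identifiability checks is to inspect the local sensitivity matrix of the model output with respect to its parameters: near-collinear columns signal parameters that cannot be estimated separately from the planned experiment. The sketch below applies this idea to a plain Michaelis-Menten rate law as a stand-in (not the Shin and Kim model), with assumed parameter values and substrate range.

```python
import numpy as np

# Michaelis-Menten rate v = Vmax * S / (Km + S): a standard stand-in
# for the enzyme kinetics discussed above (assumed parameter values).
Vmax, Km = 1.0, 0.5
S = np.linspace(0.05, 2.0, 20)          # assumed substrate levels

# Local relative sensitivities of v to each parameter
# (analytical derivatives, scaled by the parameter values).
dv_dVmax = S / (Km + S) * Vmax
dv_dKm = -Vmax * S / (Km + S) ** 2 * Km

sens = np.column_stack([dv_dVmax, dv_dKm])
# Near-collinear columns mean the two parameters cannot be identified
# separately from this experiment; the condition number and the
# column correlation both flag this.
print(f"condition number: {np.linalg.cond(sens):.1f}")
print(f"sensitivity correlation: {np.corrcoef(sens.T)[0, 1]:+.3f}")
```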

  14. An Analysis of the Changes in Communication Techniques in the Italian Codes of Medical Deontology.

    Science.gov (United States)

    Conti, Andrea Alberto

    2017-04-28

    The code of deontology of the Italian National Federation of the Colleges of Physicians, Surgeons and Dentists (FNOMCeO) contains the principles and rules to which the professional medical practitioner must adhere. This work identifies and analyzes the medical-linguistic choices and the expressive techniques present in the different editions of the code, and evaluates their purpose and function, focusing on the first appearance and the subsequent frequency of key terms. Various aspects of the formal and expressive revisions of the eight editions of the Codes of Medical Deontology published after the Second World War (from 1947/48 to 2014) are here presented, starting from a brief comparison with the first edition of 1903. Formal characteristics, choices of medical terminology and the introduction of new concepts and communicative attitudes are here identified and evaluated. This paper, in presenting a quantitative and epistemological analysis of variations, modifications and confirmations in the different editions of the Italian code of medical deontology over the last century, enucleates and demonstrates the dynamic paradigm of changing attitudes in the medical profession. This analysis shows the evolution in medical-scientific communication as embodied in the Italian code of medical deontology. This code, in its adoption, changes and adaptations, as evidenced in its successive editions, bears witness to the expressions and attitudes pertinent to and characteristic of the deontological stance of the medical profession during the twentieth century.

  15. Analysis on expression of gene for flower shape in Dendrobium sonia mutants using differential display technique

    International Nuclear Information System (INIS)

    Affrida Abu Hassan; Ahmad Syazni Kamarudin; Nurul Nadia Aminuddin; Mohd Nazir Basiran

    2004-01-01

In vitro mutagenesis of Dendrobium Sonia at MINT has produced mutants with a wide range of flower form and colour variations. Among the mutants are plants with different flower sizes and shapes. These changes could be caused by alterations in the expression levels of the genes responsible for these characteristics. In this study, the differential display technique was used to identify and analyse altered gene expression at the mRNA level. Total RNA of the control and mutant plants was reverse transcribed using three anchored oligo-dT primers. Subsequently, these cDNAs were PCR-amplified in combination with 16 arbitrary primers. The amplified products were electrophoresed side by side on agarose gels. Differentially expressed bands were isolated for further analysis. (Author)

  16. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    Science.gov (United States)

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

Among other factors, the precision of dental impressions is an important determinant of the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values ranged between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for the different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique had a significant influence.

  17. Analysis and modification of blue sapphires from Rwanda by ion beam techniques

    Science.gov (United States)

    Bootkul, D.; Chaiwai, C.; Tippawan, U.; Wanthanachaisaeng, B.; Intarasiri, S.

    2015-12-01

Blue sapphire belongs to the corundum (Al2O3) group. Gems of this group have always amazed with their beauty and thus command high value. In this study, blue sapphires from Rwanda, which recently entered the Thai gemstone industry, were chosen for investigation. On one hand, we applied Particle Induced X-ray Emission (PIXE), a highly sensitive and precise analytical technique that can identify and quantify trace elements, for chemical analysis of the sapphires. We found that the major element of the blue sapphires from Rwanda is Al, with trace elements such as Fe, Ti, Cr, Ga and Mg, as are commonly found in normal blue sapphire. On the other hand, we applied ion implantation at low and medium levels for color improvement of the sapphires. It appears that the large amount of energy transferred during cascade collisions altered the gem properties. We clearly observed that the blue color of the sapphires intensified after nitrogen ion bombardment; in addition, the gems gained transparency and luster. UV-Vis-NIR measurements detected the modification of their absorption properties, implying an increase in the blue color. The mechanism of these modifications is postulated and reported here. From every point of view, bombardment with a nitrogen ion beam is a promising technique for quality improvement of blue sapphire from Rwanda.

  18. Research review and development trends of human reliability analysis techniques

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Dai Licao

    2011-01-01

Human reliability analysis (HRA) methods are reviewed. The theoretical basis of human reliability analysis, human error mechanisms, the key elements of HRA methods and the existing HRA methods are introduced and assessed. Their shortcomings, current research hotspots and difficult problems are identified. Finally, the trends of human reliability analysis methods are examined. (authors)

  19. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy-based techniques.

  20. Development of hotcell non-destructive examination techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Uhn; Yu, S. C.; Kang, B. S.; Byun, K. S. [Chungbuk National University, Chungju (Korea)

    2002-01-01

The purpose of this project is to establish the non-destructive examination techniques needed to determine the status of spent nuclear fuel rods and/or bundles. Through the project, we will establish image reconstruction tomography, a non-destructive technique, in the hotcell. The tomography technique can be used to identify the 2-dimensional density distribution of fission products in spent fuel rods and/or bundles. And from the results of the measurement and analysis of the magnetic properties of neutron-irradiated material in the pressure vessel and reactor, we will develop techniques to test its hardness and defects. In 2001, the first year, we established the mathematical background and the data and information necessary to develop the techniques. We will try to obtain the experimental results needed for developing the hotcell non-destructive examination techniques in the coming year. 14 refs., 65 figs., 5 tabs. (Author)

  1. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound sediments

    Directory of Open Access Journals (Sweden)

    Morgana Camacho

    2013-04-01

    Full Text Available Parasite findings in sambaquis (shell mounds are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.

  2. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    Science.gov (United States)

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto

    2013-01-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793

  3. Identifying regions of strong scattering at the core-mantle boundary from analysis of PKKP precursor energy

    Science.gov (United States)

    Rost, S.; Earle, P.S.

    2010-01-01

We detect seismic scattering from the core-mantle boundary related to the phase PKKP (PK•KP) in data from small-aperture seismic arrays in India and Canada. The detection of these scattered waves in data from small-aperture arrays is new and allows a better characterization of the fine-scale structure of the deep Earth, especially in the southern hemisphere. Their slowness vector is determined from array processing, allowing location of the heterogeneities at the core-mantle boundary using back-projection techniques through 1D Earth models. We identify strong scattering at the core-mantle boundary (CMB) beneath the Caribbean, Patagonia and the Antarctic Peninsula as well as beneath southern Africa. An analysis of the scattering regions relative to sources and receivers indicates that these regions represent areas of increased scattering, likely due to increased heterogeneity close to the CMB. The 1 Hz array data used in this study are most sensitive to heterogeneity with scale lengths of about 10 km. Given the small size of the scatterers, a chemical origin of the heterogeneities is likely. By comparing the location of the fine-scale heterogeneity to geodynamical models and tomographic images, we identify different scattering mechanisms in regions related to subduction (Caribbean and Patagonia) and dense thermochemical piles (southern Africa). © 2010 Elsevier B.V.

  4. Identification of hierarchy of dynamic domains in proteins: comparison of HDWA and HCCP techniques

    Directory of Open Access Journals (Sweden)

    Yesylevskyy S. O.

    2010-07-01

    Full Text Available Aim. There are several techniques for the identification of hierarchy of dynamic domains in proteins. The goal of this work is to compare systematically two recently developed techniques, HCCP and HDWA,on a set of proteins from diverse structural classes. Methods. HDWA and HCCP techniques are used. The HDWA technique is designed to identify hierarchically organized dynamic domains in proteins using the Molecular Dynamics (MD trajectories, while HCCP utilizes the normal modes of simplified elastic network models. Results. It is shown that the dynamic domains found by HDWA are consistent with the domains identified by HCCP and other techniques. At the same time HDWA identifies flexible mobile loops of proteins correctly, which is hard to achieve with other model-based domain identification techniques. Conclusion. HDWA is shown to be a powerful method of analysis of MD trajectories, which can be used in various areas of protein science.

  5. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium, using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. With certain modifications, this technique is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the flowing solution is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab
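The colorimetric channel of such a system rests on a linear calibration of detector response against standard concentrations. A minimal Python sketch with invented absorbance readings (the instrument constants of the Metrohm 662 are not modeled):

```python
import numpy as np

# Hypothetical calibration: absorbance at 410 nm for U(VI) standards;
# Beer-Lambert gives an approximately linear response in this range.
conc = np.array([50, 100, 150, 200, 250, 300, 350], dtype=float)  # mg/ml
absorbance = np.array([0.071, 0.140, 0.212, 0.278, 0.350, 0.421, 0.489])

# Least-squares calibration line.
slope, intercept = np.polyfit(conc, absorbance, 1)

def u_concentration(a):
    """Invert the calibration line for an unknown sample."""
    return (a - intercept) / slope

# Scatter about the line gives a crude precision estimate.
residuals = absorbance - (slope * conc + intercept)
rsd = residuals.std(ddof=2) / absorbance.mean() * 100
print(f"unknown at A=0.305 -> {u_concentration(0.305):.1f} mg/ml")
print(f"calibration scatter: {rsd:.2f} % of mean signal")
```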

  6. Exploring the potential of data mining techniques for the analysis of accident patterns

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Bekhor, Shlomo; Galtzur, Ayelet

    2010-01-01

Research in road safety faces major challenges: individuation of the most significant determinants of traffic accidents, recognition of the most recurrent accident patterns, and allocation of the resources necessary to address the most relevant issues. This paper intends to comprehend which data mining ... and association rules) data mining techniques are implemented for the analysis of traffic accidents that occurred in Israel between 2001 and 2004. Results show that descriptive techniques are useful for classifying the large number of analyzed accidents, even though they introduce problems with respect to the clear ... importance of input and intermediate neurons, and the relative importance of hundreds of association rules. Further research should investigate whether limiting the analysis to fatal accidents would simplify the task of data mining techniques in recognizing accident patterns without the “noise” probably...

  7. Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics

    Science.gov (United States)

    Chan, Julia Y. K.; Bauer, Christopher F.

    2014-01-01

    The purpose of this study is to identify academically at-risk students in first-semester general chemistry using affective characteristics via cluster analysis. Through the clustering of six preselected affective variables, three distinct affective groups were identified: low (at-risk), medium, and high. Students in the low affective group…

  8. State-of-the-art review of quality assurance techniques for vitrified high level waste

    International Nuclear Information System (INIS)

    Miller, P.L.H.

    1984-07-01

Quality assurance is required for certain chemical and physical properties of both the molten glass pour and the solidified glass within the stainless steel container. It is also required to monitor the physical condition of the container lid weld. A review is presented of techniques which are used, or which might be adapted for use, in the quality assurance of vitrified high level waste. For the most part only non-intrusive methods have been considered; however, some techniques which are not strictly non-intrusive have been reviewed where a non-intrusive technique has not been identified or where there are other advantages associated with the particular technique. In order to identify suitable candidate techniques, reference has been made to an extensive literature survey, and experts in the fields of nuclear waste technology, glass technology, non-destructive testing, chemical analysis and remote analysis have been contacted. The opinions of manufacturers and users of specific techniques have also been sought. A summary is also given of those techniques which can most readily be applied to the problem of quality assurance for vitrified waste, as well as recommendations for further research into techniques which might be adapted to suit this application. (author)

  9. Behavior change techniques implemented in electronic lifestyle activity monitors: a systematic content analysis.

    Science.gov (United States)

    Lyons, Elizabeth J; Lewis, Zakkoyya H; Mayrsohn, Brian G; Rowland, Jennifer L

    2014-08-15

    Electronic activity monitors (such as those manufactured by Fitbit, Jawbone, and Nike) improve on standard pedometers by providing automated feedback and interactive behavior change tools via mobile device or personal computer. These monitors are commercially popular and show promise for use in public health interventions. However, little is known about the content of their feedback applications and how individual monitors may differ from one another. The purpose of this study was to describe the behavior change techniques implemented in commercially available electronic activity monitors. Electronic activity monitors (N=13) were systematically identified and tested by 3 trained coders for at least 1 week each. All monitors measured lifestyle physical activity and provided feedback via an app (computer or mobile). Coding was based on a hierarchical list of 93 behavior change techniques. Further coding of potentially effective techniques and adherence to theory-based recommendations were based on findings from meta-analyses and meta-regressions in the research literature. All monitors provided tools for self-monitoring, feedback, and environmental change by definition. The next most prevalent techniques (13 out of 13 monitors) were goal-setting and emphasizing discrepancy between current and goal behavior. Review of behavioral goals, social support, social comparison, prompts/cues, rewards, and a focus on past success were found in more than half of the systems. The monitors included a range of 5-10 of 14 total techniques identified from the research literature as potentially effective. Most of the monitors included goal-setting, self-monitoring, and feedback content that closely matched recommendations from social cognitive theory. Electronic activity monitors contain a wide range of behavior change techniques typically used in clinical behavioral interventions. Thus, the monitors may represent a medium by which these interventions could be translated for

  10. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  11. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...

  12. Application of the neutron noise analysis technique in nuclear power plants

    International Nuclear Information System (INIS)

    Lescano, Victor H.; Wentzeis, Luis M.

    1999-01-01

Using neutron noise analysis in nuclear power plants, and without producing any perturbation of the normal operation of the plant, information on the vibration state of the reactor internals and on the behavior of the operating conditions of the reactor primary circuit can be obtained. In Argentina, the neutron noise analysis technique is routinely applied in the Atucha I and Embalse nuclear power plants. A database was constructed, and vibration frequencies corresponding to different reactor internals were characterized. Reactor internals with particular mechanical vibrations have been detected and localized. In the framework of a cooperation project between Argentina and Germany, we participated in measurements, analysis and modeling, using the neutron noise technique, at the Obrigheim and Gundremmingen nuclear power plants. In the Obrigheim nuclear power plant (PWR, 350 MWe), correlations between the signals measured from self-powered neutron detectors and accelerometers located inside the reactor core were made. In the Gundremmingen nuclear power plant (BWR, 1200 MWe), we participated in the study of a particular mechanical vibration detected in one of the instrumentation tubes. (author)
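The Obrigheim-style correlation of in-core neutron detectors with accelerometers is typically quantified by the coherence function. A Python sketch with synthetic signals sharing an assumed 8 Hz vibration component (all amplitudes and frequencies are illustrative):

```python
import numpy as np
from scipy.signal import coherence

# Synthetic stand-ins for a neutron detector and an accelerometer:
# both see the same 8 Hz internals vibration plus independent noise.
fs = 100.0
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(3)
vibration = np.sin(2 * np.pi * 8.0 * t)
neutron = 0.7 * vibration + rng.standard_normal(t.size)
accel = 1.2 * vibration + rng.standard_normal(t.size)

# Coherence near 1 at a frequency means the two signals share a common
# source there -- how a vibration peak in the neutron noise can be
# attributed to a mechanically vibrating internal.
f, coh = coherence(neutron, accel, fs=fs, nperseg=512)
print(f"max coherence {coh.max():.2f} at {f[np.argmax(coh)]:.1f} Hz")
```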

  13. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  14. Assessing a new gene expression analysis technique for radiation biodosimetry applications

    Energy Technology Data Exchange (ETDEWEB)

    Manning, Grainne; Kabacik, Sylwia; Finnon, Paul; Paillier, Francois; Bouffler, Simon [Cancer Genetics and Cytogenetics, Biological Effects Department, Centre for Radiation, Chemical and Environmental Hazards, Health Protection Agency, Chilton, Didcot, Oxfordshire OX11 ORQ (United Kingdom); Badie, Christophe, E-mail: christophe.badie@hpa.org.uk [Cancer Genetics and Cytogenetics, Biological Effects Department, Centre for Radiation, Chemical and Environmental Hazards, Health Protection Agency, Chilton, Didcot, Oxfordshire OX11 ORQ (United Kingdom)

    2011-09-15

The response to any radiation accident or incident involving actual or potential ionising radiation exposure requires accurate and rapid assessment of the doses received by individuals. The techniques available today for biodosimetry purposes are not fully adapted to rapid high-throughput measurements of exposures in large numbers of individuals. A recently emerging technique is based on gene expression analysis, as a number of genes are radiation responsive in a dose-dependent manner. The present work aimed to assess a new technique which allows the detection of the expression level of up to 800 genes without the need for enzymatic reactions. To do so, human peripheral blood was exposed ex vivo to a range of x-ray doses from 5 mGy to 4 Gy, and the transcriptional expression of five radiation-responsive genes (PHPT1, PUMA, CCNG1, DDB2 and MDM2) was studied with both the nCounter Digital Analyzer and Multiplex Quantitative Real-Time Polymerase Chain Reaction (MQRT-PCR) as the benchmark technology. Results from both techniques showed good correlation for all genes, with R{sup 2} values ranging between 0.8160 and 0.9754. The reproducibility of the nCounter Digital Analyzer was also assessed in independent biological replicates and proved to be good. Although the slopes of the correlation between the techniques suggest that MQRT-PCR is more sensitive than the nCounter Digital Analyzer, the nCounter Digital Analyzer provides sensitive and reliable data on modifications in gene expression in human blood exposed to radiation without enzymatic amplification of RNA prior to analysis.
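The benchmarking described in this record amounts to fitting a regression line between paired measurements and reading off R{sup 2} and the slope. A minimal Python sketch with invented fold-change values (not the study's data):

```python
import numpy as np

# Illustrative paired measurements of one gene's expression change
# by the two techniques (log2 fold-change per dose point, invented).
mqrt_pcr = np.array([0.10, 0.45, 0.90, 1.60, 2.40, 3.10])
ncounter = np.array([0.08, 0.40, 0.78, 1.35, 2.05, 2.60])

# Least-squares line and coefficient of determination (R^2).
slope, intercept = np.polyfit(mqrt_pcr, ncounter, 1)
pred = slope * mqrt_pcr + intercept
ss_res = np.sum((ncounter - pred) ** 2)
ss_tot = np.sum((ncounter - ncounter.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# A slope below 1 suggests the nCounter responds less steeply to dose,
# i.e. MQRT-PCR is the more sensitive of the two techniques.
print(f"slope = {slope:.2f}, R^2 = {r2:.4f}")
```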

  15. Morphologic characterization and quantitative analysis on in vitro bacteria by nuclear techniques of measurement

    International Nuclear Information System (INIS)

    Lopes, Joana D'Arc Ramos

    2001-10-01

The great difficulty in identifying microorganisms (bacteria) from infectious processes is the time necessary to obtain a reliable result, about 72 hours. The purpose of this work is to establish a faster method for characterizing bacterial morphologies through the use of neutron radiography, which can take about 5 hours. The samples containing the microorganisms, bacteria of different morphologies, after the appropriate microbiological procedures, were incubated with B-10 for 30 minutes and then deposited on a plate of a solid state nuclear track detector (SSNTD), denominated CR-39. To obtain the images of the bacteria, the detector was exposed to the thermal neutron flux, of the order of 2.2 x 10{sup 5} n/cm{sup 2}.s, of the J-9 channel of the Argonauta reactor (IEN/CNEN). To observe the images of the bacteria in each sample under an optical microscope, the sheets were chemically developed. The analysis of the images revealed morphologic differences among the genera (Gram-positive from Gram-negative and cocci from bacilli), in samples containing either isolated or mixed bacteria. We thus verified the viability of the technique for the morphological characterization of different microorganisms. A quantitative approach also seemed to be feasible with the technique. The whole process took about 2 hours. (author)

  16. Methods to identify the unexplored diversity of microbial exopolysaccharides.

    Science.gov (United States)

    Rühmann, Broder; Schmid, Jochen; Sieber, Volker

    2015-01-01

Microbial exopolysaccharides (EPS) are a structurally very diverse class of molecules. A number of them have found application in rather diverging fields that extend from medicine, food, and cosmetics on the one side to construction, drilling, and the chemical industry on the other. The analysis of microbial strains for their competence in polysaccharide production has therefore been a major issue in the past, especially in the search for new polysaccharide variants among natural strain isolates. Given that nearly all microbes carry the genetic equipment for the production of polysaccharides under specific conditions, the naturally provided EPS portfolio seems to be still massively underexplored. Therefore, there is a need for high-throughput screening techniques capable of identifying novel variants of bacterial EPS with properties superior to the already described ones, or even totally new ones. A great variety of techniques has been used in screening approaches for identifying microorganisms that produce EPS in substantial amounts. Mucoid growth is often the method of choice for visual identification of EPS-producing strains. Depending on the thickening characteristics of the polysaccharide, observation of viscosity in culture broth can also be an option for evaluating EPS production. Precipitation with different alcohols represents a common detection, isolation, and purification method for many EPS. A more quantitative approach is found in total carbohydrate content analysis, normally determined, e.g., by the phenol-sulfuric acid method. In addition, a new and reliable method has recently become available for the detailed analysis of the monomeric composition and the presence of rare sugars and sugar substitutions, which can give a first hint of the polymer structure of an unknown EPS. This minireview compares available methods and novel techniques and discusses their benefits and disadvantages.

  17. Methods to identify the unexplored diversity of microbial exopolysaccharides

    Directory of Open Access Journals (Sweden)

    Broder Rühmann

    2015-06-01

    Full Text Available Microbial exopolysaccharides (EPS) are a structurally very diverse class of molecules. A number of them have found their application in rather diverging fields that extend from medicine, food and cosmetics on the one side to construction, drilling and chemical industry on the other side. The analysis of microbial strains for their competence in polysaccharide production has therefore been a major issue in the past, especially in the search for new polysaccharide variants among natural strain isolates. Given that nearly all microbes carry the genetic equipment for the production of polysaccharides under specific conditions, the naturally provided EPS portfolio still seems to be massively underexplored. Therefore, there is a need for high-throughput screening techniques capable of identifying novel variants of bacterial exopolysaccharides with properties superior to the already described ones, or even totally new ones. A great variety of different techniques has been used in screening approaches for identifying microorganisms that produce EPS in substantial amounts. Mucoid growth is often the method of choice for visual identification of EPS-producing strains. Depending on the thickening characteristics of the polysaccharide, observation of viscosity in culture broth can also be an option to evaluate EPS production. Precipitation with different alcohols represents a common detection, isolation and purification method for many EPS. A more quantitative approach is found in total carbohydrate content analysis, normally determined, e.g., by the phenol-sulfuric acid method. In addition, recently a new and reliable method for the detailed analysis of the monomeric composition and the presence of rare sugars and sugar substitutions has become available, which could give a first hint of the polymer structure of unknown EPS. This minireview will compare available methods and novel techniques and discuss their benefits and disadvantages.

  18. Analysis of Atorvastatin in Commercial Solid Drugs using the TT-PIGE Technique

    International Nuclear Information System (INIS)

    Younes, G; Zahraman, K; Nsouli, B; Bejjani, A; Mahmoud, R; El-Yazbi, F

    2008-01-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. This is usually performed by using wet chemical techniques like LC-MS, UV spectrophotometry and other appropriate organic analytical methods. Where the active ingredient contains specific heteroatoms (F, S, Cl), elemental IBA techniques can be exploited for molecular quantification. IBA techniques permit analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the Thick Target PIGE technique for rapid and accurate quantification of low-concentration Atorvastatin™ in three commercial anti-hyperlipidemic drugs (Lipitor, Liponorm and Storvas). (author)

  19. Analysis of Atorvastatin in Commercial Solid Drugs using the TT-PIGE Technique

    Energy Technology Data Exchange (ETDEWEB)

    Younes, G [Beirut Arab University, Faculty of Science, Chemistry Department Beirut (Lebanon); Zahraman, K; Nsouli, B; Bejjani, A [Lebanese Atomic Energy Commission, National Council for Scientific Research, Beirut (Lebanon); Mahmoud, R; El-Yazbi, F [Beirut Arab University, Faculty of Pharmacy, Department of Pharmaceutical and Analytical Chemistry, Beirut (Lebanon)

    2008-07-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. This is usually performed by using wet chemical techniques like LC-MS, UV spectrophotometry and other appropriate organic analytical methods. Where the active ingredient contains specific heteroatoms (F, S, Cl), elemental IBA techniques can be exploited for molecular quantification. IBA techniques permit analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the Thick Target PIGE technique for rapid and accurate quantification of low-concentration Atorvastatin™ in three commercial anti-hyperlipidemic drugs (Lipitor, Liponorm and Storvas). (author)
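
    At its core, the TT-PIGE quantification in these two records compares the characteristic gamma yield of the sample against that of a standard of known concentration. A minimal sketch of that ratio arithmetic, assuming a linear yield-to-concentration relation and charge-normalized counts (all numbers invented):

    ```python
    # Hedged sketch: relative PIGE quantification against a known standard.
    def ai_concentration(yield_sample: float, yield_standard: float,
                         conc_standard_pct: float) -> float:
        """Estimate active-ingredient content by ratio to a known standard."""
        return conc_standard_pct * (yield_sample / yield_standard)

    # e.g. gamma counts per unit of accumulated beam charge (illustrative)
    print(f"{ai_concentration(1520.0, 3050.0, 10.0):.2f} % active ingredient")
    ```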

  20. The development of human factors technologies -The development of human behaviour analysis techniques-

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has the objectives of developing SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. This year we studied the following. For SACOM development: (1) site investigation of operator tasks; (2) development and revision of the operator task micro-structure; (3) development of knowledge representation software and a SACOM prototype; (4) development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors. For human error analysis and application techniques: (1) classification of error shaping factors (ESFs) and development of software for ESF evaluation; (2) analysis of human error occurrences and revision of the analysis procedure; (3) an experiment for human error data collection using a compact nuclear simulator; (4) development of a prototype database system of the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author).

  1. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor of patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week who underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal time added to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy and quick screening option for trochlear dysplasia in newborns.

  2. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Kohlhof, Hendrik; Heidt, Christoph; Bähler, Alexandrine; Kohl, Sandro; Gravius, Sascha; Friedrich, Max J.; Ziebarth, Kai; Stranzinger, Enno

    2015-01-01

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor of patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week who underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal time added to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy and quick screening option for trochlear dysplasia in newborns.
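
    The sulcus angle reported in both records is plain vector geometry once the medial facet peak, the deepest groove point, and the lateral facet peak have been digitized on the standardized axial plane. A sketch with made-up landmark coordinates:

    ```python
    # Illustrative geometry only: landmark coordinates (mm) on the axial plane
    # are invented; the study's measurements came from 3D ultrasound reformats.
    import numpy as np

    def sulcus_angle(medial_peak, groove, lateral_peak):
        """Angle (degrees) at the trochlear groove between the two facet peaks."""
        v1 = np.asarray(medial_peak, float) - np.asarray(groove, float)
        v2 = np.asarray(lateral_peak, float) - np.asarray(groove, float)
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    print(f"sulcus angle = {sulcus_angle((-20, 4), (0, 0), (22, 6)):.1f} deg")
    ```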

  3. A borax fusion technique for quantitative X-ray fluorescence analysis

    NARCIS (Netherlands)

    van Willigen, J.H.H.G.; Kruidhof, H.; Dahmen, E.A.M.F.

    1971-01-01

    A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the “nonwetting” properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux and finally on the favourable form of the casting mould.

  4. Charting the trends in nuclear techniques for analysis of inorganic environmental pollutants

    International Nuclear Information System (INIS)

    Braun, T.

    1986-01-01

    Publications in Analytical Abstracts in the period 1975-1984 and papers presented at the Modern Trends in Activation Analysis international conference series in the period 1961-1986 have been used as an empirical basis for assessing general trends in research and publication activity. Some ebbs and flows in the speciality of instrumental techniques for the analysis of environmental trace pollutants are revealed by a statistical analysis of the publications. (author)

  5. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques that measure eye movement with infrared cameras (VNG). Most techniques for quantifying caloric-induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems, the relevance being that if there were a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and the data were compared. All four systems made similar measurements, but our inexperienced assessor failed to recognize responses as sporadic or scant; we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  6. New X-Ray Technique to Characterize Nanoscale Precipitates in Aged Aluminum Alloys

    Science.gov (United States)

    Sitdikov, V. D.; Murashkin, M. Yu.; Valiev, R. Z.

    2017-10-01

    This paper puts forward a new technique for the measurement of x-ray patterns, which makes it possible to identify and quantify precipitates (nanoscale phases) in metallic alloys of the matrix type. The minimum detection limit of precipitates in the matrix of the base material provided by this technique is as low as 1%. The identification of precipitates in x-ray patterns and their analysis are implemented in a transmission mode with a larger irradiated area, longer counting time and higher diffractometer resolution as compared to the conventional reflection mode. The presented technique has been successfully employed to identify and quantitatively describe precipitates formed in an Al alloy of the Al-Mg-Si system as a result of artificial aging. For the first time, x-ray phase analysis has been used to identify and measure precipitates formed during the alloy's artificial aging.

  7. The Art of Athlete Leadership: Identifying High-Quality Athlete Leadership at the Individual and Team Level Through Social Network Analysis.

    Science.gov (United States)

    Fransen, Katrien; Van Puyenbroeck, Stef; Loughead, Todd M; Vanbeselaere, Norbert; De Cuyper, Bert; Vande Broek, Gert; Boen, Filip

    2015-06-01

    This research aimed to introduce social network analysis as a novel technique in sports teams to identify the attributes of high-quality athlete leadership, both at the individual and at the team level. Study 1 included 25 sports teams (N = 308 athletes) and focused on athletes' general leadership quality. Study 2 comprised 21 sports teams (N = 267 athletes) and focused on athletes' specific leadership quality as a task, motivational, social, and external leader. The extent to which athletes felt connected with their leader proved to be most predictive of athletes' perceptions of that leader's quality in each leadership role. Also at the team level, teams with higher athlete leadership quality were more strongly connected. We conclude that social network analysis constitutes a valuable tool to provide more insight into the attributes of high-quality leadership both at the individual and at the team level.
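
    The network idea is straightforward to prototype: each athlete rates how connected they feel to each teammate, the ratings become weighted directed edges, and a centrality measure serves as a proxy for perceived leadership quality. A toy sketch with networkx (names and ratings are invented; the study's actual indicators were more elaborate):

    ```python
    # Toy sketch: connectedness ratings as weighted directed edges; weighted
    # indegree stands in for perceived leadership quality. Data are invented.
    import networkx as nx

    ratings = [  # (rater, ratee, felt connectedness on a 1-5 scale)
        ("ana", "eva", 5), ("ben", "eva", 4), ("cas", "eva", 5),
        ("eva", "ana", 3), ("ben", "ana", 2), ("cas", "ben", 3),
    ]
    G = nx.DiGraph()
    G.add_weighted_edges_from(ratings)

    leadership = dict(G.in_degree(weight="weight"))
    best = max(leadership, key=leadership.get)
    print(best, leadership)  # 'eva' has the strongest incoming connections
    ```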

  8. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, 'A Technique for Human Error Analysis' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  9. Rice Transcriptome Analysis to Identify Possible Herbicide Quinclorac Detoxification Genes

    Directory of Open Access Journals (Sweden)

    Wenying Xu

    2015-09-01

    Full Text Available Quinclorac is a highly selective auxin-type herbicide and is widely used in the effective control of barnyard grass in paddy rice fields, improving the world's rice yield. A herbicide mode of action of quinclorac has been proposed, and hormone interactions affect quinclorac signaling. Because of widespread use, quinclorac may be transported outside rice fields with the drainage waters, leading to soil and water pollution and environmental health problems. In this study, we used the 57K Affymetrix rice whole-genome array to identify quinclorac signaling response genes to study the molecular mechanisms of action and detoxification of quinclorac in rice plants. Overall, 637 probe sets were identified with differential expression levels under either 6 or 24 h of quinclorac treatment. Auxin-related genes such as GH3 and OsIAAs responded to quinclorac treatment. Gene Ontology analysis showed that detoxification-related gene families were significantly enriched, including cytochrome P450, GST, UGT, and ABC and drug transporter genes. Moreover, real-time RT-PCR analysis showed that top candidate P450 families such as the CYP81, CYP709C and CYP72A genes were universally induced by different herbicides. Some Arabidopsis genes of the same P450 families were up-regulated under quinclorac treatment. We conducted rice whole-genome GeneChip analysis and the first global identification of quinclorac response genes. This work may provide potential markers for the detoxification of quinclorac and biomonitors of environmental chemical pollution.

  10. AcuI identifies water buffalo CSN3 genotypes by RFLP analysis

    Indian Academy of Sciences (India)

    Soheir M. El Nahas; Ahlam A. Abou Mossallam. Journal of Genetics, Volume 93, Online resources, 2014, pp. e94-e96.

  11. Multiscale analysis of damage using dual and primal domain decomposition techniques

    NARCIS (Netherlands)

    Lloberas-Valls, O.; Everdij, F.P.X.; Rixen, D.J.; Simone, A.; Sluys, L.J.

    2014-01-01

    In this contribution, dual and primal domain decomposition techniques are studied for the multiscale analysis of failure in quasi-brittle materials. The multiscale strategy essentially consists in decomposing the structure into a number of nonoverlapping domains and considering a refined spatial

  12. Large-scale association analysis identifies 13 new susceptibility loci for coronary artery disease

    NARCIS (Netherlands)

    Schunkert, Heribert; König, Inke R.; Kathiresan, Sekar; Reilly, Muredach P.; Assimes, Themistocles L.; Holm, Hilma; Preuss, Michael; Stewart, Alexandre F. R.; Barbalic, Maja; Gieger, Christian; Absher, Devin; Aherrahrou, Zouhair; Allayee, Hooman; Altshuler, David; Anand, Sonia S.; Andersen, Karl; Anderson, Jeffrey L.; Ardissino, Diego; Ball, Stephen G.; Balmforth, Anthony J.; Barnes, Timothy A.; Becker, Diane M.; Becker, Lewis C.; Berger, Klaus; Bis, Joshua C.; Boekholdt, S. Matthijs; Boerwinkle, Eric; Braund, Peter S.; Brown, Morris J.; Burnett, Mary Susan; Buysschaert, Ian; Carlquist, John F.; Chen, Li; Cichon, Sven; Codd, Veryan; Davies, Robert W.; Dedoussis, George; Dehghan, Abbas; Demissie, Serkalem; Devaney, Joseph M.; Diemert, Patrick; Do, Ron; Doering, Angela; Eifert, Sandra; Mokhtari, Nour Eddine El; Ellis, Stephen G.; Elosua, Roberto; Engert, James C.; Epstein, Stephen E.; de Faire, Ulf; Fischer, Marcus; Folsom, Aaron R.; Freyer, Jennifer; Gigante, Bruna; Girelli, Domenico; Gretarsdottir, Solveig; Gudnason, Vilmundur; Gulcher, Jeffrey R.; Halperin, Eran; Hammond, Naomi; Hazen, Stanley L.; Hofman, Albert; Horne, Benjamin D.; Illig, Thomas; Iribarren, Carlos; Jones, Gregory T.; Jukema, J. Wouter; Kaiser, Michael A.; Kaplan, Lee M.; Kastelein, John J. P.; Khaw, Kay-Tee; Knowles, Joshua W.; Kolovou, Genovefa; Kong, Augustine; Laaksonen, Reijo; Lambrechts, Diether; Leander, Karin; Lettre, Guillaume; Li, Mingyao; Lieb, Wolfgang; Loley, Christina; Lotery, Andrew J.; Mannucci, Pier M.; Maouche, Seraya; Martinelli, Nicola; McKeown, Pascal P.; Meisinger, Christa; Meitinger, Thomas; Melander, Olle; Merlini, Pier Angelica; Mooser, Vincent; Morgan, Thomas; Mühleisen, Thomas W.; Muhlestein, Joseph B.; Münzel, Thomas; Musunuru, Kiran; Nahrstaedt, Janja; Nelson, Christopher P.; Nöthen, Markus M.; Olivieri, Oliviero; Patel, Riyaz S.; Patterson, Chris C.; Peters, Annette; Peyvandi, Flora; Qu, Liming; Quyyumi, Arshed A.; Rader, Daniel J.; Rallidis, Loukianos S.; Rice, Catherine; Rosendaal, Frits R.; Rubin, Diana; Salomaa, Veikko; Sampietro, M. Lourdes; Sandhu, Manj S.; Schadt, Eric; Schäfer, Arne; Schillert, Arne; Schreiber, Stefan; Schrezenmeir, Jürgen; Schwartz, Stephen M.; Siscovick, David S.; Sivananthan, Mohan; Sivapalaratnam, Suthesh; Smith, Albert; Smith, Tamara B.; Snoep, Jaapjan D.; Soranzo, Nicole; Spertus, John A.; Stark, Klaus; Stirrups, Kathy; Stoll, Monika; Tang, W. H. Wilson; Tennstedt, Stephanie; Thorgeirsson, Gudmundur; Thorleifsson, Gudmar; Tomaszewski, Maciej; Uitterlinden, Andre G.; van Rij, Andre M.; Voight, Benjamin F.; Wareham, Nick J.; Wells, George A.; Wichmann, H.-Erich; Wild, Philipp S.; Willenborg, Christina; Witteman, Jaqueline C. M.; Wright, Benjamin J.; Ye, Shu; Zeller, Tanja; Ziegler, Andreas; Cambien, Francois; Goodall, Alison H.; Cupples, L. Adrienne; Quertermous, Thomas; März, Winfried; Hengstenberg, Christian; Blankenberg, Stefan; Ouwehand, Willem H.; Hall, Alistair S.; Deloukas, Panos; Thompson, John R.; Stefansson, Kari; Roberts, Robert; Thorsteinsdottir, Unnur; O'Donnell, Christopher J.; McPherson, Ruth; Erdmann, Jeanette; Samani, Nilesh J.

    2011-01-01

    We performed a meta-analysis of 14 genome-wide association studies of coronary artery disease (CAD) comprising 22,233 individuals with CAD (cases) and 64,762 controls of European descent, followed by genotyping of top association signals in 56,682 additional individuals. This analysis identified 13 new susceptibility loci for CAD.

  13. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.
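
    The battery of pattern tests enumerated above maps directly onto standard scipy.stats routines. A sketch on synthetic scatterplot data, showing tests (i)-(iii); the variability and chi-square tests follow the same grid-based pattern:

    ```python
    # Synthetic sensitivity-analysis data: one sampled input x, one output y.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 1.0, 300)
    y = 2.0 * x + rng.normal(0.0, 0.5, 300)       # monotonic trend plus noise

    r, p_r = stats.pearsonr(x, y)                 # (i) linear relationship
    rho, p_rho = stats.spearmanr(x, y)            # (ii) monotonic relationship

    edges = np.quantile(x, [0.0, 1 / 3, 2 / 3, 1.0])
    groups = [y[(x >= lo) & (x <= hi)] for lo, hi in zip(edges[:-1], edges[1:])]
    h, p_h = stats.kruskal(*groups)               # (iii) trend in central tendency

    print(f"pearson r={r:.2f} (p={p_r:.1e}); spearman rho={rho:.2f}; "
          f"kruskal H={h:.1f} (p={p_h:.1e})")
    ```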

  14. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. For problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that is workable with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  15. Mutation analysis with random DNA identifiers (MARDI) catalogs Pig-a mutations in heterogeneous pools of CD48-deficient T cells derived from DMBA-treated rats.

    Science.gov (United States)

    Revollo, Javier R; Crabtree, Nathaniel M; Pearce, Mason G; Pacheco-Martinez, M Monserrat; Dobrovolsky, Vasily N

    2016-03-01

    Identification of mutations induced by xenotoxins is a common task in the field of genetic toxicology. Mutations are often detected by clonally expanding potential mutant cells and genotyping each viable clone by Sanger sequencing. Such a "clone-by-clone" approach requires significant time and effort, and sometimes is even impossible to implement. Alternative techniques for efficient mutation identification would greatly benefit both basic and regulatory genetic toxicology research. Here, we report the development of Mutation Analysis with Random DNA Identifiers (MARDI), a novel high-fidelity Next Generation Sequencing (NGS) approach that circumvents clonal expansion and directly catalogs mutations in pools of mutant cells. MARDI uses oligonucleotides carrying Random DNA Identifiers (RDIs) to tag progenitor DNA molecules before PCR amplification, enabling clustering of descendant DNA molecules and eliminating NGS- and PCR-induced sequencing artifacts. When applied to the Pig-a cDNA analysis of heterogeneous pools of CD48-deficient T cells derived from DMBA-treated rats, MARDI detected nearly all Pig-a mutations that were previously identified by conventional clone-by-clone analysis and discovered many additional ones consistent with DMBA exposure: mostly A to T transversions, with the mutated A located on the non-transcribed DNA strand. © 2015 Wiley Periodicals, Inc.
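
    The RDI principle is easy to illustrate: reads carrying the same random tag descend from one progenitor molecule, so a per-position majority vote within each tag family cancels isolated PCR or sequencing errors. A conceptual toy sketch (not the published pipeline):

    ```python
    # Toy illustration of tag-family consensus; real MARDI handles alignment,
    # quality filtering, and much deeper families.
    from collections import Counter, defaultdict

    reads = [  # (RDI tag, read) -- last base of the third read is a PCR error
        ("ACGT", "ATGGCA"), ("ACGT", "ATGGCA"), ("ACGT", "ATGGCT"),
        ("TTAG", "ATGACA"), ("TTAG", "ATGACA"),
    ]

    families = defaultdict(list)
    for rdi, seq in reads:
        families[rdi].append(seq)

    for rdi, seqs in families.items():
        consensus = "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))
        print(rdi, consensus)   # the isolated error is voted out
    ```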

  16. Rapid analysis of molybdenum contents in molybdenum master alloys by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Tongkong, P.

    1985-01-01

    Determination of the molybdenum content in molybdenum master alloys has been performed using the energy-dispersive x-ray fluorescence (EDX) technique, with analyses made via standard additions and calibration curves. Comparison of the EDX technique with other analytical techniques, i.e., wavelength-dispersive x-ray fluorescence, neutron activation analysis and inductively coupled plasma spectrometry, showed consistency in the results. This technique was found to yield reliable results when molybdenum contents in master alloys were in the range of 13 to 50 percent, using an HPGe detector or a proportional counter. When the required error was set at 1%, the minimum analyzing time was found to be 30 and 60 seconds for Fe-Mo master alloys with molybdenum contents of 13.54 and 49.09 percent, respectively. For Al-Mo master alloys, the minimum times required were 120 and 300 seconds for molybdenum contents of 15.22 and 47.26 percent, respectively.
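
    The standard-additions evaluation mentioned in the record amounts to spiking the sample with known amounts of Mo, fitting intensity against added concentration, and extrapolating back to zero intensity. A sketch with invented counts:

    ```python
    # Invented counts: fit intensity vs. added Mo and extrapolate to zero
    # intensity; the x-intercept magnitude is the original Mo content.
    import numpy as np

    added_pct = np.array([0.0, 5.0, 10.0, 15.0])        # Mo spiked into sample
    intensity = np.array([420.0, 575.0, 731.0, 889.0])  # fluorescence counts

    slope, intercept = np.polyfit(added_pct, intensity, 1)
    print(f"Mo content ~ {intercept / slope:.1f} %")    # ~13.4 % here
    ```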

  17. Analysis of archaeological pieces with nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Tenorio, D [Instituto Nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2002-07-01

    In this work, nuclear techniques such as neutron activation analysis, PIXE, x-ray fluorescence analysis, metallography, uranium series dating and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  18. DESIGN & ANALYSIS TOOLS AND TECHNIQUES FOR AEROSPACE STRUCTURES IN A 3D VISUAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Radu BISCA

    2009-09-01

    Full Text Available The main objective of this project is to develop a set of tools and to integrate techniques in a software package built on structural analysis applications, based on Romanian engineers' experience in designing and analysing aerospace structures, consolidated with the most recent methods and techniques. The applications automate the structural design and analysis processes and facilitate the exchange of technical information between the partners involved in a complex aerospace project without limiting the domain.

  19. Micro-computed tomography and bond strength analysis of different root canal filling techniques

    Directory of Open Access Journals (Sweden)

    Juliane Nhata

    2014-01-01

    Full Text Available Introduction: The aim of this study was to evaluate the quality and bond strength of three root filling techniques (lateral compaction, continuous wave of condensation, and Tagger's hybrid technique [THT]) using micro-computed tomography (CT) images and push-out tests, respectively. Materials and Methods: Thirty mandibular incisors were prepared using the same protocol and randomly divided into three groups (n = 10): lateral condensation technique (LCT), continuous wave of condensation technique (CWCT), and THT. All specimens were filled with gutta-percha (GP) cones and AH Plus sealer. Five specimens of each group were randomly chosen for micro-CT analysis, and all of them were sectioned into 1 mm slices and subjected to push-out tests. Results: Micro-CT analysis revealed fewer empty spaces when GP was heated within the root canals in CWCT and THT when compared to LCT. Push-out tests showed that LCT and THT had a significantly higher displacement resistance (P < 0.05) when compared to CWCT. Bond strength was lower in the apical and middle thirds than in the coronal thirds. Conclusions: It can be concluded that LCT and THT were associated with higher bond strengths to intraradicular dentine than CWCT. However, LCT was associated with more empty voids than the other techniques.

  20. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    Science.gov (United States)

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
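
    For context, the classical sectional-image relation that such techniques refine: a cylindrical fiber cut by a plane appears as an ellipse, and its inclination to the cutting-plane normal follows from the axis ratio, cos θ = b/a. A minimal sketch of that textbook relation (not the paper's improved algorithm):

    ```python
    # Textbook relation, not the paper's improved algorithm: a cut fiber shows
    # an ellipse whose axis ratio encodes inclination, cos(theta) = b / a.
    import math

    def fiber_inclination_deg(major_axis: float, minor_axis: float) -> float:
        """Angle between the fiber and the normal of the cutting plane."""
        ratio = min(minor_axis / major_axis, 1.0)
        return math.degrees(math.acos(ratio))

    for a, b in [(0.2, 0.2), (0.4, 0.2), (0.8, 0.2)]:   # axes in mm
        print(f"a={a}, b={b} -> theta = {fiber_inclination_deg(a, b):.1f} deg")
    ```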

  1. A proteomic analysis identifies candidate early biomarkers to predict ovarian hyperstimulation syndrome in polycystic ovarian syndrome patients.

    Science.gov (United States)

    Wu, Lan; Sun, Yazhou; Wan, Jun; Luan, Ting; Cheng, Qing; Tan, Yong

    2017-07-01

    Ovarian hyperstimulation syndrome (OHSS) is a potentially life-threatening, iatrogenic complication that occurs during assisted reproduction. Polycystic ovarian syndrome (PCOS) significantly increases the risk of OHSS during controlled ovarian stimulation. Therefore, a more effective early prediction technique is required in PCOS patients. Quantitative proteomic analysis of serum proteins indicates the potential diagnostic value for disease. In the present study, the authors identified the differentially expressed proteins in OHSS patients with PCOS as new diagnostic biomarkers. The promising proteins obtained from liquid chromatography-mass spectrometry were subjected to ELISA and western blotting assays for further confirmation. A total of 57 proteins were identified with significant differences, of which 29 proteins were upregulated and 28 proteins were downregulated in OHSS patients. Haptoglobin, fibrinogen and lipoprotein lipase were selected as candidate biomarkers. Receiver operating characteristic curve analysis demonstrated that all three proteins may have potential as biomarkers to discriminate OHSS in PCOS patients. Haptoglobin, fibrinogen and lipoprotein lipase have never been reported as predictive markers of OHSS in PCOS patients, and their potential roles in OHSS occurrence deserve further study. The proteomic results reported in the present study may provide deeper insights into the pathophysiology of OHSS.
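
    The ROC evaluation in the record can be sketched with scikit-learn; the labels and serum levels below are synthetic stand-ins, and the sign flip encodes the assumption that the marker runs lower in affected patients:

    ```python
    # Synthetic stand-ins: 1 = OHSS; haptoglobin assumed lower in cases, so
    # the score is negated before computing the ROC.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    y_true = np.array([0, 0, 0, 0, 1, 1, 1, 0, 1, 1])
    haptoglobin = np.array([1.1, 0.9, 0.5, 1.0, 0.5,
                            0.4, 0.6, 1.2, 0.3, 0.7])  # g/L, illustrative

    auc = roc_auc_score(y_true, -haptoglobin)
    print(f"AUC = {auc:.2f}")   # ~0.90 with these toy values
    ```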

  2. Lightweight and Statistical Techniques for Petascale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or used tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems were purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root cause.
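
    The core STAT-style reduction can be conveyed in a few lines: group tasks by an equivalence signature of their call stacks so a traditional debugger need only attach to one representative per behavior class. A conceptual sketch with toy data, not the tool's implementation:

    ```python
    # Group MPI ranks by identical call stacks; one representative per class
    # is enough for a traditional debugger. Traces below are invented.
    from collections import defaultdict

    traces = {  # rank -> call stack (innermost frame last)
        0: ("main", "solve", "mpi_waitall"),
        1: ("main", "solve", "mpi_waitall"),
        2: ("main", "solve", "compute_flux"),
        3: ("main", "io_dump", "write_hdf5"),
    }

    classes = defaultdict(list)
    for rank, stack in traces.items():
        classes[stack].append(rank)

    for stack, ranks in classes.items():
        print(f"ranks {ranks}: {' > '.join(stack)}")
    ```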

  3. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    Science.gov (United States)

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and the Tukey test (α = .05). The one-piece cast frameworks presented significantly higher vertical misfit values than those found for the framework cemented on prepared abutments and laser welding techniques (P < .05). Laser welding and cementing the framework on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.
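
    The statistical comparison named in the record (one-way ANOVA followed by Tukey's test) is a few lines in SciPy; the misfit values below are invented for illustration, and scipy.stats.tukey_hsd requires SciPy 1.8 or newer:

    ```python
    # Invented vertical misfit values (micrometers), ten per technique.
    from scipy import stats

    one_piece = [98, 112, 105, 120, 101, 93, 108, 115, 99, 110]
    cemented = [41, 38, 45, 36, 50, 44, 39, 47, 42, 40]
    welded = [44, 39, 48, 37, 46, 43, 41, 45, 40, 42]

    f, p = stats.f_oneway(one_piece, cemented, welded)
    print(f"ANOVA: F = {f:.1f}, p = {p:.2e}")

    print(stats.tukey_hsd(one_piece, cemented, welded))  # pairwise comparisons
    ```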

  4. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis by automation and easy data handling, in addition to providing the accuracy required by quality control and research applications.
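
    The physics behind gravitational sedimentation sizing is Stokes' law: the measured settling velocity fixes an equivalent spherical diameter. A worked sketch for quartz-like particles in water (the default densities and viscosity are assumptions):

    ```python
    # Stokes' law sizing: solve d from v = (rho_p - rho_f) * g * d^2 / (18 * eta).
    import math

    def stokes_diameter(v, rho_p=2650.0, rho_f=1000.0, eta=1.0e-3, g=9.81):
        """Diameter (m) of a sphere settling at velocity v (m/s) in a fluid."""
        return math.sqrt(18.0 * eta * v / ((rho_p - rho_f) * g))

    v = 0.05 / 600.0           # particle fell 5 cm in 10 minutes
    d = stokes_diameter(v)     # quartz-in-water defaults
    print(f"equivalent diameter ~ {d * 1e6:.1f} um")   # ~9.6 um
    ```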

  5. What's down below? Current and potential future applications of geophysical techniques to identify subsurface permafrost conditions (Invited)

    Science.gov (United States)

    Douglas, T. A.; Bjella, K.; Campbell, S. W.

    2013-12-01

    For infrastructure design, operations, and maintenance requirements in the North, the ability to accurately and efficiently detect the presence (or absence) of ground ice in permafrost terrains is a serious challenge. Ground ice features including ice wedges, thermokarst cave-ice, and segregation ice are present at a variety of spatial scales and patterns. Currently, most engineering applications use borehole logging and sampling to extrapolate conditions at the point scale. However, there is a high risk of over- or underestimating the presence of frozen or unfrozen features when relying on borehole information alone. In addition, boreholes are costly, especially for planning linear structures like roads or runways. Predicted climate warming will provide further challenges for infrastructure development and transportation operations where permafrost degradation occurs. Accurately identifying the subsurface character of permafrost terrains will allow engineers and planners to cost-effectively create novel infrastructure designs to withstand the changing environment. There is thus a great need for a low-cost, rapidly deployable, spatially extensive means of 'measuring' subsurface conditions. Geophysical measurements, both terrestrial and airborne, have strong potential to revolutionize our way of mapping subsurface conditions. Many studies in continuous and discontinuous permafrost have used geophysical measurements to identify discrete features and repeatable patterns in the subsurface. The most common measurements include galvanic and capacitively coupled resistivity, ground penetrating radar, and multi-frequency electromagnetic induction techniques. Each of these measurements has strengths, weaknesses, and limitations. By combining horizontal geophysical measurements, downhole geophysics, multispectral remote sensing images, LiDAR measurements, and soil and vegetation mapping we can start to assemble a holistic view of how surface conditions and standoff measurements

  6. The potential of dielectric analysis as an on-line cure monitoring technique in the manufacture of advanced fibre reinforced composites

    International Nuclear Information System (INIS)

    McIlhagger, A.T.

    2002-02-01

    Composite manufacturing processes such as RTM are being developed in the aerospace industry in order to promote advanced fibre reinforced composites and reduce their cost. The aerospace industry has identified the need for a cure monitoring system to be utilised in this production, to improve the efficiency and reliability of processing. The system must be able to determine through-thickness properties of the composite, on-line and without affecting the integrity of the finished component. The literature has shown that a number of techniques are available, but these do not address all of the requirements of industry. The most important process parameters in RTM are the resin flow, point of minimum viscosity, gelation and subsequent completion of cure. These 'key cure parameters' are often difficult to control accurately in the manufacturing environment. Although dielectric analysis has been around for many years, the literature identified an urgent requirement for research on the interpretation of dielectric sensor data relating to these main process parameters. A dielectric laboratory instrument, operated in the parallel plate sensor configuration, was utilised to simulate a standard RTM cure cycle. The important transitions in the resin, namely minimum viscosity, gelation, vitrification and completion of cure, were identified. The parallel plate dielectric technique was applied to composites containing conductive and non-conductive reinforcement fibres. The appropriate dielectric signals and frequency were determined based on the sensor configuration, insulating layer and resin/fabric type. Correlations have been demonstrated between dielectric results and other established thermal (DSC and DMA) and mechanical test techniques (tensile, flexural and interlaminar shear). Test methods were designed and investigated to provide information to compare with dielectric data. The parallel plate configuration was used to investigate the effect of composite thickness variation on

  7. Nonlinear analysis techniques for use in the assessment of high-level waste storage tank structures

    International Nuclear Information System (INIS)

    Moore, C.J.; Julyk, L.J.; Fox, G.L.; Dyrness, A.D.

    1991-09-01

    Reinforced concrete in combination with a steel liner has had a wide application to structures containing hazardous material. The buried double-shell waste storage tanks at the US Department of Energy's Hanford Site use this construction method. The generation and potential ignition of combustible gases within the primary tank is postulated to develop beyond-design-basis internal pressure and possible impact loading. The scope of this paper includes the illustration of analysis techniques for the assessment of these beyond-design-basis loadings. The analysis techniques include the coupling of the gas dynamics with the structural response, the treatment of reinforced concrete in regimes of inelastic behavior, and the treatment of geometric nonlinearities. The techniques and software tools presented provide a powerful nonlinear analysis capability for storage tanks. 10 refs., 13 figs., 1 tab

  8. A Multimodal Data Analysis Approach for Targeted Drug Discovery Involving Topological Data Analysis (TDA).

    Science.gov (United States)

    Alagappan, Muthuraman; Jiang, Dadi; Denko, Nicholas; Koong, Albert C

    In silico drug discovery refers to a combination of computational techniques that augment our ability to discover drug compounds from compound libraries. Many such techniques exist, including virtual high-throughput screening (vHTS), high-throughput screening (HTS), and mechanisms for data storage and querying. However, presently these tools are often used independently of one another. In this chapter, we describe a new multimodal in silico technique for the hit identification and lead generation phases of traditional drug discovery. Our technique leverages the benefits of three independent methods (virtual high-throughput screening, high-throughput screening, and structural fingerprint analysis) by using a fourth technique called topological data analysis (TDA). We describe how a compound library can be independently tested with vHTS, HTS, and fingerprint analysis, and how the results can be transformed into a topological data analysis network to identify compounds from a diverse group of structural families. This process of using TDA or similar clustering methods to identify drug leads is advantageous because it provides a mechanism for choosing structurally diverse compounds while maintaining the unique advantages of already established techniques such as vHTS and HTS.

  9. Performance values of nondestructive analysis techniques in safeguards and nuclear materials management

    International Nuclear Information System (INIS)

    Guardini, S.

    1989-01-01

    Nondestructive assay (NDA) techniques have, in the past few years, become more and more important in nuclear material accountancy and control. This is essentially due to two reasons: (1) the improvements made in most NDA techniques led some of them to have performances close to destructive analysis (DA) (e.g., calorimetry and gamma spectrometry); (2) the parallel improvement of statistical tools and procedural inspection approaches led to abandoning the following scheme: (a) NDA for semiqualitative or consistency checks only, (b) DA for quantitative measurements. As a consequence, NDA is now frequently used in scenarios that involve quantitative (by variable) analysis. On the other hand, it also became evident that the performances of some techniques were different depending on whether they were applied in the laboratory or in the field. It has only recently been realized that, generally speaking, this is due to objective reasons rather than to an incorrect application of the instruments. Speaking of 'claimed' versus 'actual' NDA performance may in this sense be misleading; one should rather speak of performance under different conditions. This paper provides support for this assumption.

  10. Exergy costs analysis of water desalination and purification techniques by transfer functions

    International Nuclear Information System (INIS)

    Carrasquer, Beatriz; Martínez-Gracia, Amaya; Uche, Javier

    2016-01-01

    Highlights: • A procedure to estimate the unit exergy cost of water treatment techniques is provided. • Unit exergy costs of water purification and desalination are given as a function of design and operating parameters. • Unit exergy costs range from 3.3 to 6.8 in purification and from 2 to 26 in desalination. • They could be used in their preliminary design as good indicators of their energy efficiency. - Abstract: The unit exergy costs of desalination and purification, two alternatives commonly used for water supply and treatment, have been characterized as a function of the energy efficiency of the process by combining Exergy Cost Analysis with Transfer Function Analysis. An equation to assess the exergy costs of these alternatives is then proposed as a quick guide to the energy efficiency of any water treatment process under different design and operating conditions. This combination was satisfactorily applied to groundwaters and water transfers. After identifying the boundaries of the system, input and output flows are calculated in exergy values. Next, different examples are analyzed in order to propose a generic equation to assess the exergy cost of water restoration technologies, attending to their main features. Recovery ratio, energy requirements and salt concentrations (for desalination), and plant capacity and organic matter recovery (for water purification) are introduced in the calculations as the main endogenous parameters. Values obtained for typical operating ranges of commercial plants showed that unit exergy costs of water purification ranged from 3.3 to 6.8; maximum values, as expected, were found at low plant capacities and high organic matter removal ratios. For water desalination, values varied from 2 to 7 in membrane technologies and from 10 to 26 in thermal processes. The recovery ratio and salt concentration in raw water increased the unit exergy costs in membrane techniques. In distillation processes
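
    Under the definition used above, a unit exergy cost is simply the exergy entering a process divided by the exergy embodied in its product. A minimal sketch with illustrative numbers for a membrane-like step:

    ```python
    # Illustrative only: exergy flows in kJ per kg of product water.
    def unit_exergy_cost(exergy_input_kj: float, product_exergy_kj: float) -> float:
        """Exergy spent per unit of exergy embodied in the product."""
        return exergy_input_kj / product_exergy_kj

    print(unit_exergy_cost(14.0, 4.0))   # -> 3.5, a membrane-like case
    ```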

  11. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular, these methods are expected to significantly improve the fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As its main result, this technique will provide the design engineer with decision tables for fault handling...

  12. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, Mogens; Borch, Ole; Bagnoli, F.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular, these methods are expected to significantly improve the fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As its main result, this technique will provide the design engineer with decision tables for fault handling...
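
    The matrix formulation of FMEA mentioned in both records can be sketched as a reachability computation: a propagation matrix P records which direct effects each failure mode has, and its transitive closure collects all end effects. A toy example (components and links invented):

    ```python
    # Invented four-component chain; P[i][j] = 1 when failure mode i directly
    # causes effect j. Repeated matrix products give the transitive closure.
    import numpy as np

    components = ["sensor", "controller", "actuator", "plant trip"]
    P = np.array([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [0, 0, 0, 0]])

    reach = P.copy()
    for _ in range(len(components)):
        reach = np.clip(reach + reach @ P, 0, 1)

    for i, name in enumerate(components):
        effects = [components[j] for j in np.flatnonzero(reach[i])]
        print(f"{name} failure propagates to: {effects}")
    ```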

  13. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    Science.gov (United States)

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of the chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysis. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7×10⁷) was most effective for cell lysis when compared to mortar-and-pestle (2.6×10⁷), ball mill followed by ultrasonication (1.6×10⁷), mortar-and-pestle followed by ultrasonication (1.4×10⁷), and homogenization (trial 1: 8.4×10⁶; trial 2: 1.6×10⁷). Due to the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysis of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.

  14. Multiplex Ligation-Dependent Probe Amplification Technique for Copy Number Analysis on Small Amounts of DNA Material

    DEFF Research Database (Denmark)

    Sørensen, Karina; Andersen, Paal; Larsen, Lars

    2008-01-01

    The multiplex ligation-dependent probe amplification (MLPA) technique is a sensitive method for the relative quantification of up to 50 different nucleic acid sequences in a single reaction, and the technique is routinely used for copy number analysis in various syndromes and diseases. The aim of the study was to exploit the potential of MLPA when the DNA material is limited. The DNA concentration required in standard MLPA analysis is not attainable from the dried blood spot samples (DBSS) often used in neonatal screening programs. A novel design of MLPA probes has been developed to permit MLPA analysis on small amounts of DNA. Six patients with congenital adrenal hyperplasia (CAH) were used in this study. DNA was extracted from both whole blood and DBSS and subjected to MLPA analysis using normal and modified probes. Results were analyzed using GeneMarker and manual Excel analysis. A total...
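
    The relative quantification step in MLPA can be sketched as a double normalization: each probe's peak area is normalized to the sample's control probes and then divided by the same quantity in a reference, giving a dosage quotient near 1.0 for two copies and near 0.5 for a heterozygous deletion. Probe names and peak areas below are invented:

    ```python
    # Toy MLPA dosage quotients; CYP21A2 is used here only because the record
    # concerns CAH patients. All peak areas are illustrative.
    patient = {"CYP21A2_ex1": 4200.0, "CYP21A2_ex3": 2100.0,
               "CTRL_1": 8300.0, "CTRL_2": 7900.0}
    reference = {"CYP21A2_ex1": 4400.0, "CYP21A2_ex3": 4300.0,
                 "CTRL_1": 8100.0, "CTRL_2": 8000.0}
    controls = ["CTRL_1", "CTRL_2"]

    def normalized(peaks):
        ctrl_mean = sum(peaks[c] for c in controls) / len(controls)
        return {probe: area / ctrl_mean for probe, area in peaks.items()}

    pat_n, ref_n = normalized(patient), normalized(reference)
    for probe in ("CYP21A2_ex1", "CYP21A2_ex3"):
        print(f"{probe}: dosage quotient = {pat_n[probe] / ref_n[probe]:.2f}")
    # ex1 -> ~0.95 (two copies); ex3 -> ~0.49 (heterozygous deletion)
    ```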

  15. Financial planning and analysis techniques of mining firms: a note on Canadian practice

    Energy Technology Data Exchange (ETDEWEB)

    Blanco, H.; Zanibbi, L.R. (Laurentian University, Sudbury, ON (Canada). School of Commerce and Administration)

    1992-06-01

    This paper reports on the results of a survey of the financial planning and analysis techniques in use in the mining industry in Canada. The study was undertaken to determine the current status of these practices within mining firms in Canada and to investigate the extent to which the techniques are grouped together within individual firms. In addition, tests were performed on the relationship between these groups of techniques and both organizational size and price volatility of end product. The results show that a few techniques are widely utilized in this industry but that the techniques used most frequently are not as sophisticated as reported in previous, more broadly based surveys. The results also show that firms tend to use 'bundles' of techniques and that the relative use of some of these groups of techniques is weakly associated with both organizational size and type of end product. 19 refs., 7 tabs.

  16. Analysis of various NDT techniques to determine their feasibility for detecting thin layers of ferrite on Type 316 stainless steel

    International Nuclear Information System (INIS)

    Dudder, G.B.; Atteridge, D.G.; Davis, T.J.

    1978-09-01

    The applicability of various NDT techniques for detecting thin layers of ferrite on Type 316 stainless steel cladding was studied. The ability to detect sodium-induced ferrite layers on fuel pins would allow an experimental determination of the fuel pin temperature distribution. The research effort was broken down into three basic phases. Phase one consisted of a theoretical determination of the ferrite detection potential of each of the proposed NDT techniques. The second phase consisted of proof-of-principle experiments on the techniques that passed phase one. The third phase consisted of in-hot-cell testing on actual EBR-II fuel pins. Most of the candidate techniques were eliminated in the first phase of analysis. Four potential techniques passed the initial phase of analysis, but only three of these passed the second analysis phase. The three techniques that passed the proof-of-principle section of the analysis were heat tinting, magnetic force and electromagnetic techniques. The electromagnetic technique was successfully demonstrated on actual fuel pins irradiated in EBR-II in the third phase of analysis, while the other two techniques were not carried to the hot cell analysis phase. Results of this technique screening study indicate that an electromagnetic and/or heat-tinting ferrite layer NDT technique should be readily adoptable to hot cell inspection requirements. It was also concluded that the magnetic force technique, while feasible, would not readily lend itself to hot cell fuel pin inspection.

  17. Gene expression meta-analysis identifies chromosomal regions involved in ovarian cancer survival

    DEFF Research Database (Denmark)

    Thomassen, Mads; Jochumsen, Kirsten M; Mogensen, Ole

    2009-01-01

    ...the relation of gene expression and chromosomal position to identify chromosomal regions of importance for early recurrence of ovarian cancer. By use of *Gene Set Enrichment Analysis*, we ranked chromosomal regions according to their association with survival. Over-representation analysis including ... using death (P = 0.015) and recurrence (P = 0.002) as outcome. The combined mutation score is strongly associated with upregulation of several growth factor pathways.
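
    Over-representation analysis of this kind is commonly a hypergeometric test: given M genes in total, n of them located in a chromosomal region, and N genes associated with the outcome, it asks whether the k outcome-associated genes that fall inside the region exceed what chance would predict. A minimal sketch with hypothetical counts (not the paper's data), assuming SciPy is available:

    ```python
    from scipy.stats import hypergeom

    def region_enrichment_p(M, n, N, k):
        """P(X >= k) under a hypergeometric null: of M genes overall,
        n lie in the chromosomal region, N are outcome-associated,
        and k of those N fall inside the region."""
        return hypergeom.sf(k - 1, M, n, N)

    # Hypothetical numbers: 20,000 genes, 150 in the region of interest,
    # 500 associated with early recurrence, 12 of them inside the region.
    # Chance predicts about 500 * 150 / 20000 = 3.75 genes in the region.
    print(region_enrichment_p(M=20000, n=150, N=500, k=12))
    ```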

  18. Risk analysis of geothermal power plants using Failure Modes and Effects Analysis (FMEA) technique

    International Nuclear Information System (INIS)

    Feili, Hamid Reza; Akar, Navid; Lotfizadeh, Hossein; Bairampour, Mohammad; Nasiri, Sina

    2013-01-01

    Highlights: • Failure Modes and Effects Analysis (FMEA) is used to find potential failures in geothermal power plants. • Five major parts of geothermal power plants are considered for risk analysis. • The Risk Priority Number (RPN) is calculated for all failure modes. • Corrective actions are recommended to eliminate or reduce the risk of failure modes. - Abstract: Renewable energy plays a key role in the transition toward a low-carbon economy and the provision of a secure supply of energy. Geothermal energy is a versatile form of renewable energy that meets popular demand. Since Geothermal Power Plants (GPPs) face various failures, a team engineering technique for eliminating or reducing potential failures is of considerable value. Because no published study applying FMEA to GPPs and their common failure modes was found, this paper considers Failure Modes and Effects Analysis (FMEA) as a convenient technique for identifying, classifying, and analyzing common failures in typical GPPs. Each failure mode is scored for occurrence, detection, and severity, and the Risk Priority Number (RPN) is computed to flag high-potential failures. XFMEA software is utilized to improve the accuracy and speed of the analysis. Moreover, five major parts of a GPP are studied, and corrective actions are recommended for each failure mode to support the development of GPPs and increase reliability.
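
    The core of FMEA scoring is the Risk Priority Number: RPN = severity × occurrence × detection, with each factor typically rated on a 1-10 scale. The sketch below shows that arithmetic and the resulting ranking; the failure modes and scores are hypothetical, not taken from the paper.

    ```python
    # Minimal RPN calculation as used in FMEA: each failure mode is scored
    # 1-10 for severity (S), occurrence (O), and detection (D), and
    # RPN = S * O * D. Components and scores below are hypothetical.
    failure_modes = [
        # (component, failure mode, S, O, D)
        ("turbine",   "blade erosion from steam impurities", 7, 5, 4),
        ("condenser", "tube fouling",                        5, 6, 3),
        ("well",      "casing corrosion",                    9, 3, 7),
    ]

    # Rank by RPN so corrective actions target the riskiest modes first.
    ranked = sorted(
        ((s * o * d, comp, mode) for comp, mode, s, o, d in failure_modes),
        reverse=True,
    )
    for rpn, comp, mode in ranked:
        print(f"RPN {rpn:4d}  {comp}: {mode}")
    ```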

  19. Coherent network analysis technique for discriminating gravitational-wave bursts from instrumental noise

    International Nuclear Information System (INIS)

    Chatterji, Shourov; Lazzarini, Albert; Stein, Leo; Sutton, Patrick J.; Searle, Antony; Tinto, Massimo

    2006-01-01

    The sensitivity of current searches for gravitational-wave bursts is limited by non-Gaussian, nonstationary noise transients which are common in real detectors. Existing techniques for detecting gravitational-wave bursts assume the output of the detector network to be the sum of a stationary Gaussian noise process and a gravitational-wave signal. These techniques often fail in the presence of noise nonstationarities by incorrectly identifying such transients as possible gravitational-wave bursts. Furthermore, consistency tests currently used to try to eliminate these noise transients are not applicable to general networks of detectors with different orientations and noise spectra. In order to address this problem, we introduce a fully coherent consistency test that is robust against noise nonstationarities and allows one to distinguish between gravitational-wave bursts and noise transients in general detector networks. This technique does not require any a priori knowledge of the putative burst waveform.
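
    The coherent consistency idea can be illustrated with a toy null-stream computation. For two co-aligned detectors, a real gravitational-wave burst appears identically in both data streams, so their difference (the null stream) should contain only noise, whereas an instrumental glitch in one detector does not cancel. The sketch below is a minimal illustration under strong simplifying assumptions (co-aligned detectors, equal sensitivity, white unit-variance noise) and is not the statistic of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def null_energy(h1, h2):
        """Energy in the null stream of two co-aligned detectors.

        A coherent gravitational-wave burst cancels in h1 - h2, leaving
        only noise; an incoherent instrumental glitch leaves excess energy.
        """
        null = (h1 - h2) / np.sqrt(2.0)  # unit variance under the noise-only null
        return np.sum(null ** 2)

    n = 1024
    t = np.arange(n)
    burst = 5.0 * np.exp(-((t - 512) / 20.0) ** 2)  # common burst envelope

    # A real burst appears coherently in both detectors...
    e_gw = null_energy(rng.normal(size=n) + burst, rng.normal(size=n) + burst)
    # ...while a glitch appears in only one.
    e_glitch = null_energy(rng.normal(size=n) + burst, rng.normal(size=n))

    # Under noise alone the null energy is chi-squared with n degrees of
    # freedom, so values far above n flag an incoherent (instrumental) event.
    print(f"null energy, coherent burst: {e_gw:.0f} (expect ~{n})")
    print(f"null energy, glitch:         {e_glitch:.0f} (expect >> {n})")
    ```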

  20. Transient signal analysis in power reactors by means of the wavelet technique

    International Nuclear Information System (INIS)

    Wentzeis, Luis

    1999-01-01

    The application of the wavelet technique enabled the study of the time evolution of the properties (amplitude and frequency content) of a set of signals measured in the Embalse nuclear power plant (CANDU, 600 MWe) in the low frequency range and under different operating conditions. In particular, by means of this technique we studied the time evolution of the signals during a non-stationary state of the reactor (a power increase), where Fourier analysis proves inadequate. (author)
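
    A minimal sketch of the idea, assuming the PyWavelets package is available: a continuous wavelet transform resolves how the frequency content of a non-stationary signal evolves in time, which a single Fourier spectrum of the whole record cannot. The ramped-frequency test signal below stands in for plant signals during a power increase.

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed available

    # Toy non-stationary signal: frequency drifts from 1 Hz to ~5 Hz,
    # loosely analogous to signal behavior during a power ramp.
    fs = 100.0                      # sampling frequency [Hz]
    t = np.arange(0, 20, 1 / fs)
    f_inst = 1.0 + 0.2 * t          # instantaneous frequency [Hz]
    signal = np.sin(2 * np.pi * np.cumsum(f_inst) / fs)

    # Continuous wavelet transform with a Morlet wavelet; each scale maps
    # to a pseudo-frequency, so |coeffs| shows how spectral content evolves.
    scales = np.arange(1, 128)
    coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)

    # The ridge of maximum |coefficient| tracks the drifting frequency,
    # which a Fourier spectrum of the full record would smear into one peak.
    ridge = freqs[np.abs(coeffs).argmax(axis=0)]
    print(ridge[::200].round(2))    # climbs from ~1 Hz toward ~5 Hz
    ```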