WorldWideScience

Sample records for analysis techniques identifies

  1. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others]

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  2. [Research on Identifying Spatial Objects Using the Spectrum Analysis Technique].

    Science.gov (United States)

    Song, Wei; Feng, Shi-qi; Shi, Jing; Xu, Rong; Wang, Gong-chang; Li, Bin-yu; Liu, Yu; Li, Shuang; Cao, Rui; Cai, Hong-xing; Zhang, Xi-he; Tan, Yong

    2015-06-01

    High-precision scattering spectra of space debris, with a minimum brightness of 4.2 and a resolution of 0.5 nm, have been observed using ground-based spectrum detection technology. Clear differences between object types are obtained by normalizing the spectral data and analyzing its discrete rate. For rocket debris, the normalized line shapes of the multi-frame scattering spectra are identical; for lapsed satellites, they differ. The discrete rate of the normalized single-frame spectra ranges from 0.978% to 3.067% for rocket debris, and the difference between oscillation and average values is small. For lapsed satellites it ranges from 3.1184% to 19.4727%, and the difference between oscillation and average values is relatively large. The reason is that the composition of rocket debris is uniform, while that of lapsed satellites is complex. Therefore, ground-based spectrum detection technology can be used for the classification of space debris. PMID:26601348
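
    The "discrete rate" classification described above lends itself to a small numerical sketch. This is an illustrative reading, not the authors' code: the dispersion measure and the ~3% decision boundary are assumptions loosely based on the ranges quoted in the abstract.

```python
import numpy as np

def discrete_rates(frames):
    # One plausible reading of the "discrete rate": the percent RMS deviation
    # of each normalized single-frame spectrum from the mean normalized frame.
    norm = frames / frames.max(axis=1, keepdims=True)
    mean = norm.mean(axis=0)
    return 100.0 * np.sqrt(((norm - mean) ** 2).mean(axis=1)) / mean.mean()

def classify(frames, boundary=3.1):
    # Boundary assumed from the quoted ranges: rocket debris ~0.98-3.07%,
    # lapsed satellites ~3.12-19.47%.
    return "rocket debris" if discrete_rates(frames).max() < boundary else "lapsed satellite"

# Synthetic stand-in spectra: 5 frames x 200 wavelength bins.
rng = np.random.default_rng(0)
base = np.exp(-0.5 * ((np.arange(200) - 100) / 20.0) ** 2)
frames = base * (1 + 0.01 * rng.standard_normal((5, 200)))
print(classify(frames))
```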

  3. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method successfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems.
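
    The abstract benchmarks the selective search against Monte Carlo simulation. For orientation only, here is a minimal Monte Carlo estimate of the system failure probability for a toy series system with three limit states; the distributions are invented for illustration, and this is not the paper's MSR method.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Toy series system: three member resistances sharing one random load.
S = rng.normal(8.0, 1.0, n)             # load (assumed distribution)
R = rng.normal(12.0, 1.5, (3, n))       # resistances (assumed distribution)

g = R - S                               # limit states: failure when g_i <= 0
system_fail = (g <= 0).any(axis=0)      # series system: any failed mode fails it
pf = system_fail.mean()
se = np.sqrt(pf * (1 - pf) / n)         # standard error of the estimate
print(f"P_f = {pf:.3e} +/- {se:.1e}")
```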

  4. Practical implementation of the corrected force analysis technique to identify the structural parameter and load distributions

    Science.gov (United States)

    Leclère, Quentin; Ablitzer, Frédéric; Pézerat, Charles

    2015-09-01

    The paper aims to combine two objectives of the Force Analysis Technique (FAT): vibration source identification and material characterization from the same set of measurements. Initially, the FAT was developed to locate and identify external loads. It consists of injecting measured vibration displacements into the discretized equation of motion. Two variants exist, FAT and CFAT (Corrected Force Analysis Technique), which use different finite difference schemes. Recently, the FAT was adapted for the identification of elastic and damping properties of a structure. Its principal advantages are that the identification is local, allowing material characteristics to be mapped, and that it can be carried out at all frequencies, especially in the medium- and high-frequency domains. The paper recalls the development of FAT and CFAT on beams and plates and shows how material characteristics can be extracted in areas where no external loads are applied. Experimental validations are shown on an aluminum plate with arbitrary boundary conditions, excited by a point force, with a piece of foam glued on a sub-surface of the plate. Contactless measurements were made using a scanning laser vibrometer. The results of FAT and CFAT are compared and discussed for material property identification in the regions with and without foam. The excitation force is finally identified using the identified material properties. CFAT gives excellent results, comparable to a direct measurement obtained by a piezoelectric sensor. The relevance of the corrected scheme is thus underlined for both source identification and material characterization from the same measurements.

  5. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    Science.gov (United States)

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

    Clustering on Social Learning Networks is still not widely explored, especially when the network focuses on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…

  6. Neutron activation analysis techniques for identifying elemental status in Alzheimer's disease

    International Nuclear Information System (INIS)

    Brain tissue (hippocampus and cerebral cortex) from Alzheimer's disease and control individuals sampled from Eastern Canada and the United Kingdom was analyzed for Ag, Al, As, B, Br, Ca, Cd, Co, Cr, Cs, Cu, Fe, Hg, I, K, La, Mg, Mn, Mo, Ni, Rb, S, Sb, Sc, Se, Si, Sn, Sr, Ti, V and Zn. Neutron activation analysis (thermal and prompt gamma-ray) methods were used. Very highly significant differences (S**: probability less than 0.005) for both study areas were shown between Alzheimer's disease (AD) and control (C) individuals: AD > C for Al, Br, Ca and S, and AD < C for Se, V and Zn. The aluminium content of brain tissue ranged from 3.605 to 21.738 μg/g d.w. (AD) and 0.379 to 4.768 μg/g d.w. (C). No statistical evidence of aluminium accumulation with age was noted. Possible zinc deficiency (especially in hippocampal tissue) was observed in Alzheimer's disease patients, with zinc ranging from 31.42 to 57.91 μg/g d.w. (AD) versus 37.31 to 87.10 μg/g d.w. (C). (author)

  7. Application of Principal Component Analysis to NIR Spectra of Phyllosilicates: A Technique for Identifying Phyllosilicates on Mars

    Science.gov (United States)

    Rampe, E. B.; Lanza, N. L.

    2012-01-01

    Orbital near-infrared (NIR) reflectance spectra of the martian surface from the OMEGA and CRISM instruments have identified a variety of phyllosilicates in Noachian terrains. The types of phyllosilicates present on Mars have important implications for the aqueous environments in which they formed, and, thus, for recognizing locales that may have been habitable. Current identifications of phyllosilicates from martian NIR data are based on the positions of spectral absorptions relative to laboratory data of well-characterized samples and from spectral ratios; however, some phyllosilicates can be difficult to distinguish from one another with these methods (e.g., illite vs. muscovite). Here we employ a multivariate statistical technique, principal component analysis (PCA), to differentiate between spectrally similar phyllosilicate minerals. PCA is commonly used in a variety of industries (pharmaceutical, agricultural, viticultural) to discriminate between samples. Previous work using PCA to analyze raw NIR reflectance data from mineral mixtures has shown that this is a viable technique for identifying mineral types, abundances, and particle sizes. Here, we evaluate PCA of second-derivative NIR reflectance data as a method for classifying phyllosilicates and test whether this method can be used to identify phyllosilicates on Mars.
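
    A minimal sketch of the pipeline named above, second-derivative preprocessing followed by PCA, assuming spectra stored row-wise in a NumPy array. The Savitzky-Golay window, polynomial order and component count are illustrative choices, not the authors' settings.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

def pca_second_derivative(spectra, window=15, polyorder=3, n_components=2):
    # spectra: (n_samples, n_wavelengths) NIR reflectance matrix.
    # The second derivative suppresses baseline offsets and slopes,
    # sharpening the absorption features that separate phyllosilicates.
    d2 = savgol_filter(spectra, window_length=window, polyorder=polyorder,
                       deriv=2, axis=1)
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(d2)          # sample coordinates in PC space
    return scores, pca.explained_variance_ratio_

# Synthetic stand-in data: 30 smooth "spectra" of 500 wavelengths each.
rng = np.random.default_rng(1)
spectra = rng.random((30, 500)).cumsum(axis=1)
scores, evr = pca_second_derivative(spectra)
print(scores.shape, evr)
```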

  8. MALDI-TOF and SELDI-TOF analysis: “tandem” techniques to identify potential biomarker in fibromyalgia

    Directory of Open Access Journals (Sweden)

    A. Lucacchini

    2011-11-01

    Fibromyalgia (FM) is characterized by the presence of chronic widespread pain throughout the musculoskeletal system and diffuse tenderness. Unfortunately, no laboratory tests have been appropriately validated for FM and correlated with its subsets and activity. The aim of this study was to apply a proteomic technique to the saliva of FM patients: Surface-Enhanced Laser Desorption/Ionization Time-of-Flight (SELDI-TOF) analysis. For this study, 57 FM patients and 35 healthy controls (HC) were enrolled. The proteomic analysis of saliva was carried out using SELDI-TOF with different chip arrays having different binding characteristics. The statistical analysis was performed using cluster analysis, and differences between the two groups were assessed using Student's t-test. Spectra analysis highlighted several peaks differently expressed in FM patients compared with controls. The preliminary results obtained by SELDI-TOF analysis were compared with those obtained in our previous study performed on whole saliva of FM patients using electrophoresis. The m/z values of two peaks increased in FM patients seem to overlap well with the molecular weights of calgranulin A and C and Rho GDP-dissociation inhibitor 2, which we had found up-regulated in our previous study. These preliminary results show the possibility of identifying potential salivary biomarkers through salivary proteomic analysis with MALDI-TOF and SELDI-TOF in FM patients. The peaks observed allow us to focus on particular pathogenic aspects of FM: the oxidative stress that characterizes this condition, the involvement of proteins related to cytoskeletal arrangements, and central sensitization.

  9. Application of gene network analysis techniques identifies AXIN1/PDIA2 and endoglin haplotypes associated with bicuspid aortic valve.

    Directory of Open Access Journals (Sweden)

    Eric C Wooten

    Bicuspid Aortic Valve (BAV) is a highly heritable congenital heart defect. The low frequency of BAV (1% of the general population) limits our ability to perform genome-wide association studies. We present the application of four a priori SNP selection techniques, reducing the multiple-testing penalty by restricting analysis to SNPs relevant to BAV in a genome-wide SNP dataset from a cohort of 68 BAV probands and 830 control subjects. Two knowledge-based approaches, CANDID and STRING, were used to systematically identify BAV genes, and their SNPs, from the published literature, microarray expression studies and a genome scan. We additionally tested Functionally Interpolating SNPs (fitSNPs) present on the array; the fourth approach consisted of SNPs selected by Random Forests, a machine learning approach. These approaches reduced the multiple-testing penalty by lowering the fraction of the genome probed to 0.19% of the total, while increasing the likelihood of studying SNPs within relevant BAV genes and pathways. Three loci were identified by CANDID, STRING, and fitSNPs. A haplotype within the AXIN1-PDIA2 locus (p-value 2.926×10^-6) and a haplotype within the Endoglin gene (p-value 5.881×10^-4) were found to be strongly associated with BAV. The Random Forests approach identified a SNP on chromosome 3 in association with BAV (p-value 5.061×10^-6). The results presented here support an important role for genetic variants in BAV and provide support for additional studies in well-powered cohorts. Further, these studies demonstrate that leveraging existing expression and genomic data in the context of GWAS studies can identify biologically relevant genes and pathways associated with a congenital heart defect.
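
    As an illustration of the fourth selection technique, the sketch below ranks SNPs with a Random Forest and keeps the top-scoring candidates. The genotype coding, forest settings and synthetic data are assumptions; only the cohort sizes echo the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in: 898 subjects x 1000 SNPs coded 0/1/2 (minor-allele count).
rng = np.random.default_rng(7)
X = rng.integers(0, 3, size=(898, 1000))
y = np.r_[np.ones(68, dtype=int), np.zeros(830, dtype=int)]  # BAV vs control

rf = RandomForestClassifier(n_estimators=500, max_features="sqrt",
                            random_state=0, n_jobs=-1)
rf.fit(X, y)

# Rank SNPs by Gini importance; top candidates go on to association testing.
top = np.argsort(rf.feature_importances_)[::-1][:20]
print("candidate SNP columns:", top)
```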

  10. Identifying Major Techniques of Persuasion.

    Science.gov (United States)

    Makosky, Vivian Parker

    1985-01-01

    The purpose of this class exercise is to increase undergraduate psychology students' awareness of common persuasion techniques used in advertising, including the appeal to or creation of needs, social and prestige suggestion, and the use of emotionally loaded words and images. Television commercials and magazine advertisements are used as…

  11. Identifying learning techniques among high achievers

    OpenAIRE

    Shanmukananda P; L. Padma

    2013-01-01

    Background: In every college, it is noticed that in spite of being exposed to the same teaching modalities and adopting seemingly similar strategies, some students perform much better than their peers. This can be evaluated in the form of better academic performance in the internal assessments they undertake. This project is an endeavor to identify the learning techniques among high achievers which they employ to outperform others. We can also suggest the same to the medium and low achievers ...

  12. Identifying learning techniques among high achievers

    Directory of Open Access Journals (Sweden)

    Shanmukananda P

    2013-04-01

    Background: In every college, it is noticed that in spite of being exposed to the same teaching modalities and adopting seemingly similar strategies, some students perform much better than their peers. This can be evaluated in the form of better academic performance in the internal assessments they undertake. This project is an endeavor to identify the learning techniques that high achievers employ to outperform others. We can also suggest the same to the medium and low achievers so that they can improve their academic performance. This study was conducted to identify the different learning techniques adopted by high achievers and to suggest the same to medium and low achievers. Methods: After obtaining clearance from the institutional ethics committee, the high achievers were identified by selecting the upper third of the students, ranked by marks obtained in three consecutive internal assessments, in three consecutive batches. The identity of the students was not revealed. They were then administered an open-ended questionnaire which addressed relevant issues. The most common and feasible techniques will be suggested to the medium and low achievers. Results: The respondents' (n=101) replies were analyzed by calculating the percentages of responses and identifying the techniques most frequently adopted by these high achievers. Conclusions: High achievers have a diligent study pattern; they not only study regularly, but also take part in group discussions and approach their teachers when in doubt. Additionally, they refer to other sources of information like the internet, demonstrating a proactive attitude towards studies. [Int J Basic Clin Pharmacol 2013; 2(2): 203-207]

  13. Nuclear techniques to identify allergenic metals in orthodontic brackets

    International Nuclear Information System (INIS)

    The present study determines the elemental alloy composition of ten commercial brands of brackets, especially with respect to Ni, Cr, and Co, confirmed allergenic elements. The nuclear techniques applied in the analyses were X-ray fluorescence (XRF), at the Centre National de la Recherche Scientifique (National Center for Scientific Research), France, and X-ray energy spectrometry (XRES) and Instrumental Neutron Activation Analysis (INAA), at CDTN/CNEN, Brazil. The XRES and XRF techniques identified Cr in the 10 samples analyzed and Ni in eight samples. The INAA technique identified the presence of Cr (14% to 19%) and Co (42% to 2400 ppm) in all samples. The semi-quantitative analysis performed by XRF also identified Co in two samples. The techniques were effective in the identification of metals in orthodontic brackets. The elements identified in this study can be considered one of the main reasons for the allergic processes among the patients studied. This finding suggests that patients should be tested for allergy and allergic sensitivity to metals prior to the prescription of an orthodontic device. (author)

  14. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper is a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  15. Identifying Network Anomalies Using Clustering Technique in Weblog Data

    Directory of Open Access Journals (Sweden)

    A. Bhaskar

    2012-06-01

    In this paper we present an approach for identifying network anomalies by visualizing network flow data which is stored in weblogs. Various clustering techniques can be used to identify different anomalies in the network. Here, we present a new approach based on simple K-Means for analyzing network flow data using different attributes, such as IP address, protocol, and port number, to detect anomalies. By using visualization, we can identify which sites are most frequently accessed by the users. In our approach we provide an overview of the given dataset by studying key network parameters. In this process we used preprocessing techniques to eliminate unwanted attributes from the weblog data.
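
    A minimal sketch of the K-Means idea described above, using scikit-learn on fabricated flow records; the feature set (volume, port, protocol code), cluster count and distance-based flagging rule are all assumptions, not the authors' exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Fake weblog flows: [packets, bytes, port, protocol_code] per record.
rng = np.random.default_rng(3)
normal = np.column_stack([rng.poisson(20, 990), rng.poisson(4000, 990),
                          rng.choice([80, 443], 990), np.zeros(990)])
odd = np.column_stack([rng.poisson(900, 10), rng.poisson(2_000_000, 10),
                       rng.integers(1024, 65535, 10), np.ones(10)])
X = StandardScaler().fit_transform(np.vstack([normal, odd]))

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Flag records unusually far from their own centroid as candidate anomalies.
dist = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
print("suspect rows:", np.where(dist > dist.mean() + 3 * dist.std())[0])
```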

  16. Identifying appropriate tasks for the preregistration year: modified Delphi technique

    OpenAIRE

    Stewart, J.; O'Halloran, Cath; Harrigan, P; Spencer, J; Barton, J. R.; Singleton, S. J.

    1999-01-01

    Objectives: To identify the tasks that should constitute the work of preregistration house officers to provide the basis for the development of a self evaluation instrument. Design: Literature review and modified Delphi technique. Setting: Northern Deanery within the Northern and Yorkshire office NHS executive. Subjects: 67 educational supervisors of preregistration house officers. Main outcome measures: Percentage of agreement by educational supervisors to tasks identified fr...

  17. Application of thermoluminescence technique to identify radiation processed foods

    International Nuclear Information System (INIS)

    Studies reported by various authors have shown that a few methods, one of which is the thermoluminescence technique, may be suitable for the identification of certain irradiated spices and foods containing bones. This study is an application of the thermoluminescence technique for identifying irradiated samples. The investigation was carried out on different types of foodstuffs such as onions, potatoes and kiwi. Measurements show that the technique can be applied as a reliable method to distinguish irradiated food products from non-irradiated ones. The results also demonstrate that it is possible to use this method to determine the absorbed dose of irradiated samples from the established dose-effect curve. (Author)

  18. Identifying Model Parameters of Semiconductor Devices Using Optimization Techniques

    OpenAIRE

    Hruškovič, Lubomir; Grabner, Martin; Dobeš, Josef

    2007-01-01

    Optimization is an indispensable tool for extracting the parameters of any complicated model. Hence, advanced optimization techniques are also necessary for identifying the model parameters of semiconductor devices, because their current models are very sophisticated (especially the BJT and MOSFET ones). The equations of such models typically contain one hundred parameters. Therefore, the measurement and particularly identification of the full set of the model para...

  19. A sipping technique to identify leaking rods in RCCAs

    International Nuclear Information System (INIS)

    This paper describes an original method developed by CEA and EDF to identify leaking rods in RCCAs with a silver-based metallic alloy as the neutron absorber material. The process consists of etching the absorber surface with a slightly aggressive solution, which dissolves a small quantity of silver into the solution. After the test, the solution is recovered and sampled, and the silver content is measured. A sipping prototype has been assembled, and three on-site demonstration campaigns, one of them during the planned shutdown period of the plant, have proved the good reliability of the method and the industrial feasibility of this new technique. (author). 6 figs

  20. A Plume Tracing, Source Identifying Technique for Mars Rovers

    Science.gov (United States)

    Banfield, Don; Lamb, Brian; Hovde, Chris; Ferrara, Tom

    2015-11-01

    We have developed and field-tested a technique to identify and characterize the source of an effluent plume (biogenic or otherwise) on Mars, using a slow-moving vehicle like a Mars rover. The technique is based on terrestrial plume characterization methods (EPA Method 33a), and uses puff models of variable complexity to predict the plume behavior for a given source. The technique is developed assuming that a Mars rover would be equipped with a high-performance eddy-sensing 3-D anemometer (e.g., a Martian sonic anemometer), as well as a fast-response tracer molecule-specific sensor (e.g., a TLS methane sensor). The platform is assumed to move only once a day, but to have the ability to observe throughout the day and night. Data obtained from any one sol while the effluent plume meanders across the rover can be used to estimate the azimuth, range and strength of the source, but combining observations from multiple sols and locations improves the estimate of the source location and strength. We have conducted preliminary field tests using a sonic anemometer (Gill and Campbell) and fast-response methane sensors (LICOR and Picarro) on mobile platforms, using both controlled and existing methane releases to test our algorithm in simple terrain and with varying atmospheric stability. We will discuss our results and the efficacy of our algorithm in real-world conditions.
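
    The abstract does not spell out its puff model, so as generic background only, here is the classic steady-state Gaussian plume forward model that such source-inversion schemes fit against measured concentrations; the dispersion coefficients and all numbers are assumptions.

```python
import numpy as np

def gaussian_plume(q, x, y, z, u, h=0.0, a=0.08, b=0.06):
    # Point-source plume concentration (g/m^3) with ground reflection.
    # q: source strength (g/s), u: mean wind speed (m/s), h: release height (m),
    # x: downwind, y: crosswind, z: height. sigma_y = a*x and sigma_z = b*x
    # is a crude neutral-stability assumption.
    sy, sz = a * x, b * x
    return (q / (2 * np.pi * sy * sz * u)
            * np.exp(-y**2 / (2 * sy**2))
            * (np.exp(-(z - h)**2 / (2 * sz**2))
               + np.exp(-(z + h)**2 / (2 * sz**2))))

# A rover 500 m downwind and 20 m off-axis, with its inlet 1 m above ground:
print(gaussian_plume(q=0.1, x=500.0, y=20.0, z=1.0, u=3.0))
# Source inversion then amounts to least-squares fitting q and the source
# position so that modeled concentrations match the measured time series.
```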

  1. Image Techniques for Identifying Sea-Ice Parameters

    Directory of Open Access Journals (Sweden)

    Qin Zhang

    2014-10-01

    The estimation of ice forces is critical to Dynamic Positioning (DP) operations in Arctic waters. Ice conditions are important for the analysis of ice-structure interaction in an ice field. To monitor sea-ice conditions, cameras are used as field observation sensors on mobile sensor platforms in the Arctic. Various image processing techniques, such as Otsu thresholding, k-means clustering, the distance transform, the Gradient Vector Flow (GVF) snake, and mathematical morphology, are then applied to obtain ice concentration, ice types, and floe size distribution from sea-ice images, to ensure safe operations of structures in ice-covered regions. These techniques yield acceptable results, and their effectiveness is demonstrated in case studies.
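
    As a small illustration of the first technique in that list, the sketch below estimates ice concentration as the bright-pixel fraction after Otsu thresholding, assuming ice is brighter than water in a grayscale image; this is not the authors' full pipeline.

```python
import numpy as np
from skimage.filters import threshold_otsu

def ice_concentration(gray):
    # Fraction of pixels classified as ice by a global Otsu threshold.
    t = threshold_otsu(gray)
    return (gray > t).mean()

# Synthetic stand-in image: dark "water" with one bright "floe".
img = np.full((200, 200), 0.2)
img[50:150, 40:160] = 0.8
img += 0.02 * np.random.default_rng(0).standard_normal(img.shape)
print(f"ice concentration ~ {ice_concentration(img):.2f}")
```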

  2. DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Food scientists use standards and calibrations to relate the concentration of a compound of interest to the instrumental response. The techniques used include classical, single point, and inverse calibrations, as well as standard addition and internal standards. Several fundamental criteria -- sel...

  3. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    complicates the analysis instead and contributes to model inadequacy. As such, scatter can be considered as an example of element-wise outliers. However, no straightforward method for identifying the scatter region can be found in the literature. In this paper an automatic scatter identification method is...... input data for three different PARAFAC methods. Firstly, inserting missing values in the scatter regions is tested; secondly, an interpolation of the scatter regions is performed; and finally, the scatter regions are down-weighted. These results show that the PARAFAC method to choose after scatter......

  4. Review on Identify Kin Relationship Technique in Image

    Directory of Open Access Journals (Sweden)

    Deepak M Ahire

    2015-06-01

    Kin relationships are traditionally defined as ties based on blood. Kinship includes lineal generational bonds (children, parents, grandparents, and great-grandparents), collateral bonds (siblings, cousins, nieces and nephews, and aunts and uncles), and ties with in-laws. An often-made distinction is that between primary kin (members of the families of origin and procreation) and secondary kin (other family members). The former are referred to as "immediate family," and the latter are generally labelled "extended family." Marriage, as a principle of kinship, differs from blood in that it can be terminated; given the potential for marital break-up, blood is recognized as the more important principle of kinship. Here we propose a technique to identify kin relationships (a kinship model) using face recognition, splitting the face into subsets (forehead, eyes, nose, mouth, and cheek areas) characterized through Gabor features on an available real-time database.

  5. Identifiability analysis in conceptual sewer modelling.

    Science.gov (United States)

    Kleidorfer, M; Leonhardt, G; Rauch, W

    2012-01-01

    For a sufficient calibration of an environmental model, not only parameter sensitivity but also parameter identifiability is an important issue. Identifiability analysis makes it possible to analyse whether changes in one parameter can be compensated by appropriate changes in the others within a given uncertainty range. Parameter identifiability is conditional on the information content of the calibration data, and consequently conditional on a certain measurement layout (i.e. types of measurements, number and location of measurement sites, temporal resolution of measurements, etc.). Hence the influence of the number and location of measurement sites on the number of identifiable parameters can be investigated. In the present study, identifiability analysis is applied to a conceptual model of a combined sewer system aiming to predict combined sewer overflow emissions. Different measurement layouts are tested, and it can be shown that only 13 of the most sensitive catchment areas (represented by the model parameter 'effective impervious area') can be identified when overflow measurements of the 20 highest overflows and the runoff to the wastewater treatment plant are used for calibration. The main advantage of this method is its very low computational cost, as the number of required model runs equals the total number of model parameters. Hence, this method is a valuable tool when analysing large models with a long runtime and many parameters. PMID:22864432
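
    The run-per-parameter idea can be sketched as follows: perturb each parameter once, collect the sensitivity vector of the model output, and treat near-collinear sensitivity vectors as mutually compensating, hence poorly identifiable. The toy model and thresholds below are placeholders, not the sewer model from the paper.

```python
import numpy as np

def model(p, t):
    # Placeholder model in which parameters 0 and 2 act almost identically.
    return p[0] * np.exp(-p[1] * t) + p[2] * np.exp(-1.05 * p[1] * t)

t = np.linspace(0.0, 5.0, 50)
p0 = np.array([2.0, 0.7, 1.5])
y0 = model(p0, t)

# One extra run per parameter, matching the cost noted in the abstract.
S = np.empty((t.size, p0.size))
for j in range(p0.size):
    dp = p0.copy()
    dp[j] *= 1.01
    S[:, j] = (model(dp, t) - y0) / (0.01 * p0[j])  # finite-difference sensitivity

# Cosine similarity between sensitivity columns: values near +/-1 mean one
# parameter can compensate the other, so the pair is poorly identifiable.
Sn = S / np.linalg.norm(S, axis=0)
print(np.round(Sn.T @ Sn, 3))   # parameters 0 and 2 come out nearly collinear
```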

  6. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  7. Efficiency of different techniques to identify changes in land use

    Science.gov (United States)

    Zornoza, Raúl; Mataix-Solera, Jorge; Guerrero, César

    2013-04-01

    The need for the development of sensitive and efficient methodologies for soil quality evaluation is increasing. The ability to assess soil quality and identify key soil properties that serve as indicators of soil function is complicated by the multiplicity of physical, chemical and biological factors that control soil processes. In the mountain region of the Mediterranean Basin of Spain, almond trees have been cultivated in terraced orchards for centuries. These crops are immersed in the Mediterranean forest scenery, configuring a mosaic landscape where orchards are integrated into the forest masses. In recent decades, almond orchards have been abandoned, leading to an increase in vegetation cover, since abandoned fields are naturally colonized by the surrounding natural vegetation. Soil processes and properties are expected to be associated with vegetation successional dynamics. Thus, the establishment of suitable parameters to monitor soil quality related to land use changes is particularly important to guarantee the regeneration of the mature community. In this study, we selected three land uses: forest, almond orchards, and orchards abandoned 10 to 15 years prior to sampling. Sampling was carried out at four different locations in SE Spain. The main purpose was to evaluate whether changes in management have significantly influenced different sets of soil characteristics. For this purpose, we used discriminant analysis (DA). The different sets of soil characteristics tested in this study were 1: physical, chemical and biochemical properties; 2: soil near infrared (NIR) spectra; and 3: phospholipid fatty acids (PLFAs). After the DA performed with sets 1 and 2, the three land uses were clearly separated by the first two discriminant functions, and more than 85 % of the samples were correctly classified (grouped). Using sets 3 and 4 for DA resulted in a slightly better separation of land uses, being more than 85% of the

  8. Digital Fourier analysis advanced techniques

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to advanced digital Fourier analysis for advanced undergraduate and graduate students. Assuming knowledge of the Fast Fourier Transform, this book covers advanced topics including the Hilbert transform, cepstrum analysis, and the two-dimensional Fourier transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Advanced Techniques" includes practice problems and thorough Appendices. As a central feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. The applet source code in Visual Basic is provided online, enabling advanced students to tweak and change the programs for more sophisticated results. A complete, intuitive guide, "Digital Fourier Analysis - Advanced Techniques" is an essential reference for students in science and engineering.

  9. A first countercheck trial to identify irradiated spices with luminescence techniques

    International Nuclear Information System (INIS)

    The Federal Health Office, in collaboration with 4 institutions responsible for food control analysis and 2 research facilities, for the first time conducted a countercheck trial to identify irradiated spices. This test was mainly intended to find out whether chemiluminescence (CL) and thermoluminescence (TL) techniques are appropriate for this purpose. Nine different spices were selected. Approximately 85% of the samples may be identified correctly if both methods are used. However, only 3% of all spices subjected to CL analysis were falsely identified as irradiated, and the proportion of false-positive results is even lower than 1% if only those spices are considered for which CL analysis is well suited. If an additional procedure, such as TL, is applied, it is highly probable that false identification of non-irradiated samples may be excluded. (orig./PW)

  10. Comparison of different techniques to identify similar strains of pseudomonas

    Czech Academy of Sciences Publication Activity Database

    Kubesová, Anna; Horká, Marie; Horký, J.; Matoušková, H.; Šlais, Karel

    2011. P1-G-206-TU. ISBN 978-963-89335-0-8. [International Symposium on High-Performance Liquid Phase Separations and Related Techniques /36./. 19.06.2011-23.06.2011, Budapest] R&D Projects: GA AV ČR IAAX00310701 Institutional research plan: CEZ:AV0Z40310501 Keywords : microbial strains * MALDI-TOF * gel IEF Subject RIV: CB - Analytical Chemistry, Separation

  11. A saltwater flotation technique to identify unincubated eggs

    Science.gov (United States)

    Devney, C.A.; Kondrad, S.L.; Stebbins, K.R.; Brittingham, K.D.; Hoffman, D.J.; Heinz, G.H.

    2009-01-01

    Field studies on nesting birds sometimes involve questions related to nest initiation dates, length of the incubation period, or changes in parental incubation behavior during various stages of incubation. Some of this information can be best assessed when a nest is discovered before the eggs have undergone any incubation, and this has traditionally been assessed by floating eggs in freshwater. However, because the freshwater method is not particularly accurate in identifying unincubated eggs, we developed a more reliable saltwater flotation method. The saltwater method involves diluting a saturated saltwater solution with freshwater until a salt concentration is reached where unincubated eggs sink to the bottom and incubated eggs float to the surface. For Laughing Gulls (Leucophaeus atricilla), floating eggs in freshwater failed to identify 39.0% (N = 251) of eggs that were subsequently found by candling to have undergone incubation prior to collection. By contrast, in a separate collection of gull eggs, no eggs that passed the saltwater test (N = 225) were found by a later candling to have been incubated prior to collection. For Double-crested Cormorants (Phalacrocorax auritus), floating eggs in freshwater failed to identify 15.6% (N = 250) of eggs that had undergone incubation prior to collection, whereas in a separate collection, none of the eggs that passed the saltwater test (N = 85) were found by a later candling to have been incubated prior to collection. Immersion of eggs in saltwater did not affect embryo survival. Although use of the saltwater method is likely limited to colonial species and requires calibrating a saltwater solution, it is a faster and more accurate method of identifying unincubated eggs than the traditional method of floating eggs in freshwater.

  12. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content of shredded sugar cane. (U.K.)

  13. Identifying irradiated flours by photo-stimulated luminescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Ramli, Ros Anita Ahmad; Yasir, Muhamad Samudi [Faculty of Science and Technology, National University of Malaysia, Bangi, 43000 Kajang, Selangor (Malaysia); Othman, Zainon; Abdullah, Wan Saffiey Wan [Malaysian Nuclear Agency, Bangi 43000 Kajang, Selangor (Malaysia)

    2014-02-12

    The photo-stimulated luminescence (PSL) technique was used in this study to detect gamma irradiation treatment of five types of flours (corn, rice, tapioca, wheat and glutinous rice) at four different doses: 0, 0.2, 0.5 and 1 kGy. The signal level was compared with two threshold values (700 and 5000 counts/60 s). With the exception of glutinous rice, all irradiated samples produced a strong signal above the upper threshold. All control samples produced negative results, with signals below the lower threshold, suggesting that the samples had not been irradiated. Irradiated glutinous rice samples produced intermediate signals (700 - 5000 counts/60 s), which were subsequently confirmed using calibrated PSL. The PSL signals remained stable after 90 days of storage. The findings of this study will be useful to facilitate control of food irradiation applications in Malaysia.
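
    The two-threshold screening protocol above maps directly onto a small decision rule; a sketch using the stated 700 and 5000 counts/60 s thresholds:

```python
def psl_screen(counts_per_60s, lower=700, upper=5000):
    # Classify a PSL signal per the two-threshold screening protocol.
    if counts_per_60s < lower:
        return "negative (not irradiated)"
    if counts_per_60s > upper:
        return "positive (irradiated)"
    return "intermediate - confirm with calibrated PSL"

for signal in (350, 2400, 61000):   # illustrative readings
    print(signal, "->", psl_screen(signal))
```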

  14. The application of data mining techniques in analysis the stock portfolio in order to identify common patterns in the behavior of shareholders (Case study of selected brokers in Mazandaran province)

    OpenAIRE

    GHASEM SOLTANLO, Sara Saadati; SADRABADI, Alireza Naser

    2015-01-01

    In this study, we analyzed shareholders' stock portfolios in order to identify common patterns in shareholder behavior. The required information about shareholders' portfolios was collected from the selected brokers in Mazandaran province / Sari city. This information includes demographic data, such as gender, occupation and education, along with the basket of shares purchased during 2013. Data were collected for 150 shares that during this period have traded at l...

  15. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  16. Lidar point density analysis: implications for identifying water bodies

    Science.gov (United States)

    Worstell, Bruce B.; Poppenga, Sandra; Evans, Gayla A.; Prince, Sandra

    2014-01-01

    Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns on water bodies in the resulting void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated a 5-meter radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions to flatten water features and to force channel flow in the downstream direction also are presented.
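
    The moving-window void test reported above (fewer than 23 returns within a 5-meter radius) can be sketched with a KD-tree neighbor count; the synthetic point cloud and grid spacing are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def void_mask(points_xy, grid_xy, radius=5.0, min_returns=23):
    # Flag grid cells whose 5 m neighborhood holds fewer than min_returns
    # lidar returns (the threshold reported in the study).
    tree = cKDTree(points_xy)
    counts = tree.query_ball_point(grid_xy, r=radius, return_length=True)
    return counts < min_returns

# Fake point cloud over a 100 m square with an empty "lake" in the middle.
rng = np.random.default_rng(5)
pts = rng.uniform(0, 100, (20000, 2))
lake = (pts[:, 0] > 40) & (pts[:, 0] < 60) & (pts[:, 1] > 40) & (pts[:, 1] < 60)
pts = pts[~lake]
gx, gy = np.mgrid[0:100:2, 0:100:2]
grid = np.column_stack([gx.ravel(), gy.ravel()])
print("void cells:", void_mask(pts, grid).sum())
```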

  17. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
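
    A minimal sketch of mining such a dependency tree, regressing a KPI on lower-level process and QoS metrics with a shallow decision tree; the metrics and data are fabricated, and the framework in the paper is considerably richer.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Fake monitoring table: one row per process instance (all values synthetic).
rng = np.random.default_rng(11)
n = 2000
service_latency = rng.gamma(2.0, 50.0, n)       # ms, a QoS metric
queue_length = rng.poisson(5, n)                # a process metric
retries = rng.binomial(3, 0.1, n)               # a process metric
kpi = 1.5 * service_latency + 120 * retries + rng.normal(0, 20, n)

X = np.column_stack([service_latency, queue_length, retries])
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, kpi)

# The printed tree plays the role of the dependency tree: its splits show
# which lower-level metrics explain missed KPI targets, enabling drill-down.
print(export_text(tree, feature_names=["service_latency", "queue_length", "retries"]))
```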

  18. Identifying marker typing incompatibilities in linkage analysis.

    OpenAIRE

    Stringham, H M; Boehnke, M.

    1996-01-01

    A common problem encountered in linkage analyses is that execution of the computer program is halted because of genotypes in the data that are inconsistent with Mendelian inheritance. Such inconsistencies may arise because of pedigree errors or errors in typing. In some cases, the source of the inconsistencies is easily identified by examining the pedigree. In others, the error is not obvious, and substantial time and effort are required to identify the responsible genotypes. We have develope...

  1. Probabilistic risk assessment techniques help in identifying optimal equipment design for in-situ vitrification

    International Nuclear Information System (INIS)

    The analysis discussed in this paper was performed as part of the buried waste remediation efforts at the Idaho National Engineering Laboratory (INEL). The specific type of remediation discussed herein involves a thermal treatment process for converting contaminated soil and waste into a stable, chemically-inert form. Models of the proposed process were developed using probabilistic risk assessment (PRA) fault tree and event tree modeling techniques. The models were used to determine the appropriateness of the conceptual design by identifying potential hazards of system operations. Additional models were developed to represent the reliability aspects of the system components. By performing various sensitivity analyses with the models, optimal design modifications are being identified to substantiate an integrated, cost-effective design representing minimal risk to the environment and/or public with maximum component reliability. 4 figs
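
    For readers unfamiliar with fault tree quantification, a minimal cut-set sketch follows; the events, probabilities and cut sets are invented for illustration and are not from the INEL models.

```python
from math import prod

def top_event_probability(cut_sets, p):
    # P(top event) from minimal cut sets, assuming independent basic events.
    # Returns the rare-event (first-order) approximation and the min-cut
    # upper bound.
    cut_p = [prod(p[e] for e in cs) for cs in cut_sets]
    return sum(cut_p), 1 - prod(1 - q for q in cut_p)

# Illustrative basic events for a thermal-treatment system (made-up numbers).
p = {"power_loss": 1e-3, "backup_fail": 5e-2,
     "valve_stuck": 2e-3, "sensor_fail": 1e-2}
cuts = [("power_loss", "backup_fail"), ("valve_stuck", "sensor_fail")]
print(top_event_probability(cuts, p))
```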

  2. Using factor analysis to identify neuromuscular synergies during treadmill walking

    Science.gov (United States)

    Merkle, L. A.; Layne, C. S.; Bloomberg, J. J.; Zhang, J. J.

    1998-01-01

    Neuroscientists are often interested in grouping variables to facilitate understanding of a particular phenomenon. Factor analysis is a powerful statistical technique that groups variables into conceptually meaningful clusters, but remains underutilized by neuroscience researchers presumably due to its complicated concepts and procedures. This paper illustrates an application of factor analysis to identify coordinated patterns of whole-body muscle activation during treadmill walking. Ten male subjects walked on a treadmill (6.4 km/h) for 20 s during which surface electromyographic (EMG) activity was obtained from the left side sternocleidomastoid, neck extensors, erector spinae, and right side biceps femoris, rectus femoris, tibialis anterior, and medial gastrocnemius. Factor analysis revealed 65% of the variance of seven muscles sampled aligned with two orthogonal factors, labeled 'transition control' and 'loading'. These two factors describe coordinated patterns of muscular activity across body segments that would not be evident by evaluating individual muscle patterns. The results show that factor analysis can be effectively used to explore relationships among muscle patterns across all body segments to increase understanding of the complex coordination necessary for smooth and efficient locomotion. We encourage neuroscientists to consider using factor analysis to identify coordinated patterns of neuromuscular activation that would be obscured using more traditional EMG analyses.
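
    A minimal sketch of the factor-analysis step on a (windows x muscles) EMG matrix; the synthetic data, two-factor choice and varimax rotation are assumptions mirroring the study design, not its actual processing chain.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

muscles = ["sternocleidomastoid", "neck_extensors", "erector_spinae",
           "biceps_femoris", "rectus_femoris", "tibialis_anterior",
           "medial_gastrocnemius"]

# Synthetic stand-in for time-windowed EMG envelopes: two latent "synergies"
# drive correlated activity across the seven muscles.
rng = np.random.default_rng(2)
synergies = rng.standard_normal((500, 2))
loading = rng.uniform(-1, 1, (2, 7))
X = StandardScaler().fit_transform(
    synergies @ loading + 0.3 * rng.standard_normal((500, 7)))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
for muscle, row in zip(muscles, fa.components_.T):
    print(f"{muscle:22s} factor loadings: {np.round(row, 2)}")
```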

  3. Identifiable Data Files - Medicare Provider Analysis and ...

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Provider Analysis and Review (MEDPAR) File contains data from claims for services provided to beneficiaries admitted to Medicare certified inpatient...

  4. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPGs' greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by (1) the regularity in the release time of client commands, (2) the trend and magnitude of traffic burstiness in multiple time scales, and (3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
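
    Findings (1) and (2) suggest simple features; the sketch below computes the coefficient of variation of inter-command intervals (regularity) and Fano factors over several window sizes (multi-scale burstiness) for simulated human-like and bot-like command streams. The feature definitions are illustrative, not the paper's exact metrics.

```python
import numpy as np

def regularity_cv(timestamps):
    # Coefficient of variation of inter-command intervals: near-periodic
    # bot traffic yields a much lower CV than human play.
    gaps = np.diff(np.sort(timestamps))
    return gaps.std() / gaps.mean()

def fano_factors(timestamps, horizon, scales=(1, 5, 30)):
    # Burstiness across time scales: variance/mean of counts per window.
    out = []
    for w in scales:
        counts, _ = np.histogram(timestamps, bins=np.arange(0, horizon + w, w))
        out.append(counts.var() / counts.mean())
    return out

rng = np.random.default_rng(9)
human = np.cumsum(rng.exponential(2.0, 1000))              # bursty, irregular
bot = np.cumsum(2.0 + 0.01 * rng.standard_normal(1000))    # near-periodic
for name, ts in (("human", human), ("bot", bot)):
    print(name, round(regularity_cv(ts), 3), np.round(fano_factors(ts, ts[-1]), 2))
```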

  5. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at 5 levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  6. Comparative analysis of identifiable fragments of forages by the microhistological technique

    Directory of Open Access Journals (Sweden)

    Maristela de Oliveira Bauer

    2005-12-01

    The objective of this study was to use the microhistological technique to verify differences among forage species in the percentage of identifiable fragments, as affected by the digestion process and the season of the year. Fresh, recently expanded leaf laminas, corresponding to the last and next-to-last positions on the tiller, of Melinis minutiflora Pal. de Beauv (molassesgrass), Hyparrhenia rufa (Nees) Stapf. (jaraguagrass), Brachiaria decumbens Stapf. (signalgrass), Imperata brasiliensis Trin. (sapegrass), Medicago sativa L. (alfalfa) and Schinus terebenthifolius Raddi (aroeira), sampled in the rainy and dry seasons, were digested in vitro and prepared according to the microhistological technique. The species showed marked differences in the percentage of identifiable fragments, and digestion altered these percentages by around 10%; the sampling period did not influence the percentage of identifiable fragments for most species; the presence of pigments and the adhesion of the epidermis to the cells of the internal leaf tissues hindered fragment identification; and digestion improved the visualization of fragments of sapegrass, jaraguagrass and aroeira, but hindered that of signalgrass and, especially, alfalfa.

  7. Identifying the sources of produced water in the oil field by isotopic techniques

    International Nuclear Information System (INIS)

    The objective of this study was to identify the sources of the formation water in the Southwest Su Tu Den (STD SW) basement reservoir. To achieve this objective, isotopic techniques were applied along with geochemical analysis of chloride, bromide and strontium dissolved in the water. The isotopic techniques used in this study were the determination of the stable isotope signatures of water (δ2H and δ18O) and of the 87Sr/86Sr ratio of strontium in rock cutting samples and dissolved in the formation water. The results showed that the stable isotope composition of water in the Lower Miocene was -3‰ for δ18O and -23‰ for δ2H, indicating the primeval nature of seawater in the reservoir. Meanwhile, the isotopic composition of water in the basement clustered in a range of altered freshwater, with δ18O and δ2H being -(3-4)‰ and -(54-60)‰, respectively. The strontium isotope ratio of water in the Lower Miocene reservoir was lower than that of water in the basement, confirming the different natures of the water in the two reservoirs. The results assure the applicability of the techniques, and it is recommended that studies on the identification of the flow path of the formation water in the STD SW basement reservoir be continued. (author)
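
    The source attribution above rests on standard end-member reasoning; for reference, the usual two-end-member mixing relation for a conservative tracer such as δ18O (not stated in the abstract) is:

```latex
\delta_{\mathrm{mix}} = f\,\delta_{A} + (1-f)\,\delta_{B}
\quad\Longrightarrow\quad
f = \frac{\delta_{\mathrm{mix}} - \delta_{B}}{\delta_{A} - \delta_{B}}
```

    where f is the fraction of water from end member A (e.g., primeval seawater) and 1 - f the fraction from end member B (e.g., the freshwater signature found in the basement).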

  8. Vector Autoregressive Techniques for Structural Analysis

    Directory of Open Access Journals (Sweden)

    Paul L. Fackler

    1988-03-01

    Full Text Available Vector Autoregressive (VAR) models which do not rely on a recursive model structure are discussed. Linkages to traditional dynamic simultaneous equations models are developed which emphasize the nature of the identifying restrictions that characterize VAR models. Explicit expressions for the score and information functions are derived, and their role in model identification, estimation and hypothesis testing is discussed.
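
    As an illustrative sketch only (not the author's code), a reduced-form VAR of the kind discussed above can be fitted in Python with statsmodels; the series and lag order here are hypothetical:

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        # Hypothetical stationary series standing in for real macro data.
        rng = np.random.default_rng(0)
        data = pd.DataFrame(rng.normal(size=(200, 2)), columns=["dy", "pi"])

        model = VAR(data)
        results = model.fit(2)        # reduced-form VAR(2)
        print(results.summary())

        # Impulse responses; a structural interpretation still requires
        # identifying restrictions (e.g. a recursive/Cholesky ordering).
        irf = results.irf(10)
        print(irf.irfs.shape)         # (periods + 1, neqs, neqs)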

  9. Adhesive polypeptides of Staphylococcus aureus identified using a novel secretion library technique in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Holm Liisa

    2011-05-01

    Full Text Available Abstract Background Bacterial adhesive proteins, called adhesins, are frequently the decisive factor in initiation of a bacterial infection. Characterization of such molecules is crucial for the understanding of bacterial pathogenesis, design of vaccines and development of antibacterial drugs. Because adhesins are frequently difficult to express, their characterization has often been hampered. Alternative expression methods developed for the analysis of adhesins, e.g. surface display techniques, suffer from various drawbacks, and reports on high-level extracellular secretion of heterologous proteins in Gram-negative bacteria are scarce. These expression techniques are currently a field of active research. The purpose of the current study was to construct a convenient, new technique for identification of unknown bacterial adhesive polypeptides directly from the growth medium of the Escherichia coli host and to identify novel proteinaceous adhesins of the model organism Staphylococcus aureus. Results Randomly fragmented chromosomal DNA of S. aureus was cloned into a unique restriction site of our expression vector, which facilitates secretion of foreign FLAG-tagged polypeptides into the growth medium of E. coli ΔfliCΔfliD, to generate a library of 1663 clones expressing FLAG-tagged polypeptides. Sequence and bioinformatics analyses showed that in our example, the library covered approximately 32% of the S. aureus proteome. Polypeptides from the growth medium of the library clones were screened for binding to a selection of S. aureus target molecules, and adhesive fragments of known staphylococcal adhesins (e.g. coagulase and fibronectin-binding protein A) as well as polypeptides of novel function (e.g. a universal stress protein and phosphoribosylamino-imidazole carboxylase ATPase subunit) were detected. The results were further validated using purified His-tagged recombinant proteins of the corresponding fragments in enzyme-linked immunoassay and

  10. Use of discriminant analysis to identify propensity for purchasing properties

    Directory of Open Access Journals (Sweden)

    Ricardo Floriani

    2015-03-01

    Full Text Available Properties usually represent a milestone for people and families due to their high added value when compared with family income. The objective of this study is the proposition of a discrimination model, by a discriminant analysis, of people with characteristics (according to independent variables) classifying them as potential buyers of properties, as well as to identify the intended use of such property: housing, leisure (such as a cottage or beach house), and/or investment. Thus, the following research question is proposed: What are the characteristics that best describe the profile of people who intend to acquire properties? The study is justified by its economic relevance to the real estate industry, as well as to players in the real estate market, who may develop products based on the profile of potential customers. As a statistical technique, discriminant analysis was applied to the data gathered by a questionnaire, which was sent via e-mail. Three hundred and thirty-four responses were gathered. Based on this study, it was observed that it is possible to identify the intention to acquire properties, as well as the purpose of the acquisition, whether for housing or investment.
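
    A minimal sketch of the kind of discriminant analysis described, with invented questionnaire features and labels (not the authors' data or model):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Hypothetical respondent features: age, income bracket, household size.
        X = np.array([[25, 2, 1], [42, 5, 4], [31, 3, 2], [55, 6, 3],
                      [29, 2, 2], [48, 5, 5], [36, 4, 3], [61, 6, 2]])
        # 1 = intends to buy a property, 0 = does not.
        y = np.array([0, 1, 0, 1, 0, 1, 1, 1])

        lda = LinearDiscriminantAnalysis()
        lda.fit(X, y)

        # Coefficients indicate which characteristics separate potential buyers.
        print(lda.coef_)
        print(lda.predict([[40, 4, 3]]))   # classify a new respondent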

  11. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of the different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  12. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the prime movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors has been characterized
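
    The linear amplitude demodulation mentioned above can be sketched in Python with a Hilbert-transform envelope; the signal here is synthetic and the load-modulation frequency hypothetical, not data from the ORNL method:

        import numpy as np
        from scipy.signal import hilbert

        fs = 10_000                         # sample rate, Hz
        t = np.arange(0, 1.0, 1 / fs)
        # 60 Hz line current whose amplitude is modulated at 7 Hz by a varying load.
        current = (1.0 + 0.05 * np.sin(2 * np.pi * 7 * t)) * np.sin(2 * np.pi * 60 * t)

        # Amplitude demodulation: the envelope of the analytic signal.
        envelope = np.abs(hilbert(current))

        # A spectrum of the envelope exposes the 7 Hz load signature.
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
        print(freqs[spectrum.argmax()])     # ~7 Hz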

  13. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  14. Time-Action Analysis (TAA) of the Surgical Technique Implanting the Collum Femoris Preserving (CFP) Hip Arthroplasty. TAASTIC trial Identifying pitfalls during the learning curve of surgeons participating in a subsequent randomized controlled trial (An observational study)

    OpenAIRE

    Runne Wouter C; Bhandari Mohit; Schafroth Matthias U; van Oldenrijk Jakob; Poolman Rudolf W

    2008-01-01

    Abstract Background Two types of methods are used to assess learning curves: outcome assessment and process assessment. Outcome measures are usually dichotomous rare events like complication rates and survival or require an extensive follow-up and are therefore often inadequate to monitor individual learning curves. Time-action analysis (TAA) is a tool to objectively determine the level of efficiency of individual steps of a surgical procedure. Methods/Design We are currently using TAA to det...

  15. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS system, and its applications

  16. Modern techniques of trend analysis and interpolation

    Directory of Open Access Journals (Sweden)

    L. TORELLI

    1975-05-01

    Full Text Available This article contains a schematic exposition of the theoretical framework on which recent techniques of trend analysis and interpolation rest. It is shown that such techniques consist in the joint application of Analysis of Variance and of Multivariate Distribution Analysis. The theory of Universal Kriging by G. Matheron is also discussed and reduced to the above theories.
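
    For reference, the kriging estimator underlying these techniques has the standard linear form (textbook notation, not reproduced from the article):

        \hat{Z}(x_0) = \sum_{i=1}^{n} \lambda_i Z(x_i), \qquad \sum_{i=1}^{n} \lambda_i f_k(x_i) = f_k(x_0) \quad \text{for each drift basis function } f_k,

    where the weights \lambda_i minimize the estimation variance subject to these unbiasedness constraints; in Universal Kriging the f_k are the polynomial trend (drift) terms.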

  17. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  18. Using Linguistic Information and Machine Learning Techniques to Identify Entities from Juridical Documents

    OpenAIRE

    Gonçalves, Teresa; Quaresma, Paulo

    2010-01-01

    Information extraction from legal documents is an important and open problem. A mixed approach, using linguistic information and machine learning techniques, is described in this paper. In this approach, top-level legal concepts are identified and used for document classification using Support Vector Machines. Named entities, such as locations, organizations, dates, and document references, are identified using semantic information from the output of a natural language parser. This infor...

  19. Simplified Microarray Technique for Identifying mRNA in Rare Samples

    Science.gov (United States)

    Almeida, Eduardo; Kadambi, Geeta

    2007-01-01

    Two simplified methods of identifying messenger ribonucleic acid (mRNA), and compact, low-power apparatuses to implement the methods, are at the proof-of-concept stage of development. These methods are related to traditional methods based on hybridization of nucleic acid, but whereas the traditional methods must be practiced in laboratory settings, these methods could be practiced in field settings. Hybridization of nucleic acid is a powerful technique for detection of specific complementary nucleic acid sequences, and is increasingly being used for detection of changes in gene expression in microarrays containing thousands of gene probes. A traditional microarray study entails at least the following six steps: 1. Purification of cellular RNA, 2. Amplification of complementary deoxyribonucleic acid [cDNA] by polymerase chain reaction (PCR), 3. Labeling of cDNA with fluorophores of Cy3 (a green cyanine dye) and Cy5 (a red cyanine dye), 4. Hybridization to a microarray chip, 5. Fluorescence scanning the array(s) with dual excitation wavelengths, and 6. Analysis of the resulting images. This six-step procedure must be performed in a laboratory because it requires bulky equipment.

  20. Time-Action Analysis (TAA of the Surgical Technique Implanting the Collum Femoris Preserving (CFP Hip Arthroplasty. TAASTIC trial Identifying pitfalls during the learning curve of surgeons participating in a subsequent randomized controlled trial (An observational study

    Directory of Open Access Journals (Sweden)

    Runne Wouter C

    2008-06-01

    Full Text Available Abstract Background Two types of methods are used to assess learning curves: outcome assessment and process assessment. Outcome measures are usually dichotomous rare events like complication rates and survival, or require an extensive follow-up, and are therefore often inadequate to monitor individual learning curves. Time-action analysis (TAA) is a tool to objectively determine the level of efficiency of individual steps of a surgical procedure. Methods/Design We are currently using TAA to determine the number of cases needed for surgeons to reach proficiency with a new innovative hip implant prior to initiating a multicentre RCT. By analysing the unedited video recordings of the first 20 procedures of each surgeon, the number and duration of the actions needed for a surgeon to achieve his goal and the efficiency of these actions are measured. We constructed a taxonomy, or list of actions, which together describe the complete surgical procedure. In the taxonomy we categorised the procedure into 5 different Goal Oriented Phases (GOP): 1. the incision phase; 2. the femoral phase; 3. the acetabulum phase; 4. the stem phase; 5. the closure phase. Each GOP was subdivided into Goal Oriented Actions (GOA) and each GOA was subdivided into Separate Actions (SA), thereby defining all the necessary actions to complete the procedure. We grouped the SAs into GOAs since it would not be feasible to measure each SA. Using the video recordings, the duration of each GOA was recorded as well as the amount of delay. Delay consists of repetitions, waiting and additional actions. The net GOA time is the total GOA time minus delay and is a representation of the level of difficulty of each procedure. Efficiency is the percentage of net GOA time during each procedure. Discussion This allows the construction of individual learning curves, assessment of the final skill level for each surgeon and comparison of different surgeons prior to participation in an RCT. We believe an objective and
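
    The efficiency measure defined above (net GOA time as a share of total time) reduces to a small calculation; the durations below are invented for illustration:

        # Durations in seconds for one Goal Oriented Action (GOA), hypothetical data.
        goa_total = 420.0          # total time spent on the GOA
        repetitions = 35.0         # time lost repeating actions
        waiting = 20.0             # idle time
        additional = 15.0          # unplanned extra actions

        delay = repetitions + waiting + additional
        net_goa = goa_total - delay            # net GOA time
        efficiency = 100.0 * net_goa / goa_total

        print(f"net GOA time: {net_goa:.0f} s, efficiency: {efficiency:.1f}%")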

  1. Event tree analysis using artificial intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs.

  2. Proof Analysis: A Technique for Concept Formation

    OpenAIRE

    Bundy, Alan

    1985-01-01

    We report the discovery of an unexpected connection between the invention of the concept of uniform convergence and the occurs check in the unification algorithm. This discovery suggests the invention of further interesting concepts in analysis and a technique for automated concept formation. Part of this technique has been implemented. The discovery arose as part of an attempt to understand the role of proof analysis in mathematical reasoning, so as to incorporate it into a computer program. ...

  3. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  4. Nuclear analysis techniques and environmental science

    International Nuclear Information System (INIS)

    The features, developing trends and some frontier topics of nuclear analysis techniques and their applications in environmental science are reviewed, including the study of chemical speciation and environmental toxicology, microanalysis and identification of atmospheric particles, nuclear analysis methodology with high accuracy and quality assurance of environmental monitoring, super-sensitive nuclear analysis methods and adduct formation between toxicants and DNA, environmental specimen banking at the nuclear analysis centre and biological environmental monitoring, etc

  5. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule data mining technique…
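
    A generic association-rule mining step of the sort named above can be sketched directly in Python; the error categories and thresholds below are invented, and this hand-rolled counting stands in for whatever mining tool the authors actually used:

        from itertools import combinations

        # One row per student; each set holds the error types observed (invented labels).
        sessions = [
            {"misparsed_relative_clause", "unknown_technical_term"},
            {"misparsed_relative_clause", "unknown_technical_term", "verb_tense_error"},
            {"unknown_technical_term"},
            {"misparsed_relative_clause", "unknown_technical_term", "verb_tense_error"},
            {"misparsed_relative_clause", "verb_tense_error"},
            {"verb_tense_error"},
        ]

        def support(itemset):
            """Fraction of sessions containing every item in the set."""
            return sum(itemset <= s for s in sessions) / len(sessions)

        # Association rules A -> B among single error types, filtered on
        # minimum support and confidence (thresholds are arbitrary here).
        items = sorted(set().union(*sessions))
        for a, b in combinations(items, 2):
            for ante, cons in ((a, b), (b, a)):
                supp = support({ante, cons})
                conf = supp / support({ante})
                if supp >= 0.5 and conf >= 0.7:
                    print(f"{ante} -> {cons}: support={supp:.2f}, confidence={conf:.2f}")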

  6. Identifying redundancy and exposing provenance in crowdsourced data analysis.

    Science.gov (United States)

    Willett, Wesley; Ginosar, Shiry; Steinitz, Avital; Hartmann, Björn; Agrawala, Maneesh

    2013-12-01

    We present a system that lets analysts use paid crowd workers to explore data sets and helps analysts interactively examine and build upon workers' insights. We take advantage of the fact that, for many types of data, independent crowd workers can readily perform basic analysis tasks like examining views and generating explanations for trends and patterns. However, workers operating in parallel can often generate redundant explanations. Moreover, because workers have different competencies and domain knowledge, some responses are likely to be more plausible than others. To efficiently utilize the crowd's work, analysts must be able to quickly identify and consolidate redundant responses and determine which explanations are the most plausible. In this paper, we demonstrate several crowd-assisted techniques to help analysts make better use of crowdsourced explanations: (1) We explore crowd-assisted strategies that utilize multiple workers to detect redundant explanations. We introduce color clustering with representative selection--a strategy in which multiple workers cluster explanations and we automatically select the most-representative result--and show that it generates clusterings that are as good as those produced by experts. (2) We capture explanation provenance by introducing highlighting tasks and capturing workers' browsing behavior via an embedded web browser, and refine that provenance information via source-review tasks. We expose this information in an explanation-management interface that allows analysts to interactively filter and sort responses, select the most plausible explanations, and decide which to explore further. PMID:24051786

  7. Efficacy of fractal analysis in identifying glaucomatous damage

    Science.gov (United States)

    Kim, P. Y.; Iftekharuddin, K. M.; Gunvant, P.; Tóth, M.; Holló, G.; Essock, E. A.

    2010-02-01

    In this work, we propose a novel fractal-based technique to analyze pseudo-2D representations of 1D retinal nerve fiber layer (RNFL) thickness measurement data vectors for early detection of glaucoma. In our proposed technique, we first convert the 1D RNFL data vectors into pseudo-2D images and then exploit a 2D fractal analysis (FA) technique to obtain the representative features. These 2D fractal-based features are further processed using principal component analysis (PCA), and the final classification between normal and glaucomatous eyes is obtained using Fisher's linear discriminant analysis (LDA). An independent dataset is used for training and testing the classifier. The technique is used on randomly selected GDx variable corneal compensator (VCC) eye data from 227 study participants (116 patients with glaucoma and 111 patients with healthy eyes). We compute sensitivity, specificity and area under the receiver operating curve (AUROC) for statistical performance comparison with other known techniques. Our classification performance shows that the fractal-based technique is superior to the standard machine classifier, the Nerve Fiber Indicator (NFI).
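
    A rough sketch of that pipeline, with a naive box-counting fractal dimension standing in for the paper's 2D FA features; all data, names and parameters here are hypothetical:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def box_count_dimension(img, sizes=(2, 4, 8, 16)):
            """Naive box-counting fractal dimension of a binary image."""
            counts = []
            for s in sizes:
                h, w = img.shape[0] // s * s, img.shape[1] // s * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(1)
        X, y = [], []
        for label in (0, 1):               # 0 = healthy, 1 = glaucoma (synthetic)
            for _ in range(30):
                profile = rng.normal(60 - 15 * label, 5, size=64)   # fake RNFL vector
                # Pseudo-2D image: level crossings of the 1D profile.
                image = profile[None, :] > np.linspace(20, 90, 32)[:, None]
                X.append([box_count_dimension(image), image.mean()])
                y.append(label)

        features = PCA(n_components=2).fit_transform(np.array(X))
        clf = LinearDiscriminantAnalysis().fit(features, y)
        print(clf.score(features, y))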

  8. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, the subjective probability distributions assigned to the input parameters, and the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
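
    As a sketch of the Kruskal-Wallis screening idea mentioned above (synthetic dose samples, not SYVAC output):

        import numpy as np
        from scipy.stats import kruskal

        rng = np.random.default_rng(2)
        params = rng.uniform(size=500)                         # one sampled input parameter
        doses = np.exp(params * 2 + rng.normal(0, 0.5, 500))   # model output

        # Bin the parameter into terciles and test whether dose distributions differ.
        bins = np.quantile(params, [1 / 3, 2 / 3])
        groups = [doses[params <= bins[0]],
                  doses[(params > bins[0]) & (params <= bins[1])],
                  doses[params > bins[1]]]
        stat, p = kruskal(*groups)
        print(f"H = {stat:.1f}, p = {p:.3g}")   # small p: dose is sensitive to this parameter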

  9. A Visualization System Using Data Mining Techniques for Identifying Information Sources.

    Science.gov (United States)

    Fowler, Richard H.; Karadayi, Tarkan; Chen, Zhixiang; Meng, Xiannong; Fowler, Wendy A. Lawrence

    The Visual Analysis System (VAS) was developed to couple emerging successes in data mining with information visualization techniques in order to create a richly interactive environment for information retrieval from the World Wide Web. VAS's retrieval strategy operates by first using a conventional search engine to form a core set of retrieved…

  10. Molecular technique identifies the pathogen responsible for culture negative infective endocarditis

    OpenAIRE

    SHIN, G. Y.; Manuel, R J; Ghori, S; Brecker, S; Breathnach, A. S.

    2005-01-01

    A case of culture negative endocarditis complicated by immune complex glomerulonephritis and severe aortic regurgitation necessitated aortic valve replacement. Empirical treatment with penicillin and gentamicin according to UK guidelines was started. The pathogen, Streptococcus sanguis, was later identified by polymerase chain reaction amplification and sequencing of bacterial 16S ribosomal RNA. This molecular technique is likely to be of increasing importance in determining the aetiology of ...

  11. Analysis and comparation of animation techniques

    OpenAIRE

    Joštová, Barbora

    2015-01-01

    This thesis is focused on the analysis and comparison of animation techniques. In the theoretical part of the thesis I define key terms, the historical development and the basic principles of animation techniques. In the practical part I describe the comparison between classic and digital types of animation. Based on this research I chose the most suitable animations, which are further used to verify my hypothesis. The proposed hypothesis is an ordering of the techniques based on how demanding each is in terms of...

  12. Clustering Analysis within Text Classification Techniques

    OpenAIRE

    Madalina ZURINI; Catalin SBORA

    2011-01-01

    The paper presents a personal approach to the main applications of classification in the knowledge-based society, by means of methods and techniques widely spread in the literature. Text classification is covered in chapter two, where the main techniques used are described, along with an integrated taxonomy. The transition is made through the concept of spatial representation. Having the elementary elements of geometry and the artificial intelligence analysis,...

  13. Equivalent Dynamic Stiffness Mapping technique for identifying nonlinear structural elements from frequency response functions

    Science.gov (United States)

    Wang, X.; Zheng, G. T.

    2016-02-01

    A simple and general Equivalent Dynamic Stiffness Mapping technique is proposed for identifying the parameters or the mathematical model of a nonlinear structural element from steady-state primary harmonic frequency response functions (FRFs). The Equivalent Dynamic Stiffness is defined as the complex ratio between the internal force and the displacement response of the unknown element. Obtained from test data of response frequencies and amplitudes, the real and imaginary parts of the Equivalent Dynamic Stiffness are plotted as discrete points in a three-dimensional space over displacement amplitude and frequency, called the real and imaginary Equivalent Dynamic Stiffness maps, respectively. These points form a repeatable surface, as the Equivalent Dynamic Stiffness is only a function of the corresponding data, as derived in the paper. The mathematical model of the unknown element can then be obtained by surface-fitting these points with special functions selected using a priori knowledge of the nonlinearity type, or with ordinary polynomials if the type of nonlinearity is not known in advance. An important merit of this technique is its capability of dealing with strong nonlinearities exhibiting complicated frequency response behaviors such as jumps and breaks in resonance curves. In addition, this technique can also greatly simplify the test procedure. Besides requiring no pre-identification of the underlying linear parameters, the method uses the measured data of excitation forces and responses without requiring strict control of the excitation force during the test. The proposed technique is demonstrated and validated with four classical single-degree-of-freedom (SDOF) numerical examples and one experimental example. An application of this technique to the identification of nonlinearity in multiple-degree-of-freedom (MDOF) systems is also illustrated.
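
    A toy version of the mapping for a single unknown element (synthetic FRF data; a cubic-stiffness model is assumed purely for illustration, and the describing-function relation k + (3/4)k3*A^2 is used for the first harmonic):

        import numpy as np

        # Synthetic steady-state FRF test data for an unknown element: excitation
        # frequency w, response amplitude A, and the complex internal force F
        # it carried (all hypothetical measurements).
        w = np.linspace(5.0, 50.0, 40)
        A = 0.01 + 0.002 * np.sin(w / 5)                   # displacement amplitudes
        k_true, k3_true, c_true = 1e4, 5e7, 20.0
        F = (k_true + 0.75 * k3_true * A**2) * A + 1j * c_true * w * A

        # Equivalent Dynamic Stiffness: complex ratio of internal force to response.
        K_eq = F / A
        re, im = K_eq.real, K_eq.imag

        # Fit the real part over amplitude squared with a polynomial
        # (first-harmonic balance gives k + (3/4) k3 A^2 for a cubic spring).
        coef = np.polyfit(A**2, re, 1)
        print("k3 estimate:", coef[0] / 0.75, "k estimate:", coef[1])
        print("c estimate:", np.mean(im / w))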

  14. Gold analysis by the gamma absorption technique.

    Science.gov (United States)

    Kurtoglu, Arzu; Tugrul, A Beril

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement. PMID:12485656
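
    The underlying relation is the attenuation law I = I0·exp(-μm·ρ·t); a calibration-curve sketch with invented count rates (not the paper's measurements):

        import numpy as np

        # Hypothetical transmitted count rates through alloy standards of fixed
        # thickness, measured near the Au K absorption edge.
        gold_fraction = np.array([0.10, 0.25, 0.50, 0.75, 0.90])   # known standards
        I0 = 1.0e5                                                 # unattenuated rate
        I = np.array([7.9e4, 6.4e4, 4.5e4, 3.1e4, 2.5e4])          # measured rates

        # ln(I0/I) is linear in the attenuating gold content (Beer-Lambert law).
        attenuation = np.log(I0 / I)
        slope, intercept = np.polyfit(gold_fraction, attenuation, 1)

        # Read an unknown sample off the calibration curve.
        I_unknown = 5.2e4
        x_unknown = (np.log(I0 / I_unknown) - intercept) / slope
        print(f"estimated gold fraction: {x_unknown:.2f}")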

  15. Comparison of remote sensing image processing techniques to identify tornado damage areas from Landsat TM data

    Science.gov (United States)

    Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.P.

    2008-01-01

    Remote sensing techniques have been shown to be effective for large-scale damage surveys after a hazardous event, in both near real-time and post-event analyses. This paper aims to compare the accuracy of common image processing techniques in detecting tornado damage tracks from Landsat TM data. We employed the direct change detection approach, using two sets of images acquired before and after the tornado event, to produce principal component composite images and a set of image difference bands. Techniques in the comparison include supervised classification, unsupervised classification, and an object-oriented classification approach with a nearest neighbor classifier. Accuracy assessment is based on the Kappa coefficient calculated from error matrices which cross-tabulate correctly identified cells on the TM image against commission and omission errors in the result. Overall, the object-oriented approach exhibits the highest degree of accuracy in tornado damage detection. PCA and image differencing methods show comparable outcomes. While selected PCs can improve detection accuracy by 5 to 10%, the object-oriented approach performs significantly better, with 15-20% higher accuracy than the other two techniques. © 2008 by MDPI.
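
    A condensed sketch of the image-differencing plus Kappa assessment (random stand-in rasters, not Landsat data):

        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        rng = np.random.default_rng(3)
        before = rng.normal(100, 10, size=(50, 50))    # pre-event band (synthetic)
        after = before.copy()
        after[20:30, 20:40] -= 35                      # simulated damage track

        # Direct change detection: difference image, thresholded into damage/no-damage.
        diff = after - before
        predicted = (diff < -20).astype(int).ravel()

        truth = np.zeros((50, 50), dtype=int)
        truth[20:30, 20:40] = 1                        # reference damage polygon
        print("kappa:", cohen_kappa_score(truth.ravel(), predicted))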

  16. Using text-mining techniques in electronic patient records to identify ADRs from medicine use

    DEFF Research Database (Denmark)

    Warrer, Pernille; Hansen, Ebba Holme; Jensen, Lars Juhl;

    2012-01-01

    ...included empirically based studies on text mining of electronic patient records (EPRs) that focused on detecting ADRs, excluding those that investigated adverse events not related to medicine use. We extracted information on study populations, EPR data sources, frequencies and types of the identified ADRs, medicines associated with ADRs, text-mining algorithms used and their performance. Seven studies, all from the United States, were eligible for inclusion in the review. Studies were published from 2001, the majority between 2009 and 2010. Text-mining techniques varied over time, from simple free-text searching of outpatient visit notes and inpatient discharge summaries to more advanced techniques involving natural language processing (NLP) of inpatient discharge summaries. Performance appeared to increase with the use of NLP, although many ADRs were still missed. Due to differences in study design...

  17. Micro-Raman spectroscopy a powerful technique to identify crocidolite and erionite fibers in tissue sections

    Science.gov (United States)

    Rinaudo, C.; Croce, A.; Allegrina, M.; Baris, I. Y.; Dogan, A.; Powers, A.; Rivera, Z.; Bertino, P.; Yang, H.; Gaudino, G.; Carbone, M.

    2013-05-01

    Exposure to mineral fibers such as asbestos and erionite is widely associated with the development of lung cancer and pleural malignant mesothelioma (MM). Pedigree and mineralogical studies indicated that genetics may influence mineral fiber carcinogenesis. Although dimensions strongly affect a fiber's carcinogenic potential, the chemical composition of the fiber is also relevant. Using micro-Raman spectroscopy, we show here the persistence and identification of different mineral phases directly on histopathological specimens of mice and humans. Fibers of crocidolite asbestos and of erionite from different geographic areas (Oregon, US and Cappadocia, Turkey) were injected intraperitoneally in mice. MM developed in 10/15 asbestos-treated mice after 5 months, and in 8-10/15 erionite-treated mice after 14 months. The persistence of the injected fibers was investigated in the pancreas, liver, spleen and peritoneal tissue. The chemical identification of the different phases occurred in the peritoneal cavity or at the organ borders, while fibers were only rarely localized in the parenchyma. Raman patterns allow crocidolite and erionite fibers to be recognized easily. Microscopic analysis revealed that crocidolite fibers were frequently coated by ferruginous material ("asbestos bodies"), whereas erionite fibers were always free from coatings. We also analyzed by micro-Raman spectroscopy lung tissues, both from MM patients of Cappadocia, where an MM epidemic developed because of environmental exposure to erionite, and from Italian MM patients with occupational exposure to asbestos. Our findings demonstrate that micro-Raman spectroscopy is a technique able to identify mineral phases directly on histopathology specimens, such as routine tissue sections prepared for diagnostic purposes.

  18. Identifying and quantifying energy savings on fired plant using low cost modelling techniques

    International Nuclear Information System (INIS)

    Research highlights: → Furnace models based on the zone method for radiation calculation are described. → Validated steady-state and transient models have been developed. → We show how these simple models can identify the best options for saving energy. → High emissivity coatings are predicted to give performance enhancement on a fired heater. → Optimal heat recovery strategies on a steel reheating furnace are predicted. -- Abstract: Combustion in fired heaters, boilers and furnaces often accounts for the major energy consumption on industrial processes. Small improvements in efficiency can result in large reductions in energy consumption, CO2 emissions, and operating costs. This paper describes some useful low-cost modelling techniques based on the zone method to help identify energy saving opportunities on high temperature fuel-fired process plant. The zone method has, for many decades, been successfully applied to small batch furnaces through to large steel-reheating furnaces, glass tanks, boilers and fired heaters on petrochemical plant. Zone models can simulate both steady-state furnace operation and the more complex transient operation typical of a production environment. These models can be used to predict thermal efficiency and performance and, more importantly, to assist in identifying and predicting energy saving opportunities from such measures as: improving air/fuel ratio and temperature controls; improved insulation; use of oxygen or oxygen enrichment; air preheating via flue gas heat recovery; and modification to furnace geometry and hearth loading. There is also increasing interest in the application of refractory coatings for increasing surface radiation in fired plant. All of these techniques can yield savings ranging from a few percent upwards and can deliver rapid financial payback, but their evaluation often requires robust and reliable models in order to increase confidence in making financial investment decisions. This paper gives

  19. BIOELECTRICAL IMPEDANCE VECTOR ANALYSIS IDENTIFIES SARCOPENIA IN NURSING HOME RESIDENTS

    Science.gov (United States)

    Loss of muscle mass and water shifts between body compartments are contributing factors to frailty in the elderly. The body composition changes are especially pronounced in institutionalized elderly. We investigated the ability of single-frequency bioelectrical impedance analysis (BIA) to identify b...

  20. The use of environmental monitoring as a technique to identify isotopic enrichment activities

    International Nuclear Information System (INIS)

    The use of environmental monitoring as a technique to identify activities related to the nuclear fuel cycle has been proposed, by international organizations, as an additional measure to the safeguards agreements in force. The elements specific to each kind of nuclear activity, or nuclear signatures, inserted into the ecosystem by several transfer paths, can be intercepted with greater or lesser ability by different living organisms. Depending on the kind of signature of interest, the identification and quantification of anthropogenic material require the choice of adequate biological indicators and, mainly, the use of sophisticated techniques associated with elaborate sample treatments. This work demonstrates the technical viability of using pine needles as bioindicators of nuclear signatures associated with uranium enrichment activities. Additionally, it proposes the use of a technique now widely diffused in the scientific community, High Resolution Inductively Coupled Plasma Mass Spectrometry (HR-ICP-MS), to identify the signature corresponding to that kind of activity in the ecosystem. Also described is a methodology recently adopted in analytical chemistry, based on metrological concepts of uncertainty estimation, used to calculate the uncertainties associated with the obtained measurement results. Nitric acid solutions with a concentration of 0.3 mol.kg-1, used to wash pine needles sampled near facilities that manipulate enriched uranium and containing only 0.1 μg.kg-1 of uranium, exhibit a 235U/238U isotopic abundance ratio of 0.0092±0.0002, while solutions originating from samples collected at places located more than 200 km from activities related to the nuclear fuel cycle exhibit a value of 0.0074±0.0002 for this abundance ratio. Similar results obtained for samples collected in different places confirm the presence of anthropogenic uranium and demonstrate the viability of using this technique and the
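
    A minimal check of the kind reported above compares a measured ratio with the natural 235U/238U abundance ratio (≈0.00725); the two measured values below are the ones quoted in the abstract:

        # 235U/238U abundance ratios (value, 1-sigma uncertainty).
        natural = 0.00725                      # natural uranium reference ratio
        near_facility = (0.0092, 0.0002)       # pine needles near enrichment plant
        background = (0.0074, 0.0002)          # samples >200 km from fuel-cycle sites

        def sigmas_from_natural(ratio, unc):
            """How many standard uncertainties the ratio sits above natural."""
            return (ratio - natural) / unc

        print(sigmas_from_natural(*near_facility))   # ~9.8: clearly enriched signature
        print(sigmas_from_natural(*background))      # ~0.8: consistent with natural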

  1. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface is of great importance for the convenience and safety of driving, so investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  2. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool that allows his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system

  3. Identifying clinical course patterns in SMS data using cluster analysis

    DEFF Research Database (Denmark)

    Kent, Peter; Kongsted, Alice

    2012-01-01

    ...subgroups in the outcomes of research studies. Two previous studies have investigated detailed clinical course patterns in SMS data obtained from people seeking care for low back pain. One used a visual analysis approach and the other performed a cluster analysis of SMS data that had first been transformed... whole group, by including all SMS time points in their original form. It was a 'proof of concept' study to explore the potential, clinical relevance, strengths and weaknesses of such an approach. METHODS: This was a secondary analysis of longitudinal SMS data collected in two randomised controlled trials conducted simultaneously from a single clinical population (n = 322). Fortnightly SMS data collected over a year on 'days of problematic low back pain' and on 'days of sick leave' were analysed using Two-Step (probabilistic) Cluster Analysis. RESULTS: Clinical course patterns were identified that were...

  4. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques

  5. Elemental Analysis of Shells by Nuclear Technique

    International Nuclear Information System (INIS)

    Quantitative analysis of strontium (Sr) and calcium (Ca) in freshwater shells and sea shells was studied by the X-ray fluorescence (XRF) technique with the Emission-Transmission (E-T) method, using isotope X-ray sources of plutonium-238 (Pu-238) and americium-241 (Am-241), and compared with the neutron activation analysis technique in the TRR-1/M1 reactor. The results show that the calcium content in both types of shells is almost the same, but strontium in sea shells is 3-4 times higher than in freshwater shells. Moreover, the results can verify whether a region used to be river or ocean. The high ratio of strontium to calcium in many types of shells from Wat Jaedeehoi, Patumthanee province, shows the specific character of sea shells. So it can be concluded that this region used to be ocean in the past

  6. Latent cluster analysis of ALS phenotypes identifies prognostically differing groups.

    Directory of Open Access Journals (Sweden)

    Jeban Ganesalingam

    Full Text Available BACKGROUND: Amyotrophic lateral sclerosis (ALS) is a degenerative disease predominantly affecting motor neurons and manifesting as several different phenotypes. Whether these phenotypes correspond to different underlying disease processes is unknown. We used latent cluster analysis to identify groupings of clinical variables in an objective and unbiased way, to improve phenotyping for clinical and research purposes. METHODS: Latent class cluster analysis was applied to a large database consisting of 1467 records of people with ALS, using discrete variables which can be readily determined at the first clinic appointment. The model was tested for clinical relevance by survival analysis of the phenotypic groupings using the Kaplan-Meier method. RESULTS: The best model generated five distinct phenotypic classes that strongly predicted survival (p<0.0001). Eight variables were used for the latent class analysis, but a good estimate of the classification could be obtained using just two variables: site of first symptoms (bulbar or limb) and time from symptom onset to diagnosis (p<0.00001). CONCLUSION: The five phenotypic classes identified using latent cluster analysis can predict prognosis. They could be used to stratify patients recruited into clinical trials and to generate more homogeneous disease groups for genetic, proteomic and risk factor research.
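
    Latent class analysis of mixed discrete data is usually done in specialized packages; as a rough Python stand-in, a mixture model can cluster the two key variables the abstract highlights (all data below is synthetic, including the survival times):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        n = 300
        # Two variables from the abstract: onset site (0 = limb, 1 = bulbar)
        # and months from symptom onset to diagnosis (synthetic values).
        onset_site = rng.integers(0, 2, n)
        diag_delay = rng.lognormal(mean=2.0 - 0.5 * onset_site, sigma=0.4, size=n)
        X = np.column_stack([onset_site, np.log(diag_delay)])

        # Mixture model as a stand-in for latent class clustering.
        gm = GaussianMixture(n_components=5, random_state=0).fit(X)
        classes = gm.predict(X)

        # Compare (synthetic) survival across the phenotypic classes.
        survival_months = rng.gamma(shape=2.0, scale=15 + 5 * (classes % 3))
        for c in range(5):
            print(c, np.median(survival_months[classes == c]))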

  7. CONSUMER BEHAVIOR ANALYSIS BY GRAPH MINING TECHNIQUE

    OpenAIRE

    KATSUTOSHI YADA; HIROSHI MOTODA; TAKASHI WASHIO; ASUKA MIYAWAKI

    2006-01-01

    In this paper, we discuss how a graph mining system is applied to sales transaction data so as to understand consumer behavior. First, existing research on consumer behavior analysis for sequential purchase patterns is reviewed. Then we propose to represent complicated customer purchase behavior by a directed graph retaining temporal information in a purchase sequence, and apply a graph mining technique to analyze the frequently occurring patterns. In this paper, we demonstrate through the case...
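
    A minimal version of the directed-graph representation (toy purchase sequences; networkx is used here for illustration, not as the authors' system):

        from collections import Counter
        import networkx as nx

        # Toy purchase sequences, one list per customer, in temporal order.
        sequences = [
            ["milk", "bread", "beer"],
            ["milk", "bread", "diapers"],
            ["bread", "beer"],
            ["milk", "bread", "beer"],
        ]

        # Count temporal transitions and build a directed graph weighted by frequency.
        edges = Counter()
        for seq in sequences:
            edges.update(zip(seq, seq[1:]))

        G = nx.DiGraph()
        for (a, b), w in edges.items():
            G.add_edge(a, b, weight=w)

        # Frequently occurring patterns: the heaviest transitions.
        print(sorted(G.edges(data="weight"), key=lambda e: -e[2]))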

  8. Biomechanical Analysis of Contemporary Throwing Technique Theory

    OpenAIRE

    Chen Jian

    2015-01-01

    Based on the movement process of throwing, and in order to further improve throwing technique in our country, this paper first illustrates the main factors that influence the shot distance, via the combination of movement equations and geometrical analysis. It then gives the equation of the force that throwing athletes have to bear during the throwing movement, and derives the speed relationship between the joints during throwing and batting, bas...

  9. Parameter Trajectory Analysis to Identify Treatment Effects of Pharmacological Interventions

    OpenAIRE

    Tiemann, Christian A.; Vanlier, Joep; Oosterveer, Maaike H.; Albert K Groen; Hilbers, Peter A. J.; Natal A W van Riel

    2013-01-01

    The field of medical systems biology aims to advance understanding of molecular mechanisms that drive disease progression and to translate this knowledge into therapies to effectively treat diseases. A challenging task is the investigation of long-term effects of a (pharmacological) treatment, to establish its applicability and to identify potential side effects. We present a new modeling approach, called Analysis of Dynamic Adaptations in Parameter Trajectories (ADAPT), to analyze the long-t...

  10. Three Systems of Insular Functional Connectivity Identified with Cluster Analysis

    OpenAIRE

    Deen, Ben; Pitskel, Naomi B.; Kevin A. Pelphrey

    2010-01-01

    Despite much research on the function of the insular cortex, few studies have investigated functional subdivisions of the insula in humans. The present study used resting-state functional connectivity magnetic resonance imaging (MRI) to parcellate the human insular lobe based on clustering of functional connectivity patterns. Connectivity maps were computed for each voxel in the insula based on resting-state functional MRI (fMRI) data and segregated using cluster analysis. We identified 3 ins...

  11. The application of TXRF analysis technique

    International Nuclear Information System (INIS)

    The total reflection X-ray fluorescence (TXRF) analysis technique is introduced briefly. A small TXRF analyzer characterised by dual-path X-ray exciting sources and an especially short optical path (15 cm) is described. Low minimum detection limits (MDL), e.g. 7 pg for Co under a Cu target tube operating at 20 kV and 6 mA, and 30 pg for Sr under a Mo target tube at 46 kV and 10 mA, are achieved. Some analysis experiments on tap water, marine animals and human hair are performed and the results are given

  12. Integrating complementary medicine literacy education into Australian medical curricula: Student-identified techniques and strategies for implementation.

    Science.gov (United States)

    Templeman, Kate; Robinson, Anske; McKenna, Lisa

    2015-11-01

    Formal medical education about complementary medicine (CM) that comprises medicinal products/treatments is required due to possible CM interactions with conventional medicines; however, few guidelines exist on design and implementation of such education. This paper reports findings of a constructivist grounded theory method study that identified key strategies for integrating CM literacy education into medical curricula. Analysis of data from interviews with 30 medical students showed that students supported a longitudinal integrative and pluralistic approach to medicine. Awareness of common patient use, evidence, and information relevant to future clinical practice were identified as focus points needed for CM literacy education. Students advocated for interactive case-based, experiential and dialogical didactic techniques that are multiprofessional and student-centred. Suggested strategies provide key elements of CM literacy within research, field-based practice, and didactic teaching over the entirety of the curriculum. CM educational strategies should address CM knowledge deficits and ultimately respond to patients' needs. PMID:26573450

  13. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which is to study man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  14. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis

    OpenAIRE

    Hawkes, Corinna, ed.

    2009-01-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in...

  15. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% system-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, the result of the Value Analysis is an item that performs the necessary functions at high quality and the lowest overall cost

  16. Multispectral and Photoplethysmography Optical Imaging Techniques Identify Important Tissue Characteristics in an Animal Model of Tangential Burn Excision.

    Science.gov (United States)

    Thatcher, Jeffrey E; Li, Weizhi; Rodriguez-Vaqueiro, Yolanda; Squiers, John J; Mo, Weirong; Lu, Yang; Plant, Kevin D; Sellke, Eric; King, Darlene R; Fan, Wensheng; Martinez-Lorenzo, Jose A; DiMaio, J Michael

    2016-01-01

    Burn excision, a difficult technique owing to the training required to identify the extent and depth of injury, will benefit from a tool that can cue the surgeon as to where and how much to resect. We explored two rapid and noninvasive optical imaging techniques in their ability to identify burn tissue from the viable wound bed using an animal model of tangential burn excision. Photoplethysmography (PPG) imaging and multispectral imaging (MSI) were used to image the initial, intermediate, and final stages of burn excision of a deep partial-thickness burn. PPG imaging maps blood flow in the skin's microcirculation, and MSI collects the tissue reflectance spectrum in visible and infrared wavelengths of light to classify tissue based on a reference library. A porcine deep partial-thickness burn model was generated and serial tangential excision accomplished with an electric dermatome set to 1.0 mm depth. Excised eschar was stained with hematoxylin and eosin to determine the extent of burn remaining at each excision depth. We confirmed that the PPG imaging device showed significantly less blood flow where burn tissue was present, and the MSI method could delineate burn tissue in the wound bed from the viable wound bed. These results were confirmed independently by a histological analysis. We found these devices can identify the proper depth of excision, and their images could cue a surgeon as to the preparedness of the wound bed for grafting. These image outputs are expected to facilitate clinical judgment in the operating room. PMID:26594863

  17. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  18. VLF radio propagation conditions. Computational analysis techniques

    International Nuclear Information System (INIS)

    Very low frequency (VLF) radio waves propagate within the Earth-ionosphere waveguide with very little attenuation. Modifications of the waveguide geometry affect the propagation conditions and, hence, the attenuation. Changes in the ionosphere, such as the presence of the D-region during the day or the precipitation of energetic particles, are the main causes of this modification. Narrowband receivers monitoring VLF transmitters record the amplitude and phase of these signals. Multivariate data analysis techniques, namely Principal Component Analysis (PCA) and Singular Spectrum Analysis (SSA), are applied to the data in order to determine the parameters, such as seasonal and diurnal changes, affecting the variation of these signals. Transient effects may then be easier to detect.
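
    As a rough illustration of the PCA step described above, the sketch below uses synthetic stand-in data (a day-by-minute amplitude matrix; the real inputs would be the narrowband recordings) to separate the dominant diurnal component from its seasonal modulation:

```python
# Minimal PCA sketch on synthetic VLF-like amplitude data (not real recordings).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
minutes = np.arange(1440)                                 # one column per minute
days = np.arange(365)                                     # one row per day
diurnal = np.sin(2 * np.pi * minutes / 1440.0)            # day/night pattern
seasonal = 1.0 + 0.3 * np.sin(2 * np.pi * days / 365.0)   # seasonal envelope
X = seasonal[:, None] * diurnal[None, :] + 0.1 * rng.standard_normal((365, 1440))

pca = PCA(n_components=3)
scores = pca.fit_transform(X)                # day-by-day weight of each component
print(pca.explained_variance_ratio_)         # PC1 captures the diurnal pattern

# Transients (e.g. particle-precipitation events) would stand out in the
# residual once the dominant regular components are removed.
residual = X - pca.inverse_transform(scores)
```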

  19. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Based on the movement process of throwing, and in order to further improve the throwing technique of our country, this paper first illustrates the main factors influencing the shot distance by combining the equations of motion with geometrical analysis. It then gives the equation for the force that throwing athletes must bear during the throwing movement, and derives the speed relationship between the joints during throwing and release from a kinetic analysis of the throwing athletes' arms. The paper obtains the momentum relationship between the athletes' joints by means of rotational-inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The obtained result shows that the momentum of the throw depends on the momentum of the athletes' wrist joints at release.
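
    For the release phase, the "equations of motion plus geometrical analysis" step reduces to the standard projectile-range relation. An illustrative worked form (symbols assumed here: release speed v, release angle θ, release height h, gravitational acceleration g) is:

```latex
t_{\mathrm{flight}} = \frac{v\sin\theta + \sqrt{v^{2}\sin^{2}\theta + 2gh}}{g},
\qquad
R = v\cos\theta \, t_{\mathrm{flight}}
  = \frac{v\cos\theta}{g}\left(v\sin\theta + \sqrt{v^{2}\sin^{2}\theta + 2gh}\right).
```

    The range R grows with all three release parameters, which is why release speed, angle and height appear as the main factors influencing the shot distance.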

  20. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
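
    A minimal numerical sketch of the two measurements named above, the normalized contrast evolution and the half-max width, is given below (Python; the cooling curves and spatial profile are synthetic stand-ins, not flash-thermography data):

```python
import numpy as np

# Synthetic pixel-intensity time series: sound reference material cools as
# ~1/sqrt(t); material over a void cools more slowly (hypothetical model).
t = np.linspace(0.01, 5.0, 200)
T_ref = 1.0 / np.sqrt(t)
T_defect = T_ref * (1.0 + 0.3 * np.exp(-1.0 / t))

contrast = (T_defect - T_ref) / T_ref           # normalized contrast evolution

# Half-max edge detection on a 1-D spatial contrast profile across the anomaly.
x = np.linspace(-10.0, 10.0, 401)               # position, mm
profile = contrast.max() * np.exp(-x**2 / 8.0)  # synthetic anomaly profile
above = np.where(profile >= profile.max() / 2.0)[0]
width = x[above[-1]] - x[above[0]]              # half-max width estimate, mm
print(round(width, 2))
```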

  1. Forensic Analysis using Geological and Geochemical Techniques

    Science.gov (United States)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late eighteen hundreds, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studies of natural-abundance isotope variations, micro-analysis with laser ablation ICPMS, and geochemical mapping. Most of the concept studies have shown good potential, but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the EGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and of the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project, and other projects and literature to illustrate the important issues at hand.

  2. Clustering Analysis within Text Classification Techniques

    Directory of Open Access Journals (Sweden)

    Madalina ZURINI

    2011-01-01

    The paper presents a personal approach to the main applications of classification in the knowledge-based society, using methods and techniques widely covered in the literature. Text classification is addressed in chapter two, where the main techniques used are described along with an integrated taxonomy. The transition is made through the concept of spatial representation. Starting from the elementary elements of geometry and artificial-intelligence analysis, spatial representation models are presented. In a parallel approach, the spatial dimension is introduced into the classification process, and the main clustering methods are described in an aggregated taxonomy. As an example, spam and ham words are clustered and spatially represented, and the concepts of spam, ham, common and linkage words are presented and explained in the xOy space representation.
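
    A minimal sketch of the spam/ham clustering example, assuming TF-IDF term weighting and k-means (the paper does not prescribe an implementation; the snippets below are hypothetical):

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical snippets standing in for spam-like and ham-like vocabulary.
docs = [
    "win free money prize claim now",
    "limited offer free cash winner",
    "meeting agenda project deadline report",
    "please review the attached project report",
]
X = TfidfVectorizer().fit_transform(docs)   # documents as points in term space
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # spam-like and ham-like documents fall into separate clusters
```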

  3. Rice Transcriptome Analysis to Identify Possible Herbicide Quinclorac Detoxification Genes

    Directory of Open Access Journals (Sweden)

    Wenying Xu

    2015-09-01

    Quinclorac is a highly selective auxin-type herbicide that is widely used for the effective control of barnyard grass in paddy rice fields, improving the world's rice yield. A mode of action for quinclorac has been proposed, and hormone interactions affect quinclorac signaling. Because of its widespread use, quinclorac may be transported outside rice fields with the drainage waters, leading to soil and water pollution and environmental health problems. In this study, we used the 57K Affymetrix rice whole-genome array to identify quinclorac signaling response genes and to study the molecular mechanisms of action and detoxification of quinclorac in rice plants. Overall, 637 probe sets were identified with differential expression levels under either 6 or 24 h of quinclorac treatment. Auxin-related genes such as GH3 and OsIAAs responded to quinclorac treatment. Gene Ontology analysis showed that detoxification-related gene families were significantly enriched, including cytochrome P450, GST, UGT, and ABC and drug transporter genes. Moreover, real-time RT-PCR analysis showed that top candidate P450 families such as the CYP81, CYP709C and CYP72A genes were universally induced by different herbicides, and some Arabidopsis genes of the same P450 families were up-regulated under quinclorac treatment. We conducted a rice whole-genome GeneChip analysis, providing the first global identification of quinclorac response genes. This work may provide potential markers for the detoxification of quinclorac and biomonitors of environmental chemical pollution.

  4. Towards a Methodology for Identifying Program Constraints During Requirements Analysis

    Science.gov (United States)

    Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

    1997-01-01

    Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem, called context monitoring, involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

  5. Parameter trajectory analysis to identify treatment effects of pharmacological interventions.

    Directory of Open Access Journals (Sweden)

    Christian A Tiemann

    The field of medical systems biology aims to advance understanding of molecular mechanisms that drive disease progression and to translate this knowledge into therapies to effectively treat diseases. A challenging task is the investigation of the long-term effects of a (pharmacological) treatment, to establish its applicability and to identify potential side effects. We present a new modeling approach, called Analysis of Dynamic Adaptations in Parameter Trajectories (ADAPT), to analyze the long-term effects of a pharmacological intervention. A concept of time-dependent evolution of model parameters is introduced to study the dynamics of molecular adaptations. The progression of these adaptations is predicted by identifying the dynamic changes in the model parameters necessary to describe the transition between experimental data obtained during different stages of the treatment. The trajectories provide insight into the affected underlying biological systems and identify the molecular events that should be studied in more detail to unravel the mechanistic basis of treatment outcome. Modulating effects caused by interactions with the proteome and transcriptome levels, which are often less well understood, can be captured by the time-dependent descriptions of the parameters. ADAPT was employed to identify metabolic adaptations induced upon pharmacological activation of the liver X receptor (LXR), a potential drug target to treat or prevent atherosclerosis. The trajectories were investigated to study the cascade of adaptations. This provided a counter-intuitive insight concerning the function of scavenger receptor class B1 (SR-B1), a receptor that facilitates the hepatic uptake of cholesterol. Although activation of LXR promotes cholesterol efflux and -excretion, our computational analysis showed that the hepatic capacity to clear cholesterol was reduced upon prolonged treatment. This prediction was confirmed experimentally by immunoblotting measurements of SR-B1
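
    The core idea, re-estimating model parameters at successive treatment stages and reading off the resulting trajectory, can be sketched as follows (Python; a toy one-parameter clearance model is assumed, not the authors' ADAPT code or their LXR model):

```python
import numpy as np
from scipy.optimize import least_squares

def model(t, k):
    """Toy clearance model: exponential decay with rate parameter k."""
    return np.exp(-k * t)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 25)
stage_k_true = [0.8, 0.5, 0.3]       # hypothetical true clearance per stage

trajectory = []
for k_true in stage_k_true:
    # Pseudo-measurements for this treatment stage (5% multiplicative noise).
    data = model(t, k_true) * (1.0 + 0.05 * rng.standard_normal(t.size))
    fit = least_squares(lambda p: model(t, p[0]) - data, x0=[1.0])
    trajectory.append(fit.x[0])

# A decreasing parameter trajectory would indicate a declining clearance
# capacity over the course of the treatment.
print([round(k, 3) for k in trajectory])
```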

  6. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
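
    A classical first-order illustration of why neglecting transverse shear overpredicts buckling is the shear-flexible column relation below, where P_b is the bending-only (Euler-type) critical load and G A_s the transverse shear stiffness (this illustrates the effect; it is not the paper's analysis method):

```latex
\frac{1}{P_{cr}} = \frac{1}{P_{b}} + \frac{1}{G A_{s}}
\quad\Longrightarrow\quad
P_{cr} = \frac{P_{b}}{1 + P_{b}/(G A_{s})} \;<\; P_{b}.
```

    As G A_s decreases (a compliant fluted core), the knockdown grows, consistent with the overprediction reported for the shear-rigid analytical approach.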

  7. Vibration analysis techniques for rotating machineries

    International Nuclear Information System (INIS)

    In this modern era, our lives and standard of living are closely linked to machines. Today, higher quality, reliability and operational safety are expected of a machine, in addition to a long service life. To meet these demands, one requires knowledge of its dynamic behavior. This can be achieved by employing a predictive maintenance strategy with regular condition inspection and early fault recognition in case of damage. Machine diagnosis by vibration analysis offers a cost-effective and reliable method of condition evaluation. The causes can be located and corrective measures can be planned long before severe, direct and consequential damage or breakdown can occur. Although this subject is multifarious, this paper provides some assistance in understanding the basics of vibration measurement, analysis and diagnosis of faults in rotating machinery. Machinery diagnosis through vibration measurement is an integral part of the in-service inspection programme practiced in nuclear power plants and research reactors
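
    As a minimal illustration of the spectral analysis this implies, the sketch below (Python; synthetic accelerometer signal with assumed rotation and bearing-defect frequencies) picks fault-related lines out of a windowed FFT:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 10_000.0                                  # sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
rng = np.random.default_rng(42)

# Hypothetical vibration signal: 50 Hz shaft rotation plus a weak 157 Hz
# bearing-defect tone buried in noise.
x = (np.sin(2 * np.pi * 50 * t)
     + 0.2 * np.sin(2 * np.pi * 157 * t)
     + 0.05 * rng.standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(x * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

# Spectral lines at least 5% of the strongest peak, separated by >= 20 Hz.
peaks, _ = find_peaks(spectrum, height=spectrum.max() * 0.05, distance=20)
print(freqs[peaks])                            # ~[50., 157.]
```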

  8. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; and optimally weighted cross-correlations are used for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on the cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
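
    A bare-bones sketch of the optimal (matched) filtering mentioned for inspiral searches, assuming white noise and a toy chirp template (illustrative only, not an actual detector pipeline):

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 4096.0
t = np.arange(0.0, 1.0, 1.0 / fs)

# Toy "inspiral-like" template: a short chirp of increasing frequency.
t_tpl = t[:512]
template = np.sin(2 * np.pi * (30.0 * t_tpl + 40.0 * t_tpl**2))

noise_sigma = 1.0
data = noise_sigma * rng.standard_normal(t.size)
data[1500:1500 + template.size] += 0.5 * template     # buried signal

# For white noise the optimal statistic is the normalized cross-correlation;
# the output is then in units of noise standard deviations (SNR).
corr = np.correlate(data, template, mode="valid")
snr = corr / (noise_sigma * np.sqrt(np.sum(template**2)))
print(int(np.argmax(snr)), round(float(snr.max()), 1))  # peak near sample 1500
```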

  9. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed results comparable to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  10. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed results comparable to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  11. Longitudinal Metagenomic Analysis of Hospital Air Identifies Clinically Relevant Microbes

    Science.gov (United States)

    King, Paula; Pham, Long K.; Waltz, Shannon; Sphar, Dan; Yamamoto, Robert T.; Conrad, Douglas; Taplitz, Randy; Torriani, Francesca

    2016-01-01

    We describe sixty-three uncultured hospital air samples collected over a six-month period and analyzed using shotgun metagenomic sequencing. Our primary goals were to determine the longitudinal metagenomic variability of this environment, to identify and characterize genomes of potential pathogens, and to determine whether they are atypical of the hospital airborne metagenome. Air samples were collected from eight locations, which included patient wards, the main lobby and outside. The resulting DNA libraries produced 972 million sequences representing 51 gigabases. Hierarchical clustering of samples by the most abundant 50 microbial orders generated three major nodes which primarily clustered by type of location. Because the indoor locations were longitudinally consistent, episodic relative increases in microbial genomic signatures related to the opportunistic pathogens Aspergillus, Penicillium and Stenotrophomonas were identified as outliers at specific locations. Further analysis of microbial reads specific for Stenotrophomonas maltophilia indicated homology to a sequenced multi-drug resistant clinical strain, and we observed broad sequence coverage of resistance genes. We demonstrate that a shotgun metagenomic sequencing approach can be used to characterize the resistance determinants of pathogen genomes that are uncharacteristic for an otherwise consistent hospital air microbial metagenomic profile. PMID:27482891

  12. Cluster analysis of clinical data identifies fibromyalgia subgroups.

    Directory of Open Access Journals (Sweden)

    Elisa Docampo

    INTRODUCTION: Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. MATERIAL AND METHODS: 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. RESULTS: Variables clustered into three independent dimensions: "symptomatology", "comorbidities" and "clinical scales". Only the first two dimensions were considered for the construction of FM subgroups. The resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. CONCLUSIONS: We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment.

  13. A cardiod based technique to identify cardiovascular diseases using mobile phones and body sensors.

    Science.gov (United States)

    Sufi, Fahim; Khalil, Ibrahim; Tari, Zahir

    2010-01-01

    To prevent the threat of Cardiovascular Disease (CVD) related deaths, the usage of mobile phone based computational platforms, body sensors and wireless communications is proliferating. Since mobile phones have limited computational resources, existing PC-based complex CVD detection algorithms are often unsuitable for wireless telecardiology applications. Moreover, if the existing Electrocardiography (ECG) based CVD detection algorithms are adopted for mobile telecardiology applications, there will be processing delays due to the computational complexity of the existing algorithms. However, for a CVD-affected patient, even a few seconds' delay could be fatal, since cardiovascular cell damage is a totally irrecoverable process. This paper proposes a fast and efficient mechanism of CVD detection from the ECG signal. Unlike existing ECG based CVD diagnosis systems that detect CVD anomalies from hundreds of sample points, the proposed mechanism identifies cardiac abnormality from only 5 sample points. Therefore, according to our experiments, the proposed mechanism is up to 3 times faster than existing techniques. Due to its lower computational burden, the proposed mechanism is ideal for wireless telecardiology applications running on mobile phones. PMID:21096293
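
    The paper's actual 5-point rule is not given in the abstract; the sketch below is purely hypothetical and only illustrates the general idea of classifying a beat from a handful of fiducial features instead of hundreds of samples (template values and tolerance are invented):

```python
import numpy as np

# Assumed patient-specific template: P, Q, R, S, T amplitudes of a normal beat.
NORMAL_BEAT = np.array([0.15, -0.10, 1.00, -0.25, 0.30])

def is_abnormal(beat, template=NORMAL_BEAT, tol=0.2):
    """Flag a beat whose 5 fiducial amplitudes stray too far from the template."""
    return float(np.linalg.norm(beat - template)) > tol

print(is_abnormal(np.array([0.15, -0.12, 0.55, -0.20, 0.10])))  # True
print(is_abnormal(np.array([0.16, -0.10, 0.98, -0.24, 0.29])))  # False
```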

  14. Using clustering techniques to identify localities with multiple health and social needs.

    Science.gov (United States)

    Bellis, Mark A; Jarman, Ian; Downing, Jenny; Perkins, Clare; Beynon, Caryl; Hughes, Karen; Lisboa, Paulo

    2012-03-01

    Development of health promoting policies requires an understanding not just of the interplay between different measures of health but also of their relationship with broader education, criminal justice and other social issues. Methods to better utilise multi-sectoral data to inform policy are needed. Applying clustering techniques to 30 health and social metrics, we identify 5 distinct local authority types, with poor outcomes for the majority of metrics concentrated in the same cluster. Clusters were distinguished especially by levels of child poverty, breastfeeding initiation, children's tooth decay, teenage pregnancy, healthy eating, mental illness, tuberculosis and smoking deaths. Membership of the worst cluster (C5) was concentrated in Northern England, which contains 15.7% of the authorities analysed (n=324) but 63.0% of those in C5. The concentration of challenges in certain areas creates disproportionate pressures that may exceed the cumulative effects of individual challenges. Such distinct health clusters also raise issues of transferability of effective policies between areas with different cluster membership. PMID:21925923

  15. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  16. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  17. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in the risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis

  18. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  19. Analysis of an Image Secret Sharing Scheme to Identify Cheaters

    Directory of Open Access Journals (Sweden)

    Jung-San LEE

    2010-09-01

    Secret image sharing mechanisms have been widely applied to the military, e-commerce, and communications fields. Zhao et al. recently introduced the concept of cheater detection into image sharing schemes. This functionality enables the image owner and authorized members to identify any cheater during reconstruction of the secret image. Here, we provide an analysis of Zhao et al.'s method: an authorized participant is able to restore the secret image by him/herself. This contradicts the requirements of secret image sharing schemes. Although the authorized participant must use an exhaustive search to mount this attack, simulation results show that it can be completed within a reasonable time period.

  20. Social network analysis in identifying influential webloggers: A preliminary study

    Science.gov (United States)

    Hasmuni, Noraini; Sulaiman, Nor Intan Saniah; Zaibidi, Nerda Zura

    2014-12-01

    In recent years, second-generation internet-based services such as weblogs have become an effective communication tool for publishing information on the Web. Weblogs have unique characteristics that deserve users' attention. Some webloggers see weblogs as an appropriate medium to initiate and expand a business. These webloggers, also known as direct profit-oriented webloggers (DPOWs), communicate and share knowledge with each other through social interaction. However, survivability is the main issue among DPOWs, and frequent communication with influential webloggers is one way for a DPOW to survive. This paper aims to understand the network structure and identify influential webloggers within the network. Proper understanding of the network structure can assist us in knowing how information is exchanged among members and enhance survivability among DPOWs. 30 DPOWs were involved in this study. The degree centrality and betweenness centrality measurements of Social Network Analysis (SNA) were used to examine the strength of relations and identify influential webloggers within the network. Webloggers with the highest values of these measurements are considered the most influential webloggers in the network.
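
    A minimal sketch of the two SNA measurements used, degree and betweenness centrality, on a toy weblogger interaction graph (hypothetical edges; Python with networkx):

```python
import networkx as nx

# Nodes are webloggers; an edge means the two communicate (hypothetical data).
G = nx.Graph()
G.add_edges_from([("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "E")])

degree = nx.degree_centrality(G)        # share of direct connections
between = nx.betweenness_centrality(G)  # share of shortest paths through a node

most_connected = max(degree, key=degree.get)
top_broker = max(between, key=between.get)
print(most_connected, round(degree[most_connected], 2))
print(top_broker, round(between[top_broker], 2))
```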

  1. Use of electron spin resonance technique for identifying of irradiated foods

    International Nuclear Information System (INIS)

    The present investigation was carried out to establish the electron spin resonance (ESR) technique for identifying some irradiated foodstuffs, i.e. dried fruits (fig and raisin), nuts (almond and pistachio) and spices (fennel and thyme). Gamma-ray doses were applied as follows: 0, 1, 3 and 5 kGy for dried fruits; 0, 2, 4 and 6 kGy for nuts; and 0, 5, 10 and 15 kGy for spices. All treatments were stored at room temperature (25±2 °C) for six months to study the possibility of detecting the irradiation treatment by ESR spectroscopy. The obtained results indicated that the ESR signal intensities of all irradiated samples increased markedly with irradiation dose as a result of the free radicals generated by gamma irradiation. Thus, all irradiated samples under investigation could be differentiated from unirradiated ones immediately after the irradiation treatment. The decay of the free radicals responsible for the ESR signals during storage at ambient temperature significantly reduced the ESR signal intensities of the irradiated samples. Therefore, after six months of ambient storage, detection was easily possible for dried fig irradiated with doses ≥ 3 kGy and for all irradiated raisin and pistachio (shell) samples. It was also possible for fennel irradiated with doses ≥ 10 kGy and for thyme irradiated with doses ≥ 15 kGy. In contrast, the identification of all irradiated samples of almond (shell as well as edible part) and pistachio (edible part) was impossible after six months of ambient storage.

  2. Liver Ultrasound Image Analysis using Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Smriti Sahu, Maheedhar Dubey, Mohammad Imroze Khan

    2012-12-01

    Liver cancer is the sixth most common malignant tumour and the third most common cause of cancer-related deaths worldwide. Chronic liver damage affects up to 20% of our population. It has many causes - viral infections (Hepatitis B and C), toxins, genetic, metabolic and autoimmune diseases. The rate of liver cancer in Australia has increased four-fold in the past 20 years. For detection and qualitative diagnosis of liver diseases, ultrasound (US) imaging is an easy-to-use and minimally invasive imaging modality. Medical images are often deteriorated by noise due to various sources of interference and other phenomena known as speckle noise. Therefore it is required to apply some digital image processing techniques for smoothing or suppression of speckle noise in ultrasound images. This paper undertakes the study of three types of image enhancement techniques: the shock filter, Contrast Limited Adaptive Histogram Equalization (CLAHE) and spatial filtering. These smoothing techniques are compared using the performance metrics Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). It has been observed that the spatial high-pass filter gives better performance than the others for liver ultrasound image analysis.
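
    Of the three techniques compared, CLAHE is the most implementation-specific; a minimal sketch using OpenCV, with a synthetic stand-in image and the paper's PSNR/MSE metrics, might look like this:

```python
import cv2
import numpy as np

# Synthetic low-contrast stand-in frame; replace with
# cv2.imread("liver_us.png", cv2.IMREAD_GRAYSCALE) for real data.
rng = np.random.default_rng(3)
img = (80 + 20 * rng.random((256, 256))).astype(np.uint8)

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)

def psnr_mse(ref, test):
    """The PSNR/MSE pair used to compare the enhancement techniques."""
    mse = float(np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2))
    psnr = 10 * np.log10(255.0**2 / mse) if mse > 0 else float("inf")
    return psnr, mse

print(psnr_mse(img, enhanced))
```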

  3. Advanced analysis techniques for uranium assay

    International Nuclear Information System (INIS)

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. Active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication, based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  4. Performance Analysis of Acoustic Echo Cancellation Techniques

    Directory of Open Access Journals (Sweden)

    Rajeshwar Dass

    2014-07-01

    Adaptive filters are mainly implemented in the time domain, which works efficiently in most applications. In many applications, however, the impulse response becomes too large, which increases the complexity of the adaptive filter beyond a level where it can no longer be implemented efficiently in the time domain. An example of where this can happen is acoustic echo cancellation (AEC). An alternative solution exists: implementing the filters in the frequency domain. AEC has many applications in a wide variety of problems in industrial operations, manufacturing and consumer products. In this paper, a comparative analysis of different acoustic echo cancellation techniques, i.e. the frequency-domain adaptive filter (FDAF), least mean square (LMS), normalized least mean square (NLMS) and sign error (SE) algorithms, is presented. The results are compared for different step sizes, and the performance of these techniques is measured in terms of Echo Return Loss Enhancement (ERLE), Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR).
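
    A compact sketch of the NLMS variant (Python; the echo path, step size and signals are assumed for illustration):

```python
import numpy as np

def nlms_echo_canceller(far_end, mic, n_taps=64, mu=0.5, eps=1e-6):
    """Adapt an FIR estimate of the echo path and subtract the predicted echo."""
    w = np.zeros(n_taps)
    err = np.zeros_like(mic)
    for n in range(n_taps, len(mic)):
        x = far_end[n - n_taps:n][::-1]      # most recent far-end samples first
        e = mic[n] - w @ x                   # mic signal minus predicted echo
        w += (mu / (eps + x @ x)) * e * x    # update normalized by input power
        err[n] = e
    return err

# Toy demo: the microphone picks up a delayed, attenuated far-end echo.
rng = np.random.default_rng(5)
far = rng.standard_normal(8000)
echo_path = np.zeros(64)
echo_path[10], echo_path[25] = 0.6, -0.3
mic = np.convolve(far, echo_path)[:8000] + 0.01 * rng.standard_normal(8000)

residual = nlms_echo_canceller(far, mic)
print(round(float(np.var(mic)), 3), round(float(np.var(residual[2000:])), 5))
```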

  5. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    This paper presents the different steps of the radio-labelling of a molecule for two purposes: the radio-immuno-analysis and the auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters (125I, 57Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodation, tritium); purification of the labelled compound (dialysis, gel-filtering or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  6. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (nuclear power plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system to diagnose abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  7. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  8. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  9. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (proton induced X-ray emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The analysis was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  10. Identifying a preservation zone using multi–criteria decision analysis

    Directory of Open Access Journals (Sweden)

    Farashi, A.

    2016-03-01

    Zoning of a protected area is an approach to partition a landscape into various land-use units. The management of these landscape units can reduce conflicts caused by human activities. Tandoreh National Park is one of the most biologically diverse protected areas in Iran. Although the area is generally designed to protect biodiversity, there are many conflicts between biodiversity conservation and human activities. For instance, the area is highly controversial and has been considered an impediment to local economic development, such as tourism, grazing, road construction, and cultivation. In order to reduce human conflicts with biodiversity conservation in Tandoreh National Park, safe zones need to be established and human activities need to be moved out of these zones. In this study we used a systematic methodology to integrate a participatory process with Geographic Information Systems (GIS) using a multi-criteria decision analysis (MCDA) technique to guide a zoning scheme for the Tandoreh National Park, Iran. Our results show that the northern and eastern parts of the Tandoreh National Park, which are close to rural areas and farmlands, returned less desirability for selection as a preservation area. Rocky mountains were the most important and most damaged areas, and abandoned plains were the least important criterion for preservation in the area. Furthermore, the results reveal that the land properties were considered to be important for protection based on the obtaine
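
    A minimal sketch of the weighted-overlay step at the heart of a GIS-based MCDA (Python; the criterion layers and weights are hypothetical stand-ins for the study's actual GIS data):

```python
import numpy as np

rng = np.random.default_rng(11)
# Criterion rasters rescaled to [0, 1] (1 = most suitable for preservation);
# in practice these come from GIS layers such as habitat value and distances
# to villages, farmland and roads.
habitat_value = rng.random((200, 200))
distance_to_villages = rng.random((200, 200))
distance_to_roads = rng.random((200, 200))

# Assumed weights, e.g. elicited from experts via pairwise comparison.
weights = [0.5, 0.3, 0.2]
layers = [habitat_value, distance_to_villages, distance_to_roads]
suitability = sum(w * layer for w, layer in zip(weights, layers))

# Flag the top 20% of cells as the candidate preservation zone.
zone = suitability >= np.quantile(suitability, 0.8)
print(round(float(zone.mean()), 2))   # ~0.2 of the study area
```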

  11. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  12. Metal trace analysis by PIXE and PDMS techniques

    International Nuclear Information System (INIS)

    The risk to human health due to exposure to aerosols depends on the intake pattern, the mass concentration and the speciation of the elements present in airborne particles. In this work, plasma desorption mass spectrometry (PDMS) was used as a complementary technique to particle-induced X-ray emission (PIXE) to characterize aerosol samples collected in the environment. The PIXE technique allows the identification of the elements present in the sample and the determination of their mass concentrations. Mass spectrometry (PDMS) was used to identify the speciation of the elements present in the samples. The aerosol samples were collected using a 6-stage cascade impactor (CI) at two sites in Rio de Janeiro City. One is an island (Fundao Island) in the Guanabara Bay close to an industrial zone and the other, in Gavea, is a residential zone close to a lagoon and to the seashore. The mass median aerodynamic diameter (MMAD) measured indicated that the airborne particulates were in the fine fraction of the aerosols collected at both locations. In order to identify the contribution of seawater particles from the Guanabara Bay to the aerosols, seawater samples were also collected at Fundao Island. The samples were analyzed by the PIXE and PDMS techniques. The analysis of the results suggests that the aerosols are different at the two sampling sites and that there is a contribution from Guanabara Bay seawater particles to the aerosols collected at Fundao Island. PIXE allows identification and quantification of elements heavier than Na (Z=11), while PDMS allows identification of organic and inorganic compounds present in the samples; used as complementary techniques, they provide important information for aerosol characterization

  13. Directional reflectance analysis for identifying counterfeit drugs: Preliminary study.

    Science.gov (United States)

    Wilczyński, Sławomir; Koprowski, Robert; Błońska-Fajfrowska, Barbara

    2016-05-30

    The WHO estimates that up to 10% of drugs on the market may be counterfeit. In order to prevent the intensification of drug counterfeiting, methods for distinguishing genuine medicines from fake ones need to be developed. The aim of this study was to develop a simple, reproducible and inexpensive method for distinguishing between original and counterfeit medicines based on the measurement of directional reflectance. The directional reflectance of 6 original Viagra(®) tablets (Pfizer) and 24 counterfeit tablets imitating Viagra(®) (from 4 different batches) was examined in six spectral bands: from 0.9 to 1.1 μm, from 1.9 to 2.6 μm, from 3.0 to 4.0 μm, from 3.0 to 5.0 μm, from 4.0 to 5.0 μm, and from 8.0 to 12.0 μm, and for two angles of incidence, 20° and 60°. A directional hemispherical reflectometer was used to measure the directional reflectance. Statistically significant differences between the directional reflectance of the original Viagra(®) and the counterfeit tablets were registered. Any difference in the value of directional reflectance for any spectral band or angle of incidence identifies the drug as a fake. The proposed method of directional reflectance analysis makes it possible to differentiate between real Viagra(®) and fake tablets. Directional reflectance analysis is a fast (measurement time under 5 s), cheap and reproducible method which does not require expensive equipment or specialized laboratory staff. It also appears to be an effective method; however, its effectiveness will be assessed after the research is extended. PMID:26977587

  14. Global secretome analysis identifies novel mediators of bone metastasis

    Institute of Scientific and Technical Information of China (English)

    Mario Andres Blanco; Gary LeRoy; Zia Khan; Maša Alečković; Barry M Zee; Benjamin A Garcia; Yibin Kang

    2012-01-01

    Bone is one of the most common sites of distant metastasis of solid tumors. Secreted proteins are known to influence pathological interactions between metastatic cancer cells and the bone stroma. To comprehensively profile secreted proteins associated with bone metastasis, we used quantitative and non-quantitative mass spectrometry to globally analyze the secretomes of nine cell lines of varying bone metastatic ability from multiple species and cancer types. By comparing the secretomes of parental cells and their bone metastatic derivatives, we identified the secreted proteins that were uniquely associated with bone metastasis in these cell lines. We then incorporated bioinformatic analyses of large clinical metastasis datasets to obtain a list of candidate novel bone metastasis proteins of several functional classes that were strongly associated with both clinical and experimental bone metastasis. Functional validation of selected proteins indicated that in vivo bone metastasis can be promoted by high expression of (1) the salivary cystatins CST1, CST2, and CST4; (2) the plasminogen activators PLAT and PLAU; or (3) the collagen functionality proteins PLOD2 and COL6A1. Overall, our study has uncovered several new secreted mediators of bone metastasis and therefore demonstrated that secretome analysis is a powerful method for identification of novel biomarkers and candidate therapeutic targets.

  15. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and the results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can reflect the relationship between the design parameters and the product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have a significant influence on the product's environmental impacts can be identified by analyzing the relationship between the environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
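
    A minimal sketch of the one-parameter-at-a-time perturbation behind such a sensitivity analysis (Python; the impact function, coefficients and parameters are invented stand-ins, not the paper's PCB model):

```python
def impact_model(area_m2, copper_kg, energy_kwh):
    """Stand-in LCA impact function (kg CO2-eq); coefficients are assumed."""
    return 12.0 * area_m2 + 4.5 * copper_kg + 0.6 * energy_kwh

base = {"area_m2": 0.05, "copper_kg": 0.02, "energy_kwh": 1.5}
base_impact = impact_model(**base)

for name in base:
    perturbed = dict(base)
    perturbed[name] *= 1.10                    # +10%, one parameter at a time
    rel_change = (impact_model(**perturbed) - base_impact) / base_impact
    # Normalized sensitivity: % impact change per % parameter change.
    print(name, round(rel_change / 0.10, 3))
```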

  16. Global secretome analysis identifies novel mediators of bone metastasis.

    Science.gov (United States)

    Blanco, Mario Andres; LeRoy, Gary; Khan, Zia; Alečković, Maša; Zee, Barry M; Garcia, Benjamin A; Kang, Yibin

    2012-09-01

    Bone is one of the most common sites of distant metastasis of solid tumors. Secreted proteins are known to influence pathological interactions between metastatic cancer cells and the bone stroma. To comprehensively profile secreted proteins associated with bone metastasis, we used quantitative and non-quantitative mass spectrometry to globally analyze the secretomes of nine cell lines of varying bone metastatic ability from multiple species and cancer types. By comparing the secretomes of parental cells and their bone metastatic derivatives, we identified the secreted proteins that were uniquely associated with bone metastasis in these cell lines. We then incorporated bioinformatic analyses of large clinical metastasis datasets to obtain a list of candidate novel bone metastasis proteins of several functional classes that were strongly associated with both clinical and experimental bone metastasis. Functional validation of selected proteins indicated that in vivo bone metastasis can be promoted by high expression of (1) the salivary cystatins CST1, CST2, and CST4; (2) the plasminogen activators PLAT and PLAU; or (3) the collagen functionality proteins PLOD2 and COL6A1. Overall, our study has uncovered several new secreted mediators of bone metastasis and therefore demonstrated that secretome analysis is a powerful method for identification of novel biomarkers and candidate therapeutic targets. PMID:22688892

  17. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the effect of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  18. A Probabilistic Analysis of Marker-Passing Techniques for Plan-Recognition

    OpenAIRE

    Carroll, Glenn; Charniak, Eugene

    2013-01-01

    Useless paths are a chronic problem for marker-passing techniques. We use a probabilistic analysis to justify a method for quickly identifying and rejecting useless paths. Using the same analysis, we identify key conditions and assumptions necessary for marker-passing to perform well.

  19. Application of Spectral Change Detection Techniques to Identify Forest Harvesting Using Landsat TM Data

    OpenAIRE

    Chambers, Samuel David

    2002-01-01

    The main objective of this study was to determine the spectral change technique best suited to detect complete forest harvests (clearcuts) in the Southern United States. In pursuit of this objective, eight existing change detection techniques were quantitatively evaluated and a hybrid method was also developed. Secondary objectives were to determine the impact of atmospheric corrections applied before the change detection, and the effect of post-processing methods to eliminate small groups ...

  20. Ion beam analysis techniques applied to large scale pollution studies

    International Nuclear Information System (INIS)

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs

  1. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  2. Real-time analysis application for identifying bursty local areas related to emergency topics.

    Science.gov (United States)

    Sakai, Tatsuhiro; Tamura, Keiichi

    2015-01-01

    Since social media started getting more attention from users on the Internet, it has become one of the most important information sources in the world. With the increasing popularity of social media, data posted on social media sites are rapidly becoming collective intelligence, a term used to refer to new media that is displacing traditional media. In this paper, we focus on geotagged tweets on the Twitter site. These geotagged tweets are referred to as georeferenced documents because they include not only a short text message, but also the document's posting time and location. Many researchers have been tackling the development of new data mining techniques for georeferenced documents to identify and analyze emergency topics, such as natural disasters, weather, diseases, and other incidents. In particular, the utilization of geotagged tweets to identify and analyze natural disasters has recently received much attention from administrative agencies because some case studies have achieved compelling results. In this paper, we propose a novel real-time analysis application for identifying bursty local areas related to emergency topics. The aim of our new application is to provide a new platform that can identify and analyze the localities of emergency topics. The proposed application is composed of three core computational intelligence techniques: the Naive Bayes classifier, spatiotemporal clustering, and burst detection. Moreover, we have implemented two types of application interface: a Web application interface and an Android application interface. To evaluate the proposed application, we implemented a real-time weather observation system embedding the proposed application, using actual geotagged tweets crawled from the Twitter site. The weather observation system successfully detected bursty local areas related to observed emergency weather topics. PMID:25918679
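
    As a rough illustration of the burst-detection component only, here is a minimal sliding-window detector over per-interval tweet counts in a single grid cell. The z-score rule, window length, and synthetic counts are assumptions for the sketch, not the paper's algorithm.

```python
import numpy as np

def detect_bursts(counts, window=24, z_thresh=3.0):
    """Flag intervals whose count exceeds mean + z_thresh * std of the
    preceding window of observations."""
    counts = np.asarray(counts, dtype=float)
    bursty = []
    for t in range(window, len(counts)):
        hist = counts[t - window:t]
        if hist.std() > 0 and counts[t] > hist.mean() + z_thresh * hist.std():
            bursty.append(t)
    return bursty

# Hourly counts of weather-related geotagged tweets in one cell (synthetic).
counts = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 5, 4,
          3, 4, 2, 3, 4, 5, 3, 4, 2, 3, 4, 3, 40]
print(detect_bursts(counts))  # -> [24], the sudden spike at the end
```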

  3. Identifying Phytoplankton Classes In California Reservoirs Using HPLC Pigment Analysis

    Science.gov (United States)

    Siddiqui, S.; Peacock, M. B.; Kudela, R. M.; Negrey, K.

    2014-12-01

    Few bodies of water are routinely monitored for phytoplankton composition due to monetary and time constraints, especially the less accessible bodies of water in central and southern California. These lakes and estuaries are important for economic reasons such as tourism and fishing. This project investigated the composition of phytoplankton present using pigment analysis to identify dominant phytoplankton groups. A total of 28 different sites with a wide range of salinity (0 - 60) in central and southern California were examined: 13 different bodies of water in central California (6 in the Sierras and 7 in the San Francisco Bay Estuary) and 15 from southern California. The samples were analyzed using high-performance liquid chromatography (HPLC) to quantify the pigments present (using retention time and the spectral thumbprint). Diagnostic pigments were used to indicate the phytoplankton class composition, focusing on diatoms, dinoflagellates, cryptophytes, and cyanobacteria - all key phytoplankton groups indicative of the health of the sampled reservoir. Our results indicated that cyanobacteria dominated four of the seven bodies of central California water (Mono Lake, Bridgeport Reservoir, Steamboat Slough, and Pinto Lake); cryptophytes and nannoflagellates dominated two of the central California bodies of water (Mare Island Strait and Topaz Lake); and diatoms and dinoflagellates dominated one central California body of water, Oakland Inner Harbor, comprising more than 70% of the phytoplankton present. We expect the bodies of water from southern California to be similarly disparate. Though these data are only a snapshot, they have significant implications for comparing different ecosystems across California, and they have the potential to provide valuable insight into the composition of phytoplankton communities.

  4. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage: the Android platform allows developers to freely access and modify source code, but at the same time this openness increases security risks. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. Then we identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis will help us to identify the existing loopholes and will give strategic direction for making the Android operating system more secure.

  5. Can Passive Mobile Application Traffic be Identified using Machine Learning Techniques

    OpenAIRE

    Holland, Peter

    2015-01-01

    Mobile phone applications (apps) can generate background traffic when the end-user is not actively using the app. If this background traffic could be accurately identified, network operators could de-prioritise this traffic and free up network bandwidth for priority network traffic. The background app traffic should have IP packet features that could be utilised by a machine learning algorithm to identify app-generated (passive) traffic as opposed to user-generated (active) traffic. Previous ...

  6. Modern Analysis Techniques for Spectroscopic Binaries

    CERN Document Server

    Hensberge, H

    2006-01-01

    Techniques to extract information from spectra of unresolved multi-component systems are reviewed, with emphasis on recent developments and practical aspects. We review the cross-correlation techniques developed to deal with such spectra, discuss the determination of the broadening function and compare techniques to reconstruct component spectra. Recent results obtained by separating or disentangling the component spectra are summarized. An evaluation is made of possible indeterminacies and of random and systematic errors in the component spectra.

  7. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Full Text Available Understanding the signals generated by the brain is one of the main tasks in brain signal processing. Among all neurological disorders, epilepsy is considered one of the most prevalent, and an automated artificial-intelligence detection technique is essential because of the erratic and unpredictable occurrence of epileptic seizures. We propose an improved fuzzy firefly algorithm, which enhances the classification of brain signals efficiently with a minimum number of iterations. An important clustering technique based on fuzzy logic is Fuzzy C-means. Features obtained from multichannel EEG signals are combined in both the feature domain and the spatial domain by means of fuzzy algorithms, and the firefly algorithm is applied to optimize the Fuzzy C-means membership function for a more precise segmentation process. At the same time, convergence criteria are set to keep the clustering method efficient. On the whole, the proposed technique yields more accurate results, which gives it an edge over other techniques. The results of the proposed algorithm are compared with those of other algorithms, such as the Fuzzy C-means algorithm and the PSO algorithm.
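
    For reference, a bare-bones Fuzzy C-means is sketched below; the firefly optimization of the memberships described above is omitted, and the toy data standing in for EEG feature vectors are invented.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Core FCM updates only; the paper's firefly search is not included."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                 # memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1))).sum(axis=2)
    return centers, U

# Toy two-cluster data standing in for EEG feature vectors.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(5, 1, (50, 3))])
centers, U = fuzzy_c_means(X, c=2)
print(np.round(centers, 2))
```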

  8. Useful intensity: A technique to identify radiating regions on arbitrarily shaped surfaces

    Science.gov (United States)

    Corrêa Junior, C. A.; Tenenbaum, R. A.

    2013-03-01

    This work presents a new technique for the computation of the numerical equivalent of the supersonic acoustic intensity for arbitrarily shaped sound sources. The technique therefore provides identification of the regions of a noise source that effectively contribute to the sound power radiated into the far field, by filtering non-propagating sound waves. The proposed technique is entirely formulated on the vibrating surface. The radiated acoustic power is obtained through a numerical operator that relates it to the superficial normal velocity distribution. The power operator is obtained by using the boundary element method. This operator is Hermitian: it has real eigenvalues and its eigenvectors form an orthonormal set for the velocity distribution. An eigenvalue decomposition is applied to the power operator, making it possible to compute the numerical equivalent of the supersonic intensity, called here the useful intensity, after applying a cutoff criterion to remove the non-propagating components. Some numerical tests confirming the effectiveness of the cutoff criterion are presented. Examples of the application of the useful intensity technique to vibrating surfaces such as a plate, a cylinder with flat caps and an automotive muffler are presented, and the numerical results are discussed.
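
    A compact sketch of the eigen-filtering step, under stated assumptions: the Hermitian positive semidefinite power operator W and the normal-velocity vector v are generated randomly here, whereas in the actual method they come from the BEM model, and the 1% eigenvalue cutoff is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
W = A @ A.conj().T / n                 # Hermitian, positive semidefinite stand-in
v = rng.normal(size=n) + 1j * rng.normal(size=n)

lam, Q = np.linalg.eigh(W)             # real eigenvalues, orthonormal eigenvectors
coeffs = Q.conj().T @ v                # velocity expressed in the eigenbasis

# Cutoff criterion: keep eigenvectors carrying most of the radiated power.
keep = lam > 0.01 * lam.max()
v_useful = Q[:, keep] @ coeffs[keep]   # "useful" (radiating) velocity component

power_full = np.real(v.conj() @ W @ v)
power_useful = np.real(v_useful.conj() @ W @ v_useful)
print(f"retained power fraction: {power_useful / power_full:.3f}")
```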

  9. Techniques for identifying cross-disciplinary and 'hard-to-detect' evidence for systematic review.

    Science.gov (United States)

    O'Mara-Eves, Alison; Brunton, Ginny; McDaid, David; Kavanagh, Josephine; Oliver, Sandy; Thomas, James

    2014-03-01

    Driven by necessity in our own complex review, we developed alternative systematic ways of identifying relevant evidence where the key concepts are generally not focal to the primary studies' aims and are found across multiple disciplines - that is, hard-to-detect evidence. Specifically, we sought to identify evidence on community engagement in public health interventions that aim to reduce health inequalities. Our initial search strategy used text mining to identify synonyms for the concept 'community engagement'. We conducted a systematic search for reviews of public health interventions, supplemented by searches of trials databases. We then used information in the reviews' evidence tables to gather more information about the included studies than was evident in the primary studies' own titles or abstracts. We identified 319 primary studies cited in reviews after full-text screening. In this paper, we retrospectively reflect on the challenges and benefits of the approach taken. We estimate that more than a quarter of the studies identified would have been missed by typical searching and screening methods. This identification strategy was highly effective and could be useful for reviews with broad research questions, or where the key concepts are unlikely to be the main focus of primary research. PMID:26054025

  10. Identifying the "Right Stuff": An Exploration-Focused Astronaut Job Analysis

    Science.gov (United States)

    Barrett, J. D.; Holland, A. W.; Vessey, W. B.

    2015-01-01

    Industrial and organizational (I/O) psychologists play a key role in NASA astronaut candidate selection through the identification of the competencies necessary to successfully engage in the astronaut job. A set of psychosocial competencies, developed by I/O psychologists during a prior job analysis conducted in 1996 and updated in 2003, was identified as necessary for individuals working and living in the space shuttle and on the International Space Station (ISS). This set of competencies applied to the space shuttle and applies to current ISS missions, but may not apply to longer-duration or long-distance exploration missions. With the 2015 launch of the first 12-month ISS mission and the shift in the 2020s to missions beyond low earth orbit, the type of missions that astronauts will conduct and the environment in which they do their work will change dramatically, leading to new challenges for these crews. To support future astronaut selection, training, and research, I/O psychologists in NASA's Behavioral Health and Performance (BHP) Operations and Research groups engaged in a joint effort to conduct an updated analysis of the astronaut job for current and future operations. This project will result in the identification of behavioral competencies critical to performing the astronaut job, along with relative weights for each of the identified competencies, through the application of job analysis techniques. While this job analysis is being conducted according to job analysis best practices, the project poses a number of novel challenges. These include the need to identify competencies for multiple mission types simultaneously, to evaluate jobs that have no incumbents as they have never before been conducted, and to work with a very limited population of subject matter experts. Given these challenges, under the guidance of job analysis experts, we used the following methods to conduct the job analysis and identify the key competencies for current and

  11. Integrating subpathway analysis to identify candidate agents for hepatocellular carcinoma.

    Science.gov (United States)

    Wang, Jiye; Li, Mi; Wang, Yun; Liu, Xiaoping

    2016-01-01

    Hepatocellular carcinoma (HCC) is the second most common cause of cancer-associated death worldwide, characterized by high invasiveness and resistance to normal anticancer treatments. The need to develop new therapeutic agents for HCC is urgent. Here, we developed a bioinformatics method to identify potential novel drugs for HCC by integrating HCC-related and drug-affected subpathways. By using the RNA-seq data from the TCGA (The Cancer Genome Atlas) database, we first identified 1,763 differentially expressed genes between HCC and normal samples. Next, we identified 104 significant HCC-related subpathways. We also identified the subpathways associated with small molecular drugs in the CMap database. Finally, by integrating HCC-related and drug-affected subpathways, we identified 40 novel small molecular drugs capable of targeting these HCC-involved subpathways. In addition to previously reported agents (i.e., calmidazolium), our method also identified potentially novel agents for targeting HCC. We experimentally verified that one of these novel agents, prenylamine, induced HCC cell apoptosis using 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide, an acridine orange/ethidium bromide stain, and electron microscopy. In addition, we found that prenylamine not only affected several classic apoptosis-related proteins, including Bax, Bcl-2, and cytochrome c, but also increased caspase-3 activity. These candidate small molecular drugs identified by us may provide insights into novel therapeutic approaches for HCC. PMID:27022281
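
    The integration step can be pictured as ranking drugs by the overlap between disease-associated and drug-affected subpathways. The sketch below uses invented subpathway identifiers and drug sets (only calmidazolium and prenylamine appear in the abstract) and a plain set intersection in place of the paper's statistical scoring.

```python
# Rank candidate drugs by overlap with HCC-related subpathways (toy data).
hcc_subpathways = {"sp_cell_cycle_3", "sp_p53_1", "sp_apoptosis_2", "sp_wnt_4"}

drug_subpathways = {
    "calmidazolium": {"sp_p53_1", "sp_apoptosis_2"},
    "prenylamine": {"sp_apoptosis_2", "sp_cell_cycle_3", "sp_wnt_4"},
    "drug_x": {"sp_mapk_9"},  # hypothetical drug with no relevant overlap
}

ranked = sorted(drug_subpathways.items(),
                key=lambda kv: len(kv[1] & hcc_subpathways), reverse=True)
for drug, sps in ranked:
    hits = sorted(sps & hcc_subpathways)
    print(f"{drug}: {len(hits)} shared subpathway(s) {hits}")
```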

  12. Applications of nuclear analytical techniques for identifying the origin and composition of found radioactive materials

    International Nuclear Information System (INIS)

    Radioactive materials and sources have been used worldwide for the last 100 years - for medical diagnosis and therapy, industrial imaging and process monitoring, consumer applications, materials and biological research, and for generating nuclear energy - among other peaceful purposes. Many of the radioactive materials have been produced, and the associated nuclear science and technology developed, at major research sites such as the Chalk River Laboratories in Ontario, Canada. Sometimes undocumented radioactive materials associated with production, development or use are found, usually in the context of a legacy setting, and their composition and origin needs to be determined in order for these materials to be safely handled and securely dispositioned. The novel applications of nuclear analytical techniques, including mass spectroscopy, gamma and x-ray spectroscopy and neutron beam irradiation techniques, is presented in the context of some recent investigations. (author)

  13. Identifying and ranking the human resources management criteria influencing on organizational performance using MADM Fuzzy techniques

    OpenAIRE

    Saeed Safari; Mohammad Vazin Karimian; Ali Khosravi

    2014-01-01

    Human resources management plays an essential role in the success of organizations. This paper presents an empirical investigation to determine the main human resource management criteria and sub-criteria based on a survey of the existing literature and theoretical principles. The study has been applied in a municipality organization in Iran. The study uses the analytical hierarchy process as well as the fuzzy technique for order preference by similarity to ideal solution (TOPSIS) for prioritizing decisio...

  14. Methods and Techniques of Sampling, Culturing and Identifying of Subsurface Bacteria

    International Nuclear Information System (INIS)

    This report describes the sampling, culturing and identification of KURT underground bacteria, which exist as iron-, manganese-, and sulfate-reducing bacteria. The culturing methods and media preparation differed by bacterial species, which affects growth rates. It will be possible for the cultured bacteria to be used in various applied experiments and research in the future.

  15. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications - now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  16. Identifying desertification risk areas using fuzzy membership and geospatial technique – A case study, Kota District, Rajasthan

    Indian Academy of Sciences (India)

    Arunima Dasgupta; K L N Sastry; P S Dhinwa; V S Rathore; M S Nathawat

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research intends to identify the areas under risk of desertification along with their severity in terms of degradation in natural parameters. An integrated model with fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, including five specific natural parameters, namely slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a certain class was derived using the normal probability density function of that class. Thus, if a single class of a single parameter has mean μ and standard deviation σ, the values falling beyond μ + 2σ and below μ − 2σ do not represent that class, but a transitional zone between two subsequent classes. These are the most important areas in terms of degradation, as they have the lowest probability of being in a certain class, and hence the highest probability of being extended into the next class or narrowed down into the previous one. Eventually, these are the values which can be easily altered under exogenic influences, and hence they are identified as risk areas. The overall desertification risk is derived by incorporating the different risk severities of each parameter using a fuzzy rule-based inference system in a GIS environment. Multicriteria-based geostatistics are applied to locate the areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures are accelerating land deterioration, coupled with natural erosive forces. The four major sources of desertification in Kota are gully and ravine erosion, inappropriate mining practices, growing urbanization and random deforestation.
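
    A small sketch of the membership idea under stated assumptions: the class statistics below are invented, membership is taken as the normal probability density of a value under each class, and the μ ± 2σ rule marks transitional (risk) values.

```python
from scipy.stats import norm

# Invented slope-angle classes: name -> (mean, standard deviation), in degrees.
classes = {"low": (5.0, 1.5), "moderate": (12.0, 2.0), "steep": (25.0, 4.0)}

def memberships(x):
    # Membership of x in each class = normal pdf of that class at x.
    return {name: norm.pdf(x, mu, sd) for name, (mu, sd) in classes.items()}

def in_transition_zone(x):
    # True if x falls beyond mean +/- 2*std of its best-matching class.
    best = max(classes, key=lambda name: memberships(x)[name])
    mu, sd = classes[best]
    return abs(x - mu) > 2 * sd

for slope_deg in (11.0, 16.5, 34.0):
    print(slope_deg, "transitional" if in_transition_zone(slope_deg) else "stable")
```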

  17. Learning a novel technique to identify possible melanomas: are Australian general practitioners better than their U.K. colleagues?

    Directory of Open Access Journals (Sweden)

    Watson Tony

    2009-04-01

    Full Text Available Abstract Background Spectrophotometric intracutaneous analysis (SIAscopy™) is a multispectral imaging technique that is used to identify 'suspicious' (i.e. potentially malignant) pigmented skin lesions for further investigation. The MoleMate™ system is a hand-held scanner that captures SIAscopy™ images that are then classified by the clinician using a computerized diagnostic algorithm designed for the primary health care setting. The objectives of this study were to test the effectiveness of a computer program designed to train health care workers to identify the diagnostic features of SIAscopy™ images and to compare the results of a group of Australian and a group of English general practitioners (GPs). Methods Thirty GPs recruited from the Perth (Western Australia) metropolitan area completed the training program at a workshop held in March 2008. The accuracy and speed of their pre- and post-test scores were then compared with those of a group of 18 GPs (including 10 GP registrars) who completed a similar program at two workshops held in Cambridge (U.K.) in March and April, 2007. Results The median test score of the Australian GPs improved from 79.5% to 86.5% (median increase 5.5%). Conclusion Most of the SIAscopy™ features can be learnt to a reasonable degree of accuracy with this brief computer training program. Although the Australian GPs scored higher in the pre-test, both groups had similar levels of accuracy and speed in interpreting the SIAscopy™ features after completing the program. Scores were not affected by previous dermoscopy experience or dermatology training, which suggests that the MoleMate™ system is relatively easy to learn.

  18. An Analysis of Pyramidal Image Fusion Techniques

    OpenAIRE

    Meek, T. R.

    1999-01-01

    This paper discusses the application of multiresolution image fusion techniques to synthetic aperture radar (SAR) and Landsat imagery. Results were acquired through the development and application of image fusion software to test images. The test images were fused using six image fusion techniques that are combinations of three types of image decomposition algorithms (ratio of low pass [RoLP] pyramids, gradient pyramids, and morphological pyramids) and two types of fusion algorithms (se...

  19. Cochlear implant simulator for surgical technique analysis

    Science.gov (United States)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  20. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    A number of different techniques covering several different aspects of materials research are presented in this volume. They are concerned with property evaluation at 40 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  1. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
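
    As an illustration of the CUSUM technique mentioned above, here is a textbook two-sided tabular CUSUM applied to synthetic inventory differences; the reference value k and decision interval h are common textbook defaults, not values from this document.

```python
import numpy as np

def tabular_cusum(id_series, sigma, k=0.5, h=5.0):
    """Two-sided tabular CUSUM on inventory differences (IDs).
    k and h are expressed in units of sigma; target ID is assumed zero."""
    s_hi = s_lo = 0.0
    alarms = []
    for t, x in enumerate(id_series):
        z = x / sigma
        s_hi = max(0.0, s_hi + z - k)  # accumulates positive shifts (losses)
        s_lo = max(0.0, s_lo - z - k)  # accumulates negative shifts
        if s_hi > h or s_lo > h:
            alarms.append(t)
    return alarms

# Synthetic IDs: in-control noise, then a small sustained loss from period 30.
rng = np.random.default_rng(0)
ids = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.2, 1, 30)])
print(tabular_cusum(ids, sigma=1.0))  # alarms shortly after the shift begins
```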

  2. Methods and Techniques for miRNA Data Analysis.

    Science.gov (United States)

    Cristiano, Francesca; Veltri, Pierangelo

    2016-01-01

    Genomic data analysis consists of techniques to analyze and extract information from genes. In particular, genome sequencing technologies make it possible to characterize genomic profiles and identify biomarkers and mutations that can be relevant for diagnosis and the design of clinical therapies. Studies often concern the identification of genes related to inherited disorders, but recently mutations and phenotypes have been considered both in disease studies and in drug design, as well as in biomarker identification for early detection. Gene mutations are studied by comparing fold changes across redundant numeric and string representations of the analyzed genes, starting from macromolecules. This often means studying thousands of repetitions of gene representations and signatures identified by the available biological instruments, which, starting from biological samples, generate arrays of data representing nucleotide sequences of known genes in an often poorly characterized order. High-performance platforms and optimized algorithms are required to manipulate the gigabytes of raw data generated by the aforementioned biological instruments, such as NGS (Next-Generation Sequencing) and microarray platforms. Data analysis also requires the use of several tools and databases that store gene targets as well as gene ontologies and gene-disease associations. In this chapter we present an overview of available software platforms for genomic data analysis, as well as available databases with their query engines. PMID:26069024

  3. Market Analysis Identifies Community and School Education Goals.

    Science.gov (United States)

    Lindle, Jane C.

    1989-01-01

    Principals must realize the positive effects that marketing can have on improving schools and building support for them. Market analysis forces clarification of the competing needs and interests present in the community. The four marketing phases are needs assessment, analysis, goal setting, and public relations and advertising. (MLH)

  4. Structural parameter identifiability analysis for dynamic reaction networks

    DEFF Research Database (Denmark)

    Davidescu, Florin Paul; Jørgensen, Sten Bay

    2008-01-01

    A fundamental problem in model identification is to investigate whether unknown parameters in a given model structure potentially can be uniquely recovered from experimental data. This issue of global or structural identifiability is essential during nonlinear first principles model development, where for a given set of measured variables it is desirable to investigate which parameters may be estimated prior to spending computational effort on the actual estimation. This contribution addresses the structural parameter identifiability problem for the typical case of reaction network models. The...

  5. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Field (MRF) modeling, watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is then used. The DIS calculation is carried out for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about the likelihood of region boundaries for the next step (MRF), which gives an image that contains all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.

  6. Use of Photogrammetry and Biomechanical Gait analysis to Identify Individuals

    DEFF Research Database (Denmark)

    Larsen, Peter Kastmand; Simonsen, Erik Bruun; Lynnerup, Niels

    Photogrammetry and recognition of gait patterns are valuable tools to help identify perpetrators based on surveillance recordings. We have found that stature, but only few other measures, has a satisfying reproducibility for use in forensics. Several gait variables with high recognition rates were...

  7. Identifying and ranking the human resources management criteria influencing on organizational performance using MADM Fuzzy techniques

    Directory of Open Access Journals (Sweden)

    Saeed Safari

    2014-07-01

    Full Text Available Human resources management plays an essential role in the success of organizations. This paper presents an empirical investigation to determine the main human resource management criteria and sub-criteria based on a survey of the existing literature and theoretical principles. The study has been applied in a municipality organization in Iran. The study uses the analytical hierarchy process as well as the fuzzy technique for order preference by similarity to ideal solution (TOPSIS) for prioritizing decision tree criteria. The results indicate that the job design and human resource planning criteria are ranked highest. In addition, employee recruitment and selection, employee health and hygiene, training and development and compensation system criteria are other important criteria.

  8. Contribution of radioisotopic techniques to identify sentinel lymph-nodes (SLN) in breast cancer

    International Nuclear Information System (INIS)

    The SLN (one or several) is the first node to receive lymph from a tumor. When a cancer cell comes off the tumor and circulates along the outgoing lymph, it meets a barrier, the SLN, which intercepts and destroys it. If not, the cancer cell can stay and reproduce in the SLN, forming a metastasis which can affect other nodes in the same way. It has been shown that if the original tumor is small there is little chance that the SLN is invaded, and therefore little chance of dissemination to other lymph nodes. Nowadays, due to early detection, breast tumors are often smaller than one cm, and with such a size there is little chance of axillary lymph nodes being affected. If histological study confirms that the SLN is free of metastasis, it is not necessary to perform an axillary dissection. This identification of SLNs has been made possible by advances in radioisotopic techniques, which have been carried out in our hospital since 1997. We have been adapting this technique to the national supply of equipment and radiocompounds, always in a reliable and secure way. The aim of this presentation is to highlight the radioisotopic identification of SLNs in clinical investigation at the 'Angel H. Roffo Institute', and to compare its daily practice with Positron Emission Tomography (PET). By combining radioisotopic lymphography, lymphochromography and intrasurgical detection of the SLN with a gamma probe, we have obtained a true negative rate of 95% for the SLN, with 5% false negatives. Thanks to this method we have included SLN study in daily practice for breast tumor patients with tumors up to 5 cm in diameter. Comparing this method's results (5% false negatives) with the PET results using 18F-FDG, which has 33% false negatives, we conclude that a negative PET result cannot replace this method of SLN detection. (author)

  9. Combining Digital Watermarking and Fingerprinting Techniques to Identify Copyrights for Color Images

    Directory of Open Access Journals (Sweden)

    Shang-Lin Hsieh

    2014-01-01

    Full Text Available This paper presents a copyright identification scheme for color images that takes advantage of the complementary nature of watermarking and fingerprinting. It utilizes an authentication logo and the extracted features of the host image to generate a fingerprint, which is then stored in a database and also embedded in the host image to produce a watermarked image. When a dispute over the copyright of a suspect image occurs, the image is first processed by watermarking. If the watermark can be retrieved from the suspect image, the copyright can then be confirmed; otherwise, the watermark then serves as the fingerprint and is processed by fingerprinting. If a match in the fingerprint database is found, the suspect image is considered a duplicate. Because the proposed scheme utilizes both watermarking and fingerprinting, it is more robust than schemes that only adopt watermarking, and it can also obtain a preliminary result more quickly than those that only utilize fingerprinting. The experimental results show that when the watermarked image suffers slight attacks, watermarking alone is enough to identify the copyright. The results also show that when the watermarked image suffers heavy attacks that render watermarking incompetent, fingerprinting can successfully identify the copyright, demonstrating the effectiveness of the proposed scheme.
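
    The overall decision flow might be sketched as follows; the three callables are placeholders standing in for the scheme's watermark extraction, logo verification, and fingerprint search, none of whose internals are given in the abstract.

```python
def identify_copyright(suspect_image, fingerprint_db,
                       extract_watermark, logo_matches, nearest_fingerprint):
    """Decision flow only: try watermarking first, fall back to fingerprinting."""
    wm = extract_watermark(suspect_image)       # placeholder extraction routine
    if logo_matches(wm):                        # watermark survived: fast path
        return "copyright confirmed via watermark"
    # Heavy attacks: the degraded watermark serves as the fingerprint query.
    match = nearest_fingerprint(wm, fingerprint_db)
    return f"duplicate of registered image {match}" if match else "no claim found"
```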

  10. Rice transcriptome analysis to identify possible herbicide quinclorac detoxification genes

    OpenAIRE

    Xu, Wenying; Di, Chao; Zhou, Shaoxia; Liu, Jia; LI Li; Liu, Fengxia; Yang, Xinling; Ling, Yun; Su, Zhen

    2015-01-01

    Quinclorac is a highly selective auxin-type herbicide that is widely used for the effective control of barnyard grass in paddy rice fields, improving the world's rice yield. The herbicide mode of action of quinclorac has been proposed, and hormone interactions affecting quinclorac signaling have been identified. Because of its widespread use, quinclorac may be transported outside rice fields with the drainage waters, leading to soil and water pollution and other environmental health problems. In thi...

  11. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis.

    Science.gov (United States)

    Hawkes, Corinna

    2009-07-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in terms of food availability, prices, and marketing. Along with established forms of supply chain analysis, the method is informed by a historical overview of how food supply chains have changed over time. The method posits that the actors and actions in the chain are affected by organizational, financial, technological, and policy incentives and disincentives, which can in turn be levered for change. It presents a preliminary example of the supply of Coca-Cola beverages into school vending machines and identifies further potential applications. These include fruit and vegetable supply chains, local food chains, supply chains for health-promoting versions of food products, and identifying financial incentives in supply chains for healthier eating. PMID:23144674

  12. Association analysis identifies ZNF750 regulatory variants in psoriasis

    Directory of Open Access Journals (Sweden)

    Birnbaum Ramon Y

    2011-12-01

    Full Text Available Abstract Background Mutations in the ZNF750 promoter and coding regions have been previously associated with Mendelian forms of psoriasis and psoriasiform dermatitis. ZNF750 encodes a putative zinc finger transcription factor that is highly expressed in keratinocytes and represents a candidate psoriasis gene. Methods We examined whether ZNF750 variants were associated with psoriasis in a large case-control population. We sequenced the promoter and exon regions of ZNF750 in 716 Caucasian psoriasis cases and 397 Caucasian controls. Results We identified a total of 47 variants, including 38 rare variants, of which 35 were novel. Association testing identified two ZNF750 haplotypes associated with psoriasis. ZNF750 promoter and 5' UTR variants displayed a 35-55% reduction of ZNF750 promoter activity, consistent with the promoter activity reduction seen in a Mendelian psoriasis family with a ZNF750 promoter variant. However, the rare promoter and 5' UTR variants identified in this study did not strictly segregate with the psoriasis phenotype within families. Conclusions Two haplotypes of ZNF750 and rare 5' regulatory variants of ZNF750 were found to be associated with psoriasis. These rare 5' regulatory variants, though not causal, might serve as a genetic modifier of psoriasis.

  13. Potential of infrared spectroscopy in combination with extended canonical variate analysis for identifying different paper types

    International Nuclear Information System (INIS)

    The increasing use of secondary fiber in papermaking has led to the production of paper containing a wide range of contaminants. Wastepaper mills need to develop quality control methods for evaluating the incoming wastepaper stock as well as testing the specifications of the final product. The goal of this work is to present a fast and successful methodology for identifying different paper types. In this way, undesirable paper types can be refused, thus improving the runnability of the paper machine and the quality of the paper manufactured. In this work we examine various types of paper using information obtained by an appropriate chemometric treatment of infrared spectral data. For this purpose, we studied a large number of paper sheets of three different types (namely coated, offset and cast-coated) supplied by several paper manufacturers. We recorded Fourier transform infrared (FTIR) spectra with the aid of an attenuated total reflectance (ATR) module and near-infrared (NIR) reflectance spectra by means of fiber optics. Both techniques proved expeditious and required no sample pretreatment. The primary objective of this work was to develop a methodology for the accurate identification of samples of different paper types. For this purpose, we used the chemometric discrimination technique extended canonical variate analysis (ECVA) in combination with the k nearest neighbor (kNN) method to classify samples in the prediction set. Use of the NIR and FTIR techniques under these conditions allowed paper types to be identified with 100% success in prediction samples
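
    For the classification stage, a minimal kNN sketch is shown below; synthetic two-dimensional scores stand in for the ECVA-reduced spectral features, and scikit-learn's classifier stands in for the kNN step (ECVA itself is not reproduced).

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Synthetic 2-D "ECVA scores" for three paper types (invented class means).
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(m, 0.3, (30, 2)) for m in (0.0, 2.0, 4.0)])
y_train = np.repeat(["coated", "offset", "cast-coated"], 30)

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(clf.predict([[0.1, -0.2], [3.9, 4.1]]))  # -> ['coated' 'cast-coated']
```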

  14. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  15. Hybrid chemical and nondestructive-analysis technique

    Energy Technology Data Exchange (ETDEWEB)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities.

  16. Hybrid chemical and nondestructive analysis technique

    International Nuclear Information System (INIS)

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  17. Statistical Analysis Techniques for Small Sample Sizes

    Science.gov (United States)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered in the analysis of space-flight data is examined. Because only a limited amount of data is available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.
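
    As a concrete instance of such a test, here is a one-sample t-test on an invented set of six measurements; the key underlying assumption, as the abstract stresses, is approximate normality of the data.

```python
from scipy import stats

sample = [9.8, 10.4, 10.1, 9.7, 10.6, 10.2]     # n = 6 invented measurements
t, p = stats.ttest_1samp(sample, popmean=10.0)  # H0: true mean equals 10.0
print(f"t = {t:.3f}, p = {p:.3f}")              # small n -> wide uncertainty
```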

  18. Identifying Colluvial Slopes by Airborne LiDAR Analysis

    Science.gov (United States)

    Kasai, M.; Marutani, T.; Yoshida, H.

    2015-12-01

    Colluvial slopes are one of the major sources of landslides. Identifying the locations of these slopes helps reduce the risk of disasters, by avoiding building infrastructure and properties nearby or, if they are already there, by applying appropriate countermeasures before the slope suddenly moves. In this study, airborne LiDAR data were analyzed to find geomorphic characteristics that can be used to extract the locations of colluvial slopes. The study site was set in the suburbs of Sapporo City, Hokkaido, Japan. The area is underlain by andesite and tuff and is prone to landslides. Slope angle and surface roughness were calculated from a 5 m resolution DEM. These filters were chosen because colluvial materials deposit at around the angle of repose, and the accumulation of loose materials was considered to form a peculiar surface texture differentiable from other slope types. A field survey conducted together with the analysis suggested that colluvial slopes could be identified by the filters with a probability of 80 percent. Repeat LiDAR monitoring of the site by an unmanned helicopter indicated that the slopes detected as colluvium appeared to be moving at a slow rate. In comparison with a similar study from the crushed zone in Japan, the range of slope angle indicative of colluvium agreed with the Sapporo site, while the texture was rougher due to larger debris composing the slopes.
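
    A rough sketch of the two filters, assuming a regular-grid DEM: slope angle from finite differences and roughness as a local standard deviation of elevation. The window size and the candidate slope-angle range are illustrative, not the study's calibrated values.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def slope_deg(dem, cell=5.0):
    # Slope angle (degrees) from central differences on a 5 m grid.
    dzdy, dzdx = np.gradient(dem, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def roughness(dem, win=3):
    # Local standard deviation of elevation in a win x win window.
    mean = uniform_filter(dem, win)
    mean_sq = uniform_filter(dem ** 2, win)
    return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

dem = np.cumsum(np.random.default_rng(0).normal(0, 1, (100, 100)), axis=0)
s, r = slope_deg(dem), roughness(dem)
# Hypothetical rule: near the angle of repose and unusually rough.
candidate = (s > 30) & (s < 40) & (r > np.percentile(r, 75))
print(f"candidate colluvium fraction: {candidate.mean():.3f}")
```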

  19. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available in an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. There exists a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  20. Temperature-based Instanton Analysis: Identifying Vulnerability in Transmission Networks

    Energy Technology Data Exchange (ETDEWEB)

    Kersulis, Jonas [Univ. of Michigan, Ann Arbor, MI (United States); Hiskens, Ian [Univ. of Michigan, Ann Arbor, MI (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bienstock, Daniel [Columbia Univ., New York, NY (United States)

    2015-04-08

    A time-coupled instanton method for characterizing transmission network vulnerability to wind generation fluctuations is presented. To extend prior instanton work to multiple-time-step analysis, line constraints are specified in terms of temperature rather than current. An optimization formulation is developed to express the minimum wind forecast deviation such that at least one line is driven to its thermal limit. Results are shown for an IEEE RTS-96 system with several wind farms.

  1. Identifying Factors and Techniques to Decrease the Positive Margin Rate in Partial Mastectomies: Have We Missed the Mark?

    Science.gov (United States)

    Edwards, Sara B; Leitman, I Michael; Wengrofsky, Aaron J; Giddins, Marley J; Harris, Emily; Mills, Christopher B; Fukuhara, Shinichi; Cassaro, Sebastiano

    2016-05-01

    Breast conservation therapy (BCT) has a reported incidence of positive margins ranging widely in the literature, from 20% to 70%. Efforts have been made to refine standards for partial mastectomy and to predict which patients are at highest risk for incomplete excision. Most have focused on histology and demographics. We sought to further define modifiable risk factors for positive margins and residual disease. A retrospective study was conducted of 567 consecutive partial mastectomies by 21 breast and general surgeons from 2009 to 2012. Four hundred fourteen cases of neoplasm were reviewed for localization, intraoperative assessment, excision technique, and rates and results of re-excision/mastectomy. Histologic margins were positive in 23% of patients, 25% had margins 0.1-0.9 mm, and 7% had tumor within 1-1.9 mm. Residual tumor was identified in 61 cases: 38% (disease at margin), 21% (0.1-0.9 mm), and 14% (1-1.9 mm). Ductal carcinoma in situ (DCIS) was present in 85% of residual disease on re-excision and correlated with higher rates of re-excision. Localization requiring multiple needles was associated with 2-3 times the likelihood of positive margins compared with cases in which a single needle was required. The removal of additional margins at the initial surgery correlated with improved rates of complete excision when DCIS was present. Patients must have careful analysis of specimen margins at the time of surgery and may benefit from additional tissue excision or routine shaving of the resection cavity. Surgeons should conduct careful patient selection for BCT in the context of multifocal and multicentric disease. Patients for whom tumor localization requires bracketing may be at higher risk for positive margins and residual disease and should be counseled accordingly. PMID:26854189

  2. The use of nominal group technique in identifying community health priorities in Moshi rural district, northern Tanzania

    DEFF Research Database (Denmark)

    Makundi, E A; Manongi, R; Mushi, A K;

    2005-01-01

    This article highlights issues pertaining to identification of community health priorities in a resource poor setting. Community involvement is discussed by drawing experience of involving lay people in identifying priorities in health care through the use of Nominal Group Technique. The identified....... The patients/caregivers, women's group representatives, youth leaders, religious leaders and community leaders/elders constituted the principal subjects. Emphasis was on providing qualitative data, which are of vital consideration in multi-disciplinary oriented studies, and not on quantitative information from....... It is the provision of ownership of the derived health priorities to partners including the community that enhances research utilization of the end results. In addition to disease-based methods, the Nominal Group Technique is being proposed as an important research tool for involving the non-experts in priority...

  3. Identifying Isotropic Events Using an Improved Regional Moment Tensor Inversion Technique

    Energy Technology Data Exchange (ETDEWEB)

    Ford, S R; Dreger, D S; Walter, W R

    2007-07-06

    Using a regional time-domain waveform inversion for the complete moment tensor, we calculate the deviatoric and isotropic source components for several explosions at the Nevada Test Site, as well as for earthquakes and collapses in the surrounding region of the western US. The events separate into specific populations according to their deviation from a pure double-couple and their ratio of isotropic to deviatoric energy. The separation allows for anomalous event identification and discrimination between explosions, earthquakes, and collapses. Error in the moment tensor solutions and source parameters is also calculated. We investigate the sensitivity of the moment tensor solutions to Green's functions calculated with imperfect Earth models, inaccurate event locations, and data with a low signal-to-noise ratio. We also test the performance of the method under a range of recording conditions, from excellent azimuthal coverage to cases of sparse station availability, as might be expected for smaller events. Finally, we assess the dependence of depth and frequency upon event size. This analysis will be used to determine the range over which well-constrained solutions can be obtained.
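
    The decomposition underlying this discriminant is compact: the isotropic part of a moment tensor is one third of its trace times the identity, and the deviatoric part is the remainder. A minimal numeric sketch in Python (tensor values invented for illustration, not from the study):

      import numpy as np

      # Example symmetric moment tensor (values are illustrative only)
      M = np.array([[ 2.1, 0.3, -0.1],
                    [ 0.3, 1.9,  0.2],
                    [-0.1, 0.2,  2.0]])

      iso = np.trace(M) / 3.0 * np.eye(3)   # isotropic part
      dev = M - iso                         # deviatoric part

      # One simple discriminant: ratio of isotropic to total moment.
      # Explosions show a large positive isotropic component.
      m_iso = np.trace(M) / 3.0
      m_dev = np.max(np.abs(np.linalg.eigvalsh(dev)))
      ratio = m_iso / (abs(m_iso) + m_dev)
      print(f"isotropic-to-total ratio: {ratio:.2f}")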

  4. Predicting missing links and identifying spurious links via likelihood analysis.

    Science.gov (United States)

    Pan, Liming; Zhou, Tao; Lü, Linyuan; Hu, Chin-Kun

    2016-01-01

    Real network data are often incomplete and noisy, situations in which link prediction algorithms and spurious link identification algorithms can be applied. Thus far, a general method to transform network organizing mechanisms into link prediction algorithms has been lacking. Here we use an algorithmic framework in which a network's probability is calculated according to a predefined structural Hamiltonian that takes into account the network organizing principles, and a non-observed link is scored by the conditional probability of adding the link to the observed network. Extensive numerical simulations show that the proposed algorithm has remarkably higher accuracy than state-of-the-art methods in uncovering missing links and identifying spurious links in many complex biological and social networks. The method also finds applications in exploring underlying network evolutionary mechanisms. PMID:26961965
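
    The paper's structural Hamiltonian is not reproduced in the abstract, so the following Python sketch substitutes a toy Hamiltonian that lowers the energy by a coupling J for each common neighbor of a candidate link's endpoints; a non-observed link is then scored by exp(-ΔH), an unnormalized conditional probability of adding it. The network data, the Hamiltonian form, and J are all illustrative assumptions:

      import math
      from itertools import combinations

      # Toy observed network (illustrative data)
      edges = {(1, 2), (1, 3), (2, 3), (3, 4), (4, 5)}
      nodes = {n for e in edges for n in e}
      adj = {n: set() for n in nodes}
      for a, b in edges:
          adj[a].add(b)
          adj[b].add(a)

      J = 1.0  # coupling constant (assumed)

      def delta_H(a, b):
          # Energy change of adding link (a, b): toy Hamiltonian that
          # lowers energy by J per common neighbor of the endpoints.
          return -J * len(adj[a] & adj[b])

      # Score every non-observed pair; higher = more likely a missing link
      scores = {(a, b): math.exp(-delta_H(a, b))
                for a, b in combinations(sorted(nodes), 2)
                if b not in adj[a]}
      for pair, s in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(pair, round(s, 2))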

  5. Paired cost comparison, a benchmarking technique for identifying areas of cost improvement in environmental restoration projects and waste management activities

    International Nuclear Information System (INIS)

    This paper provides an overview of benchmarking and how the Department of Energy's Office of Environmental Restoration and Waste Management used benchmarking techniques, specifically the Paired Cost Comparison, to identify cost disparities and their causes. The paper includes a discussion of the project categories selected for comparison and the criteria used to select the projects. Results are presented and factors that contribute to cost differences are discussed. Also, conclusions and the application of the Paired Cost Comparison are presented

  6. Techniques for identifying the applicability of new information management technologies in the clinical setting: an example focusing on handheld computers.

    OpenAIRE

    Sittig, D. F.; Jimison, H. B.; Hazlehurst, B. L.; Churchill, B. E.; Lyman, J. A.; Mailhot, M. F.; Quick, E. A.; Simpson, D A

    2000-01-01

    This article describes techniques and strategies used to judge the potential applicability of new information management technologies in the clinical setting and to develop specific design recommendations for new features and services. We focus on a project carried out to identify the potential uses of handheld computers (i.e., the Palm Pilot or a small WinCE-based device) in the ambulatory practice setting. We found that the potential for a robust handheld computing device to positively affe...

  7. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
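
    The Monte Carlo parameter-variation idea can be sketched compactly: perturb the measured channel voltages with the one-sigma error functions, push each perturbed set through the unfold, and read the spread of the resulting fluxes. In the Python sketch below, the unfold() function is a stand-in (the real Dante unfold algorithm and calibration errors are not given in the abstract), and the 5% errors are assumed:

      import numpy as np

      rng = np.random.default_rng(0)

      n_channels = 18
      voltages = rng.uniform(0.5, 2.0, n_channels)  # measured voltages (illustrative)
      sigma = 0.05 * voltages                       # combined 1-sigma errors (assumed 5%)

      def unfold(v):
          # Stand-in for the real spectral unfold algorithm: here the
          # "flux" is just a weighted sum of channel voltages.
          return np.sum(v * np.linspace(1.0, 2.0, v.size))

      # 1000 test voltage sets drawn from the Gaussian error functions
      trials = rng.normal(voltages, sigma, size=(1000, n_channels))
      fluxes = np.array([unfold(v) for v in trials])

      print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std():.3f}")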

  8. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  9. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  10. Messina: a novel analysis tool to identify biologically relevant molecules in disease.

    Directory of Open Access Journals (Sweden)

    Mark Pinese

    Full Text Available BACKGROUND: Morphologically similar cancers display heterogeneous patterns of molecular aberrations and follow substantially different clinical courses. This diversity has become the basis for the definition of molecular phenotypes, with significant implications for therapy. Microarray or proteomic expression profiling is conventionally employed to identify disease-associated genes; however, traditional approaches for the analysis of profiling experiments may miss molecular aberrations which define biologically relevant subtypes. METHODOLOGY/PRINCIPAL FINDINGS: Here we present Messina, a method that can identify those genes that only sometimes show aberrant expression in cancer. We demonstrate with simulated data that Messina is highly sensitive and specific when used to identify genes which are aberrantly expressed in only a proportion of cancers, and compare Messina to contemporary analysis techniques. We illustrate Messina by using it to detect the aberrant expression of a gene that may play an important role in pancreatic cancer. CONCLUSIONS/SIGNIFICANCE: Messina allows the detection of genes with profiles typical of markers of molecular subtype, and complements existing methods to assist the identification of such markers. Messina is applicable to any global expression profiling data and, to allow its easy application, has been packaged into a freely available stand-alone software package.

  11. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. Based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms, a modified algorithm is presented. To apply the algorithm, a network representation transformation is made first.
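
    A standard way to make "strongest association path" amenable to shortest-path machinery is to convert association strengths in (0, 1] into additive lengths via -log(strength), so that minimizing total length maximizes the product of strengths along the path. The Python sketch below applies this transformation and runs plain Dijkstra (not the paper's two-tree/PFS variant); the association data are invented:

      import heapq
      import math

      # Association strengths in (0, 1] between entities (illustrative data)
      strength = {
          ("A", "B"): 0.9, ("B", "C"): 0.8,
          ("A", "D"): 0.5, ("D", "C"): 0.95,
      }

      # Undirected graph with edge lengths -log(strength)
      graph = {}
      for (u, v), s in strength.items():
          w = -math.log(s)
          graph.setdefault(u, []).append((v, w))
          graph.setdefault(v, []).append((u, w))

      def strongest_path(src, dst):
          # Dijkstra over -log weights = maximum-product association path
          dist, prev = {src: 0.0}, {}
          pq = [(0.0, src)]
          while pq:
              d, u = heapq.heappop(pq)
              if u == dst:
                  break
              if d > dist.get(u, math.inf):
                  continue
              for v, w in graph[u]:
                  nd = d + w
                  if nd < dist.get(v, math.inf):
                      dist[v], prev[v] = nd, u
                      heapq.heappush(pq, (nd, v))
          path, node = [dst], dst
          while node != src:
              node = prev[node]
              path.append(node)
          return path[::-1], math.exp(-dist[dst])

      print(strongest_path("A", "C"))  # (['A', 'B', 'C'], ~0.72)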

  12. Psychoanalytic technique and 'analysis terminable and interminable'.

    Science.gov (United States)

    Sandler, J

    1988-01-01

    Some of the implications for psychoanalytic technique of the papers given at the plenary sessions of the Montreal Congress are considered. Emphasis is placed on the role of affects in development and in current psychic functioning. Motivation for unconscious wishes arises from many sources, and affects should not only be thought of as drive derivatives. There is a substantial gap between the (largely) implicit clinico-technical theories in the analytic work presented, which do in fact show great sensitivity to the patients' affects, and the formal 'official' general psychoanalytic theory used. This discrepancy in our theories should be faced. Freud's tripartite structural theory of the mind (the 'second topography') seems now to have limitations for clinical purposes. PMID:3063676

  13. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain operating modal identification global scheme and a frequency-domain scheme for output-only data, coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
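
    The property the schemes exploit can be illustrated with a simple estimator: the cross-correlation of two stationary responses decays like a free vibration with the system's own poles, so it can be fed to any impulse-response-based modal estimator. A Python sketch with synthetic response data (all signal parameters are illustrative):

      import numpy as np

      rng = np.random.default_rng(1)
      fs, n = 256.0, 8192
      t = np.arange(n) / fs

      # Illustrative "measured" responses: a 12 Hz lightly damped mode + noise
      mode = np.exp(-0.4 * t) * np.sin(2 * np.pi * 12 * t)
      x = np.convolve(rng.normal(size=n), mode)[:n] + 0.1 * rng.normal(size=n)
      y = np.convolve(rng.normal(size=n), mode)[:n] + 0.1 * rng.normal(size=n)

      def cross_correlation(a, b, max_lag):
          # Unbiased estimate of R_ab(tau) for tau = 0..max_lag-1 samples
          m = len(a)
          a = a - a.mean()
          b = b - b.mean()
          return np.array([np.dot(a[:m - k], b[k:]) / (m - k)
                           for k in range(max_lag)])

      R = cross_correlation(x, y, max_lag=512)
      # R has the form of a sum of decaying sinusoids (same poles as the
      # impulse response) and can be passed to a time-domain modal estimator.
      print(R[:5])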

  14. Network stratification analysis for identifying function-specific network layers.

    Science.gov (United States)

    Zhang, Chuanchao; Wang, Jiguang; Zhang, Chao; Liu, Juan; Xu, Dong; Chen, Luonan

    2016-04-22

    A major challenge of systems biology is to capture the rewiring of biological functions (e.g. signaling pathways) in a molecular network. To address this problem, we proposed a novel computational framework, namely network stratification analysis (NetSA), to stratify the whole biological network into various function-specific network layers corresponding to particular functions (e.g. KEGG pathways), transforming network analysis from the gene level to the functional level by integrating expression data, the gene/protein network and gene ontology information altogether. The application of NetSA in yeast and its comparison with a traditional network partition both suggest that NetSA can more effectively reveal functional implications of network rewiring and extract significant phenotype-related biological processes. Furthermore, for time-series or stage-wise data, the function-specific network layer obtained by NetSA is also shown to be able to characterize disease progression in a dynamic manner. In particular, when applying NetSA to hepatocellular carcinoma and type 1 diabetes, we can derive functional spectra regarding the progression of the disease, and capture active biological functions (i.e. active pathways) in different disease stages. The additional comparison between NetSA and SPIA illustrates again that NetSA can discover more complete biological functions during disease progression. Overall, NetSA provides a general framework to stratify a network into various layers of function-specific sub-networks, which can not only analyze a biological network on the functional level but also investigate gene rewiring patterns in biological processes. PMID:26879865

  15. Potential of isotope analysis (C, Cl) to identify dechlorination mechanisms

    Science.gov (United States)

    Cretnik, Stefan; Thoreson, Kristen; Bernstein, Anat; Ebert, Karin; Buchner, Daniel; Laskov, Christine; Haderlein, Stefan; Shouakar-Stash, Orfan; Kliegman, Sarah; McNeill, Kristopher; Elsner, Martin

    2013-04-01

    Chloroethenes are commonly used in industrial applications and are detected as carcinogenic contaminants in the environment. Their dehalogenation is of environmental importance in remediation processes. However, a frequently encountered problem is the accumulation of toxic degradation products such as cis-dichloroethylene (cis-DCE) at contaminated sites. Several studies have addressed the reductive dehalogenation reactions using biotic and abiotic model systems, but a crucial question in this context has remained open: do environmental transformations occur by the same mechanism as in their corresponding in vitro model systems? The presented study shows the potential to close this research gap using the latest developments in compound-specific chlorine isotope analysis, which make it possible to routinely measure chlorine isotope fractionation of chloroethenes in environmental samples and complex reaction mixtures.1,2 In particular, such chlorine isotope analysis enables the measurement of isotope fractionation for two elements (i.e., C and Cl) in chloroethenes. When the isotope values of both elements are plotted against each other, different slopes reflect different underlying mechanisms and are remarkably insensitive towards masking. Our results suggest that different microbial strains (G. lovleyi strain SZ, D. hafniense Y51) and the isolated cofactor cobalamin employ similar mechanisms of reductive dechlorination of TCE. In contrast, evidence for a different mechanism was obtained with cobaloxime, cautioning against its use as a model for biodegradation. The study shows the potential of the dual isotope approach as a tool to directly compare transformation mechanisms of environmental scenarios, biotic transformations, and their putative chemical lab-scale systems. Furthermore, it serves as an essential reference when using the dual isotope approach to assess the fate of chlorinated compounds in the environment.
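
    The dual-isotope comparison itself reduces to estimating the slope of δ37Cl versus δ13C over the course of degradation and comparing slopes between systems. A minimal Python sketch with invented isotope values (not the study's measurements):

      import numpy as np

      # Invented carbon and chlorine isotope signatures along a degradation run
      delta_13C  = np.array([-30.0, -28.5, -26.8, -24.9, -23.1])
      delta_37Cl = np.array([ -2.0,  -1.4,  -0.7,   0.1,   0.8])

      # Slope of the dual-isotope plot (ordinary least squares)
      slope, intercept = np.polyfit(delta_13C, delta_37Cl, 1)
      print(f"dual-isotope slope: {slope:.2f}")
      # Two systems sharing a mechanism should show statistically
      # indistinguishable slopes, largely independent of masking.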

  16. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments
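
    The fold-back calculation at the core of such a decision analysis is simple: weight consequence utilities by elicited event probabilities and rank alternatives by expected utility. A minimal Python sketch with invented probabilities and utilities (not Hanford's elicited values):

      # Each alternative maps to (probability, utility) pairs over outcomes.
      # All numbers below are invented for illustration.
      alternatives = {
          "remediate now":      [(0.7, 80), (0.3, 40)],
          "characterize first": [(0.5, 90), (0.5, 55)],
          "defer":              [(0.2, 100), (0.8, 20)],
      }

      def expected_utility(lotteries):
          # Probabilities over outcomes must sum to one
          assert abs(sum(p for p, _ in lotteries) - 1.0) < 1e-9
          return sum(p * u for p, u in lotteries)

      ranked = sorted(alternatives.items(),
                      key=lambda kv: expected_utility(kv[1]), reverse=True)
      for name, lots in ranked:
          print(f"{name}: EU = {expected_utility(lots):.1f}")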

  17. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    Science.gov (United States)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established and more speculative techniques; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. In this presentation, we provide an overview of progress from the first 18 months of the HELCATS project.

  18. Use of fuzzy techniques for analysis of dynamic loads in power systems

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Presents the use of fuzzy techniques for the analysis of dynamic load characteristics of power systems to identify the voltage stability (collapse) of a weak bus, and concludes from the consistent results obtained that this is a useful tool for the analysis of the load characteristics of sophisticated power systems and their components.

  19. Identifying Clusters of Concepts in a Low Cohesive Class for Extract Class Refactoring Using Metrics Supplemented Agglomerative Clustering Technique

    CERN Document Server

    Rao, A Ananda

    2012-01-01

    Object-oriented software with low cohesive classes can increase maintenance cost. Low cohesive classes are likely to be introduced into the software during initial design, due to deviation from design principles, and during evolution, due to software deterioration. A low cohesive class performs operations that should be done by two or more classes. Low cohesive classes need to be identified and refactored using extract class refactoring to improve cohesion. Two aspects are involved in this: the first is to identify the low cohesive classes, and the second is to identify the clusters of concepts in the low cohesive classes for extract class refactoring. In this paper, we propose a metrics-supplemented agglomerative clustering technique covering both aspects. The proposed metrics are validated using Weyuker's properties. The approach is applied successfully to two examples and a case study.
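
    One common way to realize the second aspect is to encode each method of the low cohesive class by the set of attributes it uses and agglomeratively cluster the methods by Jaccard distance; each resulting cluster is a candidate concept to extract into a new class. A Python sketch along these lines (the method-attribute matrix is invented, and the paper's supplementary metrics are not reproduced):

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage
      from scipy.spatial.distance import pdist

      methods = ["open", "read", "close", "log_start", "log_write"]
      attributes = ["file_handle", "buffer", "log_stream"]

      # usage[i, j] = True if method i touches attribute j (invented example)
      usage = np.array([[1, 0, 0],
                        [1, 1, 0],
                        [1, 0, 0],
                        [0, 0, 1],
                        [0, 0, 1]], dtype=bool)

      # Jaccard distance between attribute-usage sets, then average linkage
      Z = linkage(pdist(usage, metric="jaccard"), method="average")
      labels = fcluster(Z, t=0.5, criterion="distance")

      for cluster in sorted(set(labels)):
          members = [m for m, c in zip(methods, labels) if c == cluster]
          print(f"concept {cluster}: {members}")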

  20. Accelerator based techniques for aerosol analysis

    International Nuclear Information System (INIS)

    At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy), an external beam facility is fully dedicated to PIXE-PIGE measurements of the elemental composition of atmospheric aerosols. Examples from recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be shown how PIXE can provide unique information in aerosol studies or can play a complementary role to traditional chemical analysis. Finally, a short presentation of 14C analysis of the atmospheric aerosol by Accelerator Mass Spectrometry (AMS), for evaluating the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity), will be given. (author)

  1. Comparison of Commonly Used Accident Analysis Techniques for Manufacturing Industries

    Directory of Open Access Journals (Sweden)

    IRAJ MOHAMMADFAM

    2015-10-01

    Full Text Available The adverse consequences of major accident events have led to the development of accident analysis techniques to investigate accidents thoroughly. However, each technique has its own advantages and shortcomings, which makes it very difficult to find a single technique capable of analyzing all types of accidents. Therefore, comparing accident analysis techniques helps establish their capabilities in different circumstances so that the most suitable one can be chosen. In this research, the techniques CBA and AABF were compared with Tripod β in order to determine the superior technique for the analysis of major accidents in manufacturing industries. In the first step, the comparison criteria were developed using the Delphi Method. Afterwards, the relative importance of each criterion was qualitatively determined, and the qualitative values were then converted to quantitative values by applying fuzzy triangular numbers. Finally, TOPSIS was used to prioritize the techniques in terms of the preset criteria. The results of the study showed that Tripod β is superior to CBA and AABF. It is highly recommended to compare all available accident analysis techniques against proper criteria in order to select the best one, since an improper choice of accident analysis technique may lead to misguided results.
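
    For reference, the TOPSIS step itself is mechanical once the fuzzy judgments have been defuzzified into a weighted decision matrix: normalize, locate the ideal and anti-ideal solutions, and rank by relative closeness. A Python sketch with invented scores and weights (not the study's data):

      import numpy as np

      techniques = ["Tripod beta", "CBA", "AABF"]
      # rows = techniques, columns = criteria scores (invented), all benefit-type
      X = np.array([[8.0, 7.5, 9.0],
                    [6.0, 7.0, 6.5],
                    [5.5, 6.0, 7.0]])
      w = np.array([0.5, 0.3, 0.2])  # criteria weights (invented)

      # 1) vector-normalize columns, 2) apply weights
      V = w * X / np.linalg.norm(X, axis=0)

      # 3) ideal and anti-ideal solutions (benefit criteria: max is ideal)
      ideal, anti = V.max(axis=0), V.min(axis=0)

      # 4) distances to both and closeness coefficient
      d_pos = np.linalg.norm(V - ideal, axis=1)
      d_neg = np.linalg.norm(V - anti, axis=1)
      closeness = d_neg / (d_pos + d_neg)

      for name, c in sorted(zip(techniques, closeness), key=lambda t: -t[1]):
          print(f"{name}: {c:.3f}")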

  2. ANALYSIS AND COMPARATIVE STUDY OF SEARCHING TECHNIQUES

    OpenAIRE

    Yuvraj Singh Chandrawat

    2015-01-01

    We live in the age of technology, and it is quite obvious that its reach is increasing day by day. In this technical era, researchers are focusing on the development of existing technologies. Software engineering is the dominant branch of Computer Science that deals with the development and analysis of software. The objective of this study is to analyze and compare the existing searching algorithms (linear search and binary search). In this paper, we will discuss both thes...
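
    For concreteness, the two algorithms under comparison in their common textbook form (a sketch; the paper's own implementations are not available):

      def linear_search(items, target):
          # O(n): works on unsorted data
          for i, v in enumerate(items):
              if v == target:
                  return i
          return -1

      def binary_search(sorted_items, target):
          # O(log n): requires sorted data
          lo, hi = 0, len(sorted_items) - 1
          while lo <= hi:
              mid = (lo + hi) // 2
              if sorted_items[mid] == target:
                  return mid
              if sorted_items[mid] < target:
                  lo = mid + 1
              else:
                  hi = mid - 1
          return -1

      data = [3, 9, 14, 27, 31, 52]
      print(linear_search(data, 27), binary_search(data, 27))  # 3 3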

  3. DNA ANALYSIS OF RICIN USING RAPD TECHNIQUE

    OpenAIRE

    Martin Vivodík; Želmíra Balážová; Zdenka Gálová

    2014-01-01

    Castor (Ricinus communis L.) is an important plant for production of industrial oil. The systematic evaluation of the molecular diversity encompassed in castor inbreds or parental lines offers an efficient means of exploiting the heterosis in castor as well as for management of biodiversity. The aim of this work was to detect genetic variability among the set of 30 castor genotypes using 5 RAPD markers. Amplification of genomic DNA of 30 genotypes, using RAPD analysis, yielded 35 fragments, w...

  4. LIFECYCLE ANALYSIS AS THE CORPORATE ENVIRONMENTAL RESPONSIBILITY ASSESSMENT TECHNIQUE

    OpenAIRE

    Bojan Krstic, Milica Tasic, Vladimir Ivanovic

    2015-01-01

    Lifecycle analysis is one of the techniques for assessing the impact of an enterprise on the environment, by monitoring the environmental effects of a product along its lifecycle. Since the cycle can be seen in stages (extraction of raw materials, raw materials processing, final product production, product use and end of use of the product), the analysis can be applied to all or only some parts of the aforementioned cycle, hence the different variants of this technique. The analysis itself is defi...

  5. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication

  6. Meconium microbiome analysis identifies bacteria correlated with premature birth.

    Directory of Open Access Journals (Sweden)

    Alexandria N Ardissone

    Full Text Available Preterm birth is the second leading cause of death in children under the age of five years worldwide, but the etiology of many cases remains enigmatic. The dogma that the fetus resides in a sterile environment is being challenged by recent findings, and the question has arisen whether microbes that colonize the fetus may be related to preterm birth. It has been posited that meconium reflects the in-utero microbial environment. In this study, correlations between fetal intestinal bacteria from meconium and gestational age were examined in order to suggest underlying mechanisms that may contribute to preterm birth. Meconium from 52 infants ranging in gestational age from 23 to 41 weeks was collected, the DNA extracted, and 16S rRNA analysis performed. Resulting taxa of microbes were correlated to clinical variables and also compared to previous studies of amniotic fluid and other human microbiome niches. Increased detection of bacterial 16S rRNA in meconium of infants of <33 weeks gestational age was observed. Approximately 61·1% of reads sequenced were classified to genera that have been reported in amniotic fluid. Gestational age had the largest influence on microbial community structure (R = 0·161; p = 0·029), while mode of delivery (C-section versus vaginal delivery) had an effect as well (R = 0·100; p = 0·044). Enterobacter, Enterococcus, Lactobacillus, Photorhabdus, and Tannerella were negatively correlated with gestational age and have been reported to incite inflammatory responses, suggesting a causative role in premature birth. This provides the first evidence to support the hypothesis that the fetal intestinal microbiome derived from swallowed amniotic fluid may be involved in the inflammatory response that leads to premature birth.

  7. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  8. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  9. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by the application systems in an organization is vital for decision making. For this reason, the quality of data provided by a Data Warehouse (DW) is very important if an organization is to produce the best solutions and move forward. DWs are complex systems that have to deliver highly-aggregated, high-quality data from heterogeneous sources to decision makers, and they involve substantial integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques compare target values with the current values obtained from the systems. A prototype using PHP was developed to support the Base Analysis Technique. A sample schema from an Oracle database was then used to study the differences between applying and not applying the framework. The prototype was demonstrated to selected organizations to identify whether it helps reduce DQ problems, and questionnaires were given to respondents. Results: The results show that users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: Implementation of the suggested framework in a real setting should be conducted to obtain more accurate results.

  10. Key Point Based Data Analysis Technique

    Science.gov (United States)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.

  11. Book Review: Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects

    Directory of Open Access Journals (Sweden)

    Thomas Nash

    2013-06-01

    Full Text Available Shavers, B. (2013). Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Waltham, MA: Elsevier, 290 pages, ISBN-978-1-59749-985-9, US$51.56. Includes bibliographical references and index. Reviewed by Detective Corporal Thomas Nash (tnash@bpdvt.org), Burlington Vermont Police Department, Internet Crimes against Children Task Force; Adjunct Instructor, Champlain College, Burlington VT. In this must-read for any aspiring novice cybercrime investigator, as well as for the seasoned professional computer guru, Brett Shavers takes the reader into the ever-changing and dynamic world of cybercrime investigation. Shavers, an experienced criminal investigator, lays out the details and intricacies of a computer-related crime investigation in a clear and concise manner in his new, easy-to-read publication, Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Shavers takes the reader from start to finish through each step of the investigative process in well-organized and easy-to-follow sections, with real case file examples, to reach the ultimate goal of any investigation: identifying the suspect and proving their guilt in the crime. Do not be fooled by the title. This excellent, easily accessible reference is beneficial to both criminal and civil investigations and should be in every investigator's library, regardless of their respective criminal or civil investigative responsibilities. (see PDF for full review)

  12. Nuclear fuel lattice performance analysis by data mining techniques

    International Nuclear Information System (INIS)

    Highlights: • This paper shows a data mining application to analyse nuclear fuel lattice designs. • Data mining methods were used to predict whether fuel lattices could operate adequately in the BWR reactor core. • Data mining methods learned from fuel lattice datasets simulated with SIMULATE-3. • Results show high recognition percentages of adequate or inadequate fuel lattice performance. - Abstract: In this paper a data mining analysis of BWR nuclear fuel lattice performance is shown. In a typical three-dimensional simulation, the reactor operation simulator gives the core performance for a fuel lattice configuration, measured by thermal limits, shutdown margin and produced energy. Based on these results we can determine the number of fulfilled parameters of a fuel lattice configuration. It is interesting to establish a relationship between the fuel lattice properties and the number of fulfilled core parameters in steady-state reactor operation, and data mining techniques were used for this purpose. Results indicate that these techniques are able to predict with sufficient accuracy (greater than 75%) whether a given fuel lattice configuration will have either a “good” or a “bad” performance according to the reactor core simulation. In this way, they could be coupled with an optimization process to discard fuel lattice configurations with poor performance and thereby accelerate the optimization process. Data mining techniques apply filter methods to discard those variables with lower influence on the number of fulfilled core parameters. From this, it was also possible to identify a set of variables to be used in new optimization codes with objective functions different from those normally used
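
    A minimal sketch of the classification idea — learning to label lattice configurations as adequate or not from precomputed features — using a decision tree on synthetic stand-in data (the real features and labels come from SIMULATE-3 runs, which are not available here):

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(42)

      # Synthetic stand-ins for lattice features (e.g., enrichment,
      # burnable poison rods, local peaking); labels mimic
      # "fulfils the core parameters" vs "does not"
      X = rng.normal(size=(500, 6))
      y = (X[:, 0] + 0.5 * X[:, 1] - 0.8 * X[:, 2] > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

      print(f"accuracy: {clf.score(X_te, y_te):.2f}")
      # feature_importances_ plays the role of the filter step, flagging
      # variables with low influence on the outcome.
      print(np.round(clf.feature_importances_, 2))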

  13. High-level power analysis and optimization techniques

    Science.gov (United States)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  14. Comparative evaluation of features and techniques for identifying activity type and estimating energy cost from accelerometer data.

    Science.gov (United States)

    Kate, Rohit J; Swartz, Ann M; Welch, Whitney A; Strath, Scott J

    2016-03-01

    Wearable accelerometers can be used to objectively assess physical activity. However, the accuracy of this assessment depends on the underlying method used to process the time series data obtained from accelerometers. Several methods have been proposed that use this data to identify the type of physical activity and estimate its energy cost. Most of the newer methods employ some machine learning technique along with suitable features to represent the time series data. This paper experimentally compares several of these techniques and features on a large dataset of 146 subjects doing eight different physical activities while wearing an accelerometer on the hip. Besides features based on statistics, distance-based features and simple discrete features taken straight from the time series were also evaluated. On the physical activity type identification task, the results show that using more features significantly improves results. The choice of machine learning technique was also found to be important. However, on the energy cost estimation task, the choice of features and machine learning technique was found to be less influential. On that task, separate energy cost estimation models trained specifically for each type of physical activity were found to be more accurate than a single model trained for all types of physical activities. PMID:26862679
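
    A sketch of the windowed statistical feature extraction that precedes any of the compared classifiers (the paper's distance-based and simple discrete features are not reproduced; data, sampling rate, and window length are illustrative):

      import numpy as np

      def window_features(acc, fs=30, win_s=10):
          # acc: 1-D accelerometer magnitude signal; returns one feature
          # row per non-overlapping window
          n = int(fs * win_s)
          rows = []
          for start in range(0, len(acc) - n + 1, n):
              w = acc[start:start + n]
              rows.append([
                  w.mean(), w.std(),                # moments
                  np.percentile(w, 10), np.percentile(w, 90),
                  np.corrcoef(w[:-1], w[1:])[0, 1]  # lag-1 autocorrelation
              ])
          return np.array(rows)

      rng = np.random.default_rng(3)
      signal = np.abs(rng.normal(1.0, 0.3, 30 * 60))  # 1 min of fake data at 30 Hz
      F = window_features(signal)
      print(F.shape)  # (6, 5): six 10-second windows, five features each
      # F can be fed to any classifier (activity type) or regressor (energy cost).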

  15. Analysis and calibration techniques for superconducting resonators

    Science.gov (United States)

    Cataldo, Giuseppe; Wollack, Edward J.; Barrentine, Emily M.; Brown, Ari D.; Moseley, S. Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.
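
    The second analysis path rests on textbook two-port relations: a uniform line segment of length l has ABCD matrix [[cosh γl, Z0 sinh γl], [sinh γl / Z0, cosh γl]], from which S21 follows for a given reference impedance. A numeric Python sketch (line parameters invented, not fitted to the devices):

      import numpy as np

      def line_abcd(gamma, Z0, length):
          # ABCD matrix of a uniform transmission line segment
          gl = gamma * length
          return np.array([[np.cosh(gl),       Z0 * np.sinh(gl)],
                           [np.sinh(gl) / Z0,  np.cosh(gl)]])

      def s21(abcd, Zref=50.0):
          # Standard ABCD-to-S21 conversion for equal reference impedances
          A, B = abcd[0]
          C, D = abcd[1]
          return 2.0 / (A + B / Zref + C * Zref + D)

      for f in np.linspace(1e9, 10e9, 5):
          beta = 2 * np.pi * f / 1.2e8    # phase constant (invented phase velocity)
          gamma = 1e-3 + 1j * beta        # small loss assumed
          m = line_abcd(gamma, Z0=50.0, length=0.01)
          print(f"{f/1e9:.1f} GHz  |S21| = {abs(s21(m)):.4f}")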

  16. Identifying Repetitive Institutional Review Board Stipulations by Natural Language Processing and Network Analysis.

    Science.gov (United States)

    Kury, Fabrício S P; Cimino, James J

    2015-01-01

    The corrections ("stipulations") to a proposed research study protocol produced by an institutional review board (IRB) can often be repetitive across many studies; however, there is no standard set of stipulations that could be used, for example, by researchers wishing to anticipate and correct problems in their research proposals prior to submitting to an IRB. The objective of the research was to computationally identify the most repetitive types of stipulations generated in the course of IRB deliberations. The text of each stipulation was normalized using the natural language processing techniques. An undirected weighted network was constructed in which each stipulation was represented by a node, and each link, if present, had weight corresponding to the TF-IDF Cosine Similarity of the stipulations. Network analysis software was then used to identify clusters in the network representing similar stipulations. The final results were correlated with additional data to produce further insights about the IRB workflow. From a corpus of 18,582 stipulations we identified 31 types of repetitive stipulations. Those types accounted for 3,870 stipulations (20.8% of the corpus) produced for 697 (88.7%) of all protocols in 392 (also 88.7%) of all the CNS IRB meetings with stipulations entered in our data source. A notable peroportion of the corrections produced by the IRB can be considered highly repetitive. Our shareable method relied on a minimal manual analysis and provides an intuitive exploration with theoretically unbounded granularity. Finer granularity allowed for the insight that is anticipated to prevent the need for identifying the IRB panel expertise or any human supervision. PMID:26262117

  17. Automated target recognition technique for image segmentation and scene analysis

    Science.gov (United States)

    Baumgart, Chris W.; Ciarcia, Christopher A.

    1994-03-01

    Automated target recognition (ATR) software has been designed to perform image segmentation and scene analysis. Specifically, this software was developed as a package for the Army's Minefield and Reconnaissance and Detector (MIRADOR) program. MIRADOR is an on/off road, remote control, multisensor system designed to detect buried and surface- emplaced metallic and nonmetallic antitank mines. The basic requirements for this ATR software were the following: (1) an ability to separate target objects from the background in low signal-noise conditions; (2) an ability to handle a relatively high dynamic range in imaging light levels; (3) the ability to compensate for or remove light source effects such as shadows; and (4) the ability to identify target objects as mines. The image segmentation and target evaluation was performed using an integrated and parallel processing approach. Three basic techniques (texture analysis, edge enhancement, and contrast enhancement) were used collectively to extract all potential mine target shapes from the basic image. Target evaluation was then performed using a combination of size, geometrical, and fractal characteristics, which resulted in a calculated probability for each target shape. Overall results with this algorithm were quite good, though there is a tradeoff between detection confidence and the number of false alarms. This technology also has applications in the areas of hazardous waste site remediation, archaeology, and law enforcement.

  18. Cepstrum Analysis: An Advanced Technique in Vibration Analysis of Defects in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    M. Satyam

    1994-01-01

    Full Text Available Conventional frequency analysis of machinery vibration is not adequate for accurately finding defects in gears, bearings, and blades where sidebands and harmonics are present; such an approach is also dependent on the transmission path. Cepstrum analysis, on the other hand, accurately identifies harmonic and sideband families and is a better technique for fault diagnosis in gears, bearings, and turbine blades of ships and submarines. The cepstrum represents the global power content of a whole family of harmonics and sidebands, even when more than one family of sidebands is present at the same time. It is also insensitive to transmission path effects, since source and transmission path effects are additive and can be separated in the cepstrum. The concept, the underlying theory, and the measurement and analysis involved in using the technique are briefly outlined. Two cases were taken to demonstrate the advantage of the cepstrum technique over spectrum analysis. An LP compressor was chosen to study the transmission path effects, and a marine gearbox having two sets of sideband families was studied to diagnose the problematic sideband and its severity.
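
    The power cepstrum itself is a short computation: the inverse Fourier transform of the logarithm of the power spectrum, so that a family of sidebands spaced Δf apart collapses into a single peak at quefrency 1/Δf. A Python sketch with a synthetic amplitude-modulated gear-mesh-like signal (all signal parameters invented):

      import numpy as np

      fs, n = 8192.0, 8192
      t = np.arange(n) / fs

      # Synthetic signal: 400 Hz mesh tone amplitude-modulated at 25 Hz,
      # producing sidebands at 400 +/- k*25 Hz
      x = (1 + 0.5 * np.cos(2 * np.pi * 25 * t)) * np.sin(2 * np.pi * 400 * t)
      x += 0.05 * np.random.default_rng(0).normal(size=n)

      spectrum = np.abs(np.fft.rfft(x * np.hanning(n))) ** 2
      cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))

      quefrency = np.arange(cepstrum.size) / fs
      # Look for the rahmonic near 1/25 Hz = 40 ms
      lo, hi = int(0.030 * fs), int(0.050 * fs)
      peak = lo + np.argmax(cepstrum[lo:hi])
      print(f"sideband family spacing ~ {1.0 / quefrency[peak]:.1f} Hz")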

  19. Medical Image Analysis Using Unsupervised and Supervised Classification Techniques

    OpenAIRE

    Prof. V. Joseph Peter; Dr. M. Karnan

    2013-01-01

    The evolution of digital computers as well as the development of modern theories for learning and information processing leads to the emergence of Computational Intelligence (CI) engineering. Liver surgery remains a difficult challenge in which preoperative data analysis and strategy definition may play a significant role in the success of the procedure. Extraction of liver fibrosis is done using image enhancement techniques using various filtering techniques, unsupervised clustering techniqu...

  20. COMPARATIVE STUDY OF CLUSTERING TECHNIQUES IN MULTIVARIATE DATA ANALYSIS

    OpenAIRE

    Sabba Ruhi; Md. Shamim Reza

    2015-01-01

    At present, clustering techniques are a standard tool in several exploratory pattern-analysis, grouping, decision-making, and machine-learning situations, including data mining, document retrieval, image segmentation, pattern recognition, and the field of artificial intelligence. In this study we have compared five different types of clustering techniques, such as Fuzzy clustering, K-Means clustering, Hierarchical...

  1. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  2. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The effici...

  3. Differentially expressed genes in pancreatic ductal adenocarcinomas identified through serial analysis of gene expression

    DEFF Research Database (Denmark)

    Hustinx, Steven R; Cao, Dengfeng; Maitra, Anirban;

    2004-01-01

    Serial analysis of gene expression (SAGE) is a powerful tool for the discovery of novel tumor markers. The publicly available online SAGE libraries of normal and neoplastic tissues (http://www.ncbi.nlm.nih.gov/SAGE/) have recently been expanded; in addition, a more complete annotation of the human...... genome and better biocomputational techniques have substantially improved the assignment of differentially expressed SAGE "tags" to human genes. These improvements have provided us with an opportunity to re-evaluate global gene expression in pancreatic cancer using existing SAGE libraries. SAGE libraries...... generated from six pancreatic cancers were compared to SAGE libraries generated from 11 non-neoplastic tissues. Compared to normal tissue libraries, we identified 453 SAGE tags as differentially expressed in pancreatic cancer, including 395 that mapped to known genes and 58 "uncharacterized" tags. Of the...

  4. Confocal Raman data analysis enables identifying apoptosis of MCF-7 cells caused by anticancer drug paclitaxel

    Science.gov (United States)

    Salehi, Hamideh; Middendorp, Elodie; Panayotov, Ivan; Dutilleul, Pierre-Yves Collard; Vegh, Attila-Gergely; Ramakrishnan, Sathish; Gergely, Csilla; Cuisinier, Frederic

    2013-05-01

    Confocal Raman microscopy is a noninvasive, label-free imaging technique used to study the apoptosis of live MCF-7 cells. The images are based on Raman spectra of cell components, and apoptosis is monitored through the diffusion of cytochrome c in the cytoplasm. K-means clustering is used to identify mitochondria in the cells, and correlation analysis provides the cytochrome c distribution inside the cells. Our results demonstrate that incubation of cells for 3 h with 10 μM paclitaxel does not induce apoptosis in MCF-7 cells. On the contrary, incubation for 30 min at a higher concentration (100 μM) of paclitaxel induces gradual release of cytochrome c into the cytoplasm, indicating cell apoptosis via a caspase-independent pathway.
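
    The clustering step can be sketched by grouping per-pixel spectra with K-means so that one cluster collects the mitochondria-like spectra. In the Python sketch below, synthetic two-component spectra stand in for the real hyperspectral data:

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(7)
      n_pixels, n_wavenumbers = 400, 300

      # Two synthetic spectral profiles (e.g., "cytoplasm" and "mitochondria")
      grid = np.arange(n_wavenumbers)
      base_a = np.exp(-0.5 * ((grid - 100) / 15.0) ** 2)
      base_b = np.exp(-0.5 * ((grid - 200) / 15.0) ** 2)

      membership = rng.random(n_pixels) < 0.3
      spectra = np.where(membership[:, None], base_b, base_a)
      spectra = spectra + 0.05 * rng.normal(size=spectra.shape)

      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
      print(np.bincount(labels))  # cluster sizes ~ match the 30/70 split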

  5. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  6. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    OpenAIRE

    Mohammed naved Khan

    2013-01-01

    Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the services quality literature. Collectively, the extant literature suggests that there is a growing trend among re...

  7. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plan. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages are in completeness and complexity.

  8. Analysis of Maize Crop Leaf using Multivariate Image Analysis for Identifying Soil Deficiency

    Directory of Open Access Journals (Sweden)

    S. Sridevy

    2014-11-01

    Full Text Available Image processing analysis for soil deficiency identification has become an active area of research. The changes in the color of the leaves are used to analyze and identify the deficiency of soil nutrients such as Nitrogen (N), Phosphorus (P) and Potassium (K) by digital color image analysis. This research study focuses on the image analysis of the maize crop leaf using multivariate image analysis. In this proposed novel approach, a color transformation of the input RGB image is first performed, converting RGB to HSV, because RGB is ideal for color generation but HSV is better suited to color perception. Then green pixels are masked and removed using a specific threshold value obtained by applying histogram equalization. This masking is done through a customized filtering approach which exclusively filters out the green color of the leaf. After the filtering step, only the deficiency part of the leaf is taken into consideration. Then, a histogram is generated for the deficiency part of the leaf, and a Multivariate Image Analysis approach using Independent Component Analysis (ICA) is carried out to extract a reference eigenspace from a matrix built by unfolding the color data from the deficiency part. Test images are also unfolded and projected onto the reference eigenspace, and the result is a score matrix which is used to compute nutrient deficiency based on the T² statistic. In addition, a multi-resolution scheme based on a scaling-down process is used to speed up the processing. Finally, based on the training samples, the soil deficiency is identified from the color of the maize crop leaf.
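
    A hedged sketch of the pipeline's last stages, green-pixel masking in HSV followed by ICA projection and a T² score, using scikit-learn's FastICA; the HSV thresholds and the random stand-in images are assumptions for illustration, not the paper's values:

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv
from sklearn.decomposition import FastICA

def deficiency_pixels(rgb):
    """Drop healthy green pixels (hue/saturation thresholds are illustrative)."""
    hsv = rgb_to_hsv(rgb)
    green = (hsv[..., 0] > 0.17) & (hsv[..., 0] < 0.45) & (hsv[..., 1] > 0.25)
    return rgb[~green]                   # unfolded non-green color data (n x 3)

def t2_scores(train_pixels, test_pixels):
    """Project test pixels onto a reference ICA eigenspace and compute T^2."""
    ica = FastICA(n_components=2, random_state=0)
    ref_scores = ica.fit_transform(train_pixels)
    cov_inv = np.linalg.inv(np.cov(ref_scores, rowvar=False))
    scores = ica.transform(test_pixels) - ref_scores.mean(axis=0)
    return np.einsum('ij,jk,ik->i', scores, cov_inv, scores)

rng = np.random.default_rng(2)
train, test = rng.random((100, 100, 3)), rng.random((100, 100, 3))  # stand-ins
t2 = t2_scores(deficiency_pixels(train), deficiency_pixels(test))
```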

  9. Study of the aging processes in polyurethane adhesives using thermal treatment and differential calorimetric, dielectric, and mechanical techniques ; 1, identifying the aging processes ; 2, quantifying the aging effect

    CERN Document Server

    Althouse, L P

    1979-01-01

    Study of the aging processes in polyurethane adhesives using thermal treatment and differential calorimetric, dielectric, and mechanical techniques ; 1, identifying the aging processes ; 2, quantifying the aging effect

  10. Analysis of Parametric & Non Parametric Classifiers for Classification Technique using WEKA

    Directory of Open Access Journals (Sweden)

    Yugal kumar

    2012-07-01

    Full Text Available In the field of Machine Learning & Data Mining, a lot of work has been done to construct new classification techniques/classifiers, and much research is ongoing to construct further new classifiers with the help of nature-inspired techniques such as Genetic Algorithms, Ant Colony Optimization, Bee Colony Optimization, Neural Networks, Particle Swarm Optimization, etc. Many researchers have provided comparative studies/analyses of classification techniques. But this paper deals with another form of analysis of classification techniques, i.e., analysis of parametric and non-parametric classifiers. This paper identifies parametric and non-parametric classifiers that are used in the classification process and provides a tree representation of these classifiers. For the analysis, four classifiers are used, of which two are parametric and the rest are non-parametric in nature.
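
    The same parametric versus non-parametric comparison can be reproduced outside WEKA; a small scikit-learn sketch (substituted here for WEKA, with the classifier choices and the dataset as illustrative assumptions):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB            # parametric
from sklearn.linear_model import LogisticRegression   # parametric
from sklearn.neighbors import KNeighborsClassifier    # non-parametric
from sklearn.tree import DecisionTreeClassifier       # non-parametric

X, y = load_iris(return_X_y=True)
classifiers = {
    "naive Bayes (parametric)": GaussianNB(),
    "logistic regression (parametric)": LogisticRegression(max_iter=500),
    "k-NN (non-parametric)": KNeighborsClassifier(),
    "decision tree (non-parametric)": DecisionTreeClassifier(random_state=0),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=10).mean()   # 10-fold cross-validation
    print(f"{name:35s} accuracy = {acc:.3f}")
```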

  11. Methylation Linear Discriminant Analysis (MLDA) for identifying differentially methylated CpG islands

    Directory of Open Access Journals (Sweden)

    Vass J Keith

    2008-08-01

    Full Text Available Abstract Background Hypermethylation of promoter CpG islands is strongly correlated to transcriptional gene silencing and epigenetic maintenance of the silenced state. As well as its role in tumor development, CpG island methylation contributes to the acquisition of resistance to chemotherapy. Differential Methylation Hybridisation (DMH) is one technique used for genome-wide DNA methylation analysis. The study of such microarray data sets should ideally account for the specific biological features of DNA methylation and the non-symmetrical distribution of the ratios of unmethylated and methylated sequences hybridised on the array. We have therefore developed a novel algorithm tailored to this type of data, Methylation Linear Discriminant Analysis (MLDA). Results MLDA was programmed in R (version 2.7.0) and the package is available at CRAN 1. This approach utilizes linear regression models of non-normalised hybridisation data to define methylation status. Log-transformed signal intensities of unmethylated controls on the microarray are used as a reference. The signal intensities of DNA samples digested with methylation-sensitive restriction enzymes and mock-digested are then transformed to the likelihood of a locus being methylated using this reference. We tested the ability of MLDA to identify loci differentially methylated as analysed by DMH between cisplatin-sensitive and -resistant ovarian cancer cell lines. MLDA identified 115 differentially methylated loci, and 23 out of 26 of these loci have been independently validated by Methylation Specific PCR and/or bisulphite pyrosequencing. Conclusion MLDA has advantages for analyzing methylation data from CpG island microarrays, since there is a clear rationale for the definition of methylation status, it uses DMH data without between-group normalisation and is less influenced by cross-hybridisation of loci. The MLDA algorithm successfully identified differentially methylated loci between two classes of...
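
    A loose sketch of the regression-to-reference idea described above (not the published MLDA code; the control/sample intensities and the z-score cut-off are invented): unmethylated controls define the expected digested-versus-mock relationship, and loci whose digested signal sits well above that line, i.e. loci that resisted methylation-sensitive digestion, are called methylated.

```python
import numpy as np

def methylation_calls(ctrl_mock, ctrl_digest, mock, digest, z_cut=2.0):
    """Flag loci whose digested signal exceeds the unmethylated-control trend."""
    slope, intercept = np.polyfit(np.log2(ctrl_mock), np.log2(ctrl_digest), 1)
    resid_ctrl = np.log2(ctrl_digest) - (slope * np.log2(ctrl_mock) + intercept)
    resid = np.log2(digest) - (slope * np.log2(mock) + intercept)
    z = (resid - resid_ctrl.mean()) / resid_ctrl.std()
    return z > z_cut          # boolean methylation call per locus

rng = np.random.default_rng(7)
ctrl_mock = rng.uniform(500, 5000, 50)
ctrl_digest = 0.2 * ctrl_mock * rng.uniform(0.8, 1.2, 50)   # controls get cut
mock = rng.uniform(500, 5000, 200)
digest = mock * np.where(rng.random(200) < 0.1, 0.9, 0.2)   # ~10% methylated
calls = methylation_calls(ctrl_mock, ctrl_digest, mock, digest)
```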

  12. Comparative Analysis of Techniques to Purify Plasma Membrane Proteins

    OpenAIRE

    Weekes, Michael P.; Antrobus, Robin; Lill, Jennie R.; Duncan, Lidia M; Hör, Simon; Lehner, Paul J.

    2010-01-01

    The aim of this project was to identify the best method for the enrichment of plasma membrane (PM) proteins for proteomics experiments. Following tryptic digestion and extended liquid chromatography-tandem mass spectrometry acquisitions, data were processed using MaxQuant and Gene Ontology (GO) terms used to determine protein subcellular localization. The following techniques were examined for the total number and percentage purity of PM proteins identified: (a) whole cell lysate (total numbe...

  13. Hyphenated techniques and their applications in natural products analysis.

    Science.gov (United States)

    Sarker, Satyajit D; Nahar, Lutfun

    2012-01-01

    A technique in which a separation technique is coupled with an online spectroscopic detection technology is known as a hyphenated technique, e.g., GC-MS, LC-PDA, LC-MS, LC-FTIR, LC-NMR, LC-NMR-MS, and CE-MS. Recent advances in hyphenated analytical techniques have remarkably widened their applications to the analysis of complex biomaterials, especially natural products. This chapter focuses on the applications of hyphenated techniques to pre-isolation and isolation of natural products, dereplication, online partial identification of compounds, chemotaxonomic studies, chemical fingerprinting, quality control of herbal products, and metabolomic studies, and presents specific examples. A particular emphasis is given to the hyphenated techniques that involve LC as the separation tool. PMID:22367902

  14. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  15. Basic Sequence Analysis Techniques for Use with Audit Trail Data

    Science.gov (United States)

    Judd, Terry; Kennedy, Gregor

    2008-01-01

    Audit trail analysis can provide valuable insights to researchers and evaluators interested in comparing and contrasting designers' expectations of use and students' actual patterns of use of educational technology environments (ETEs). Sequence analysis techniques are particularly effective but have been neglected to some extent because of real…
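
    One elementary sequence analysis step is a first-order transition matrix over audit-trail events; a minimal sketch (the event trail below is hypothetical):

```python
import numpy as np

def transition_matrix(events, states):
    """First-order transition probabilities between audit-trail events."""
    idx = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(events, events[1:]):
        counts[idx[a], idx[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Hypothetical trail of pages visited in an educational technology environment
trail = ["menu", "content", "quiz", "content", "quiz", "menu", "content"]
P = transition_matrix(trail, states=["menu", "content", "quiz"])
```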

  16. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  17. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletics practice has proved that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté, and its execution rests on highly refined technique during the performer's rotation. Performing this element requires not only good physical condition but also the dancer's command of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The method of stereoscopic imaging, together with theoretical analysis, was used for the analysis.

  18. A COMPARISON OF STEPWISE AND FUZZY MULTIPLE REGRESSION ANALYSIS TECHNIQUES FOR MANAGING SOFTWARE PROJECT RISKS: ANALYSIS PHASE

    OpenAIRE

    Abdelrafe Elzamly; Burairah Hussin

    2014-01-01

    Risk is not always avoidable, but it is controllable. The aim of this study is to identify whether these techniques are effective in reducing software failure. This motivates the authors to continue the effort to enrich the management of software project risks by considering mining and quantitative approaches with large data sets. In this study, two new techniques are introduced, namely stepwise multiple regression analysis and fuzzy multiple regression, to manage software risks. Two evaluation proc...

  19. Urine metabolomic analysis identifies potential biomarkers and pathogenic pathways in kidney cancer.

    Science.gov (United States)

    Kim, Kyoungmi; Taylor, Sandra L; Ganti, Sheila; Guo, Lining; Osier, Michael V; Weiss, Robert H

    2011-05-01

    Kidney cancer is the seventh most common cancer in the Western world, its incidence is increasing, and it is frequently metastatic at presentation, at which stage patient survival statistics are grim. In addition, there are no useful biofluid markers for this disease, such that diagnosis is dependent on imaging techniques that are not generally used for screening. In the present study, we use metabolomics techniques to identify metabolites in kidney cancer patients' urine, which appear at different levels (when normalized to account for urine volume and concentration) from the same metabolites in nonkidney cancer patients. We found that quinolinate, 4-hydroxybenzoate, and gentisate are differentially expressed at a false discovery rate of 0.26, and these metabolites are involved in common pathways of specific amino acid and energetic metabolism, consistent with high tumor protein breakdown and utilization, and the Warburg effect. When added to four different (three kidney cancer-derived and one "normal") cell lines, several of the significantly altered metabolites, quinolinate, α-ketoglutarate, and gentisate, showed increased or unchanged cell proliferation that was cell line-dependent. Further evaluation of the global metabolomics analysis, as well as confirmation of the specific potential biomarkers using a larger sample size, will lead to new avenues of kidney cancer diagnosis and therapy. PMID:21348635
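
    The false discovery rate quoted above is the usual way multiple metabolite comparisons are controlled; a minimal Benjamini-Hochberg sketch (the p-values are invented, and the 0.26 threshold simply mirrors the abstract):

```python
import numpy as np

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values for false discovery rate control."""
    p = np.asarray(pvals, dtype=float)
    n = len(p)
    order = np.argsort(p)
    ranked = p[order] * n / (np.arange(n) + 1)
    q = np.minimum.accumulate(ranked[::-1])[::-1]   # enforce monotonicity
    out = np.empty(n)
    out[order] = np.minimum(q, 1.0)
    return out

# Invented per-metabolite p-values from case/control comparisons
q = benjamini_hochberg([0.001, 0.009, 0.04, 0.12, 0.5])
significant = q < 0.26
```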

  20. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses the fuzzy multiple regression analysis techniques with fuzzy concepts to manage the software risks in a software project and mitigating risk with software process improvement. Top ten software risk factors in analysis phase and thirty risk management techni...

  1. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Two brazing alloy samples (C P2 and C P3) have been investigated by the Neutron Activation Analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10¹¹ n/cm²/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10¹² n/cm²/s, and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab
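
    The quantitative basis of NAA is the activation equation A = N φ σ (1 − e^(−λ t_irr)); a small numeric sketch (the cobalt mass and cross-section are illustrative assumptions; only the flux and irradiation time are taken from the record above):

```python
import numpy as np

N_A = 6.022e23  # Avogadro's number

def induced_activity(mass_g, molar_mass, abundance, sigma_cm2, flux, t_irr, t_half):
    """Activity (Bq) at end of irradiation: A = N * phi * sigma * (1 - exp(-lambda*t))."""
    lam = np.log(2) / t_half
    n_atoms = mass_g / molar_mass * N_A * abundance
    return n_atoms * flux * sigma_cm2 * (1.0 - np.exp(-lam * t_irr))

# Illustrative: 1 mg of Co-59 (sigma ~ 37 b) irradiated for 48 h at the
# core-periphery flux quoted above; Co-60 half-life ~ 5.27 y
A = induced_activity(mass_g=1e-3, molar_mass=58.93, abundance=1.0,
                     sigma_cm2=37e-24, flux=3.34e12,
                     t_irr=48 * 3600, t_half=5.27 * 365.25 * 24 * 3600)
```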

  2. Clinical education and training: Using the nominal group technique in research with radiographers to identify factors affecting quality and capacity

    International Nuclear Information System (INIS)

    There are a number of group-based research techniques available to determine the views or perceptions of individuals in relation to specific topics. This paper reports on one method, the nominal group technique (NGT), which was used to collect the views of important stakeholders on the factors affecting the quality of, and capacity to provide, clinical education and training in diagnostic imaging and radiotherapy and oncology departments in the UK. Inclusion criteria were devised to recruit learners, educators, practitioners and service managers to the nominal groups. Eight regional groups comprising a total of 92 individuals were enrolled; the numbers in each group varied between 9 and 13. A total of 131 items (factors) were generated across the groups (mean = 16.4). Each group was then asked to select the top three factors from their original list. Consensus on the important factors amongst groups found that all eight groups agreed on one item: staff attitude, motivation and commitment to learners. The 131 items were organised into themes using content analysis; five main categories and a number of subcategories emerged. The study concluded that the NGT provided data which were congruent with the issues faced by practitioners and learners in their daily work; this is of vital importance if the findings are to be regarded with credibility. Further advantages and limitations of the method are discussed; however, it is argued that the NGT is a useful technique to gather relevant opinion, to select priorities and to reach consensus on a wide range of issues

  3. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Thomas, Sanjeev V.; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inferences from their data. Improper application of study design and data analysis may render insufficient and improper results and conclusions. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design, and then back-translating the statistical results into relevant medical knowledge, is a real challenge. This article explains...

  4. Microarray Analysis Techniques Singular Value Decomposition and Principal Component Analysis

    CERN Document Server

    Wall, Michael E.; Rechtsteiner, Andreas; Rocha, Luis M.

    2002-01-01

    This chapter describes gene expression analysis by Singular Value Decomposition (SVD), emphasizing initial characterization of the data. We describe SVD methods for visualization of gene expression data, representation of the data using a smaller number of variables, and detection of patterns in noisy gene expression data. In addition, we describe the precise relation between SVD analysis and Principal Component Analysis (PCA) when PCA is calculated using the covariance matrix, enabling our descriptions to apply equally well to either method. Our aim is to provide definitions, interpretations, examples, and references that will serve as resources for understanding and extending the application of SVD and PCA to gene expression analysis.
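
    The SVD-PCA relation stated above can be checked in a few lines; a sketch on a random stand-in expression matrix: the singular values of the column-centered matrix reproduce the covariance eigenvalues via s²/(n − 1).

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.random((20, 8))        # stand-in expression matrix (samples x genes)
Xc = X - X.mean(axis=0)        # center each column

# PCA via the covariance matrix
eigvals = np.linalg.eigvalsh(Xc.T @ Xc / (X.shape[0] - 1))

# SVD of the centered matrix gives the same spectrum: s^2 / (n - 1)
s = np.linalg.svd(Xc, compute_uv=False)
assert np.allclose(np.sort(s**2 / (X.shape[0] - 1)), np.sort(eigvals))
```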

  5. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the...

  6. What's down below? Current and potential future applications of geophysical techniques to identify subsurface permafrost conditions (Invited)

    Science.gov (United States)

    Douglas, T. A.; Bjella, K.; Campbell, S. W.

    2013-12-01

    For infrastructure design, operations, and maintenance requirements in the North, the ability to accurately and efficiently detect the presence (or absence) of ground ice in permafrost terrains is a serious challenge. Ground ice features, including ice wedges, thermokarst cave ice, and segregation ice, are present at a variety of spatial scales and patterns. Currently, most engineering applications use borehole logging and sampling to extrapolate conditions at the point scale. However, there is a high risk of over- or underestimating the presence of frozen or unfrozen features when relying on borehole information alone. In addition, boreholes are costly, especially for planning linear structures like roads or runways. Predicted climate warming will provide further challenges for infrastructure development and transportation operations where permafrost degradation occurs. Accurately identifying the subsurface character of permafrost terrains will allow engineers and planners to cost-effectively create novel infrastructure designs to withstand the changing environment. There is thus a great need for a low-cost, rapidly deployable, spatially extensive means of 'measuring' subsurface conditions. Geophysical measurements, both terrestrial and airborne, have strong potential to revolutionize our way of mapping subsurface conditions. Many studies in continuous and discontinuous permafrost have used geophysical measurements to identify discrete features and repeatable patterns in the subsurface. The most common measurements include galvanic and capacitively coupled resistivity, ground penetrating radar, and multi-frequency electromagnetic induction techniques. Each of these measurements has strengths, weaknesses, and limitations. By combining horizontal geophysical measurements, downhole geophysics, multispectral remote sensing images, LiDAR measurements, and soil and vegetation mapping we can start to assemble a holistic view of how surface conditions and standoff measurements

  7. The analysis of training needs: Methods and techniques

    OpenAIRE

    V. Carbone

    2002-01-01

    After placing the phenomenon within its conceptual framework, the following section will present the development of policies to anticipate training needs in European countries. The next stage will consist in interpreting needs analysis as a planning tool to aid those involved in training and it is with reference to this definition that the most widely used techniques for TNA are addressed, examining the factors that determine the choice of the proper technique. Subsequently, attention will be...

  8. Targeting Ion Beam Analysis techniques for gold artefacts

    OpenAIRE

    Demortier, Guy

    2012-01-01

    The present study discusses the best experimental conditions for the quantitative analysis of gold jewellery artefacts by ion beam techniques (PIXE, RBS, PIGE and NRA). Special attention is given to the detection of enhancement or depletion below the surface, down to 10 microns, without any sampling or destruction. PIXE is certainly the most interesting technique for this purpose and the optimal geometrical arrangement of the experiment is described: orientation of the incident beam relative ...

  9. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques with special emphasis and a brief on other techniques developed world over for mitigating earthquake forces on the structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  10. An integrated technique for the analysis of skin bite marks.

    Science.gov (United States)

    Bernitz, Herman; Owen, Johanna H; van Heerden, Willie F P; Solheim, Tore

    2008-01-01

    The high number of murder, rape, and child abuse cases in South Africa has led to increased numbers of bite mark cases being heard in high courts. Objective analysis to match perpetrators to bite marks at crime scenes must be able to withstand vigorous cross-examination to be of value in conviction of perpetrators. An analysis technique is described in four stages, namely determination of the mark to be a human bite mark, pattern association analysis, metric analysis and comparison with the population data, and illustrated by a real case study. New and accepted techniques are combined to determine the likelihood ratio of guilt expressed as one of a range of conclusions described in the paper. Each stage of the analysis adds to the confirmation (or rejection) of concordance between the dental features present on the victim and the dentition of the suspect. The results illustrate identification to a high degree of certainty. PMID:18279256

  11. Dietary separation of sympatric carnivores identified by molecular analysis of scats.

    Science.gov (United States)

    Farrell, L E; Roman, J; Sunquist, M E

    2000-10-01

    We studied the diets of four sympatric carnivores in the flooding savannas of western Venezuela by analysing predator DNA and prey remains in faeces. DNA was isolated and a portion of the cytochrome b gene of the mitochondrial genome amplified and sequenced from 20 of 34 scats. Species were diagnosed by comparing the resulting sequences to reference sequences generated from the blood of puma (Puma concolor), jaguar (Panthera onca), ocelot (Leopardus pardalus) and crab-eating fox (Cerdocyon thous). Scat size has previously been used to identify predators, but DNA data show that puma and jaguar scats overlap in size, as do those of puma, ocelot and fox. Prey-content analysis suggests minimal prey partitioning between pumas and jaguars. In field testing this technique for large carnivores, two potential limitations emerged: locating intact faecal samples and recovering DNA sequences from samples obtained in the wet season. Nonetheless, this study illustrates the tremendous potential of DNA faecal studies. The presence of domestic dog (Canis familiaris) in one puma scat and of wild pig (Sus scrofa), set as bait, in one jaguar sample exemplifies the forensic possibilities of this noninvasive analysis. In addition to defining the dietary habits of similar size sympatric mammals, DNA identifications from faeces allow wildlife managers to detect the presence of endangered taxa and manage prey for their conservation. PMID:11050553

  12. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting and the mold at the same time, considering the thermal contraction of the casting and the thermal expansion of the mold. An analysis considering contact between the casting and the mold enables precise prediction of the stress distribution and of defects such as hot tearing. But it is difficult to generate an FEM mesh for the interface of the casting and the mold. Moreover, the mesh for the mold domain consumes a great deal of computational time and memory due to the large number of elements. Consequently, we proposed the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to model the contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting, so the proposed technique greatly decreases the number of mesh elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and gave satisfactory results.

  13. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
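
    The age equation quoted in the record reduces to a one-line calculation once the elemental analyses have been converted to an annual dose rate; a worked sketch with invented values:

```python
# Luminescence age = accumulated dose / annual dose rate. The annual dose
# rate is assembled from K, U and Th concentrations measured by nuclear
# techniques; both numbers below are invented for illustration.
palaeodose_gy = 45.0           # Gy, from the luminescence measurement
dose_rate_gy_per_ka = 3.2      # Gy per thousand years, from elemental analyses
age_ka = palaeodose_gy / dose_rate_gy_per_ka
print(f"age = {age_ka:.1f} ka")  # -> age = 14.1 ka
```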

  14. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  15. Dynamic analysis of large structures by modal synthesis techniques.

    Science.gov (United States)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures which involve division into substructures or components are examined. These techniques make use of component displacement modes to synthetize global systems of generalized coordinates and, for that reason, they have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful - i.e., the modal synthesis method with fixed attachment modes, and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.

  16. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues
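
    A hedged sketch of the radial-density measurement described above, normalized spatial density of bright features as a function of distance from the nuclear perimeter to the center, on a toy disk-shaped nucleus (masks and feature density are invented):

```python
import numpy as np
from scipy import ndimage

def radial_density(nucleus_mask, feature_mask, n_bins=10):
    """Normalized bright-feature density from perimeter (bin 0) to center."""
    dist = ndimage.distance_transform_edt(nucleus_mask)
    dist = dist / dist.max()                 # 0 at perimeter, 1 at center
    bins = np.minimum((dist * n_bins).astype(int), n_bins - 1)
    density = np.zeros(n_bins)
    for b in range(n_bins):
        shell = (bins == b) & nucleus_mask
        if shell.any():
            density[b] = feature_mask[shell].mean()
    return density / density.sum() if density.sum() else density

# Toy example: a disk-shaped nucleus with sparse random "bright features"
yy, xx = np.mgrid[:100, :100]
nucleus = (yy - 50) ** 2 + (xx - 50) ** 2 < 40 ** 2
rng = np.random.default_rng(4)
features = rng.random((100, 100)) > 0.95
profile = radial_density(nucleus, features)
```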

  17. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.

  18. Differential analysis of ovarian and endometrial cancers identifies a methylator phenotype.

    Directory of Open Access Journals (Sweden)

    Diana L Kolbe

    Full Text Available Despite improved outcomes in the past 30 years, less than half of all women diagnosed with epithelial ovarian cancer live five years beyond their diagnosis. Although typically treated as a single disease, epithelial ovarian cancer includes several distinct histological subtypes, such as papillary serous and endometrioid carcinomas. To address whether the morphological differences seen in these carcinomas represent distinct characteristics at the molecular level we analyzed DNA methylation patterns in 11 papillary serous tumors, 9 endometrioid ovarian tumors, 4 normal fallopian tube samples and 6 normal endometrial tissues, plus 8 normal fallopian tube and 4 serous samples from TCGA. For comparison within the endometrioid subtype we added 6 primary uterine endometrioid tumors and 5 endometrioid metastases from uterus to ovary. Data was obtained from 27,578 CpG dinucleotides occurring in or near promoter regions of 14,495 genes. We identified 36 locations with significant increases or decreases in methylation in comparisons of serous tumors and normal fallopian tube samples. Moreover, unsupervised clustering techniques applied to all samples showed three major profiles comprising mostly normal samples, serous tumors, and endometrioid tumors including ovarian, uterine and metastatic origins. The clustering analysis identified 60 differentially methylated sites between the serous group and the normal group. An unrelated set of 25 serous tumors validated the reproducibility of the methylation patterns. In contrast, >1,000 genes were differentially methylated between endometrioid tumors and normal samples. This finding is consistent with a generalized regulatory disruption caused by a methylator phenotype. Through DNA methylation analyses we have identified genes with known roles in ovarian carcinoma etiology, whereas pathway analyses provided biological insight to the role of novel genes. Our finding of differences between serous and endometrioid

  19. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE concerning the determination of high-molecular compounds like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also, methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  20. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 1: Review and Comparison of Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-03-24

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analyses include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.
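
    The five pattern-detection layers listed above map directly onto standard statistics; a compact sketch on synthetic scatterplot data (the model output below is invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.random(300)                                   # sampled input variable
y = np.sin(3 * x) + 0.3 * rng.standard_normal(300)    # model output

r, _ = stats.pearsonr(x, y)        # (i) linear relationship
rho, _ = stats.spearmanr(x, y)     # (ii) monotonic relationship

# (iii) trend in central tendency: Kruskal-Wallis across five x-bins
bins = np.array_split(np.argsort(x), 5)
_, p_kw = stats.kruskal(*[y[b] for b in bins])

# (iv) trend in variability: interquartile range per x-bin
iqrs = [stats.iqr(y[b]) for b in bins]

# (v) deviation from randomness: chi-square of x-bin vs. y above/below median
above = (y > np.median(y)).astype(int)
table = np.array([[np.sum(above[b] == k) for k in (0, 1)] for b in bins])
chi2, p_chi, _, _ = stats.chi2_contingency(table)
```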

  1. Characterisation of solar cells by ion beam analysis techniques

    International Nuclear Information System (INIS)

    Several ion beam analysis techniques were applied for the characterisation of amorphous (a-Si) and polycrystalline silicon solar cells. The thickness and composition of thin layers in thin-film a-Si cells were analysed by RBS (Rutherford backscattering) using a 5 MeV Li beam and by ERDA (elastic recoil detection analysis) using a 12 MeV C beam. The nuclear microprobe technique IBIC (ion beam induced charge) was used for imaging the charge collection efficiency of EFG (edge-defined film-fed grown) silicon in an attempt to correlate charge loss with the spatial distribution of structural defects in the material. (author)

  2. Micro analysis of dissolved gases by the gas chromatography technique

    International Nuclear Information System (INIS)

    A technique which allows the quantitative analysis of small concentrations of dissolved gases, such as CO2 and H2, in the order of 10⁻⁶-10⁻³ M is discussed. For the extraction, separation and quantification, a Toepler pump was used in tandem with a gas chromatograph. This method can also be applied to the analysis of other gases like CO, CH4, CH3-CH3, etc. The technique may be applied in fields such as radiation chemistry, oceanography and environmental studies. (author)

  3. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  4. VIBRATION ANALYSIS ON A COMPOSITE BEAM TO IDENTIFY DAMAGE AND DAMAGE SEVERITY USING FINITE ELEMENT METHOD

    Directory of Open Access Journals (Sweden)

    E.V.V.Ramanamurthy

    2011-07-01

    Full Text Available The objective of this paper is to develop a damage detection method for a composite cantilever beam with an edge crack, studied using the finite element method. A number of analytical, numerical and experimental techniques are available for the study of damage identification in beams. Studies were carried out for three different types of analysis on a composite cantilever beam with an edge crack as the damage. The material used in this analysis is a glass-epoxy composite. The finite element formulation was carried out in the analysis section of the package known as ANSYS. The types of vibration analysis studied on the composite beam are modal, harmonic and transient analysis. The crack is modeled such that the cantilever beam is replaced by two intact beams with the crack as an additional boundary condition. Damage algorithms are used to identify and locate the damage, and the damage index method is used to find the severity of the damage. The results obtained from modal analysis were compared with the transient analysis results. The vibration-based damage detection methods are based on the fact that changes of physical properties (stiffness, mass and damping) due to damage will manifest themselves as changes in the structural modal parameters (natural frequencies, mode shapes and modal damping). The task is then to monitor selected indicators derived from modal parameters to distinguish between undamaged and damaged states. However, the quantitative changes of global modal parameters are not sufficiently sensitive to a local damage. The proposed approach, on the other hand, interprets the dynamic changes caused by damage in a different way. Although the basis for vibration-based damage detection appears intuitive, the implementation in real structures may encounter many significant challenges. The most fundamental issue is the fact that damage typically is a local phenomenon and may not dramatically influence the global dynamic response of a

  5. Identifying and Prioritizing Effective Factors in Classifying a Private Bank's Customers by the Delphi Technique and Analytical Hierarchy Process (AHP)

    Directory of Open Access Journals (Sweden)

    S. Khayatmoghadam

    2013-05-01

    Full Text Available The development of the banking industry and the presence of many different financial institutions have intensified competition for customers and their capital: there are about 28 banks and many credit and financial institutions, of which 6 banks are public and 22 are private. Among them, public banks are in a more favorable position than private banks owing to their governmental relations and support, wider geographical coverage and longer history. Lacking these conditions, private banks try to attract customers by drawing on scientific approaches to remedy the situation. Therefore, in this study we decided to review banking customers from a different viewpoint. We initially obtained the desired indicators from the banking viewpoint for two groups of customers, resources (deposits) and uses (facilities), using experts and the Delphi technique; on the resources side these include indicators such as account turnover, average account balance, absence of returned cheques, etc., and on the uses side, the amount of facilities received, the amount of warranties provided, etc. Then, using the Analytical Hierarchy Process (AHP) method and expert opinions gathered through the software Expert Choice 11, the priority of these criteria was determined and the weight of each index was computed. It should be noted that the bank experts associated with this study were line and headquarters staff. The results obtained can be used as input for customer grouping in line with the implementation of CRM techniques.
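
    The AHP weighting step reduces to a principal-eigenvector computation on the pairwise comparison matrix; a minimal sketch (the 3x3 judgments below are hypothetical, not the study's):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights = principal eigenvector of the comparison matrix;
    CI = (lambda_max - n) / (n - 1) measures judgment consistency."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - len(A)) / (len(A) - 1)
    return w, ci

# Hypothetical expert judgments for three deposit-side criteria:
# account turnover vs. average balance vs. absence of returned cheques
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights, ci = ahp_weights(A)
```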

  6. Applications of Geophysical and Geological Techniques to Identify Areas for Detailed Exploration in Black Mesa Basin, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    George, S.; Reeves, T.K.; Sharma, Bijon; Szpakiewicz, M.

    1999-04-29

    A recent report submitted to the U.S. Department of Energy (DOE) (NIPER/BDM-0226) discussed in considerable detail the geology, structure, tectonics, and history of oil production activities in the Black Mesa basin in Arizona. As part of the final phase of wrapping up research in the Black Mesa basin, the results of a few additional geophysical studies of structure, stratigraphy, petrophysical analysis, and oil and gas occurrences in the basin are presented here. A second objective of this study is to determine the effectiveness of relatively inexpensive, noninvasive techniques like gravity and magnetic surveys in obtaining information on structure and tectonics in sufficient detail for hydrocarbon exploration, particularly by using the higher-resolution satellite data now becoming available to the industry.

  7. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2016-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques to explain instrumental effects. The topics are relevant for engineers, scientists, and astroscientists working in the fields of geophysics, chemistry, and the physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  8. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. The major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.

  9. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasing role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  10. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasing role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  11. Detection and analysis of explosives by nuclear techniques

    International Nuclear Information System (INIS)

    In today's global environment of international terrorism, there is a well recognised need for sensitive and specific techniques for the detection and analysis of explosives. Sensitivity is needed for the detection of small amounts of explosives hidden in or mixed with post-explosion residues. Specificity is needed in order to avoid responses to non-explosive substances. The conventional techniques for the detection of explosives, based on vapour detection either by sniffer dogs or by instrumental vapour detectors, have limitations because of their inability to detect plastic-bonded explosives, which have a vapour pressure significantly lower than that of pure explosives. The paper reviews the various nuclear techniques, such as thermal, fast and pulsed fast neutron activation analysis, which are used for the detection of pure explosives, mixtures of explosives, and also aluminized and plastic-bonded explosives. (author)

  12. Review of geographic processing techniques applicable to regional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, R.C.

    1988-02-01

    Since the early 1970s regional environmental studies have been carried out at the Oak Ridge National Laboratory using computer-assisted techniques. This paper presents an overview of some of these past experiences and the capabilities developed at the Laboratory for processing, analyzing, and displaying geographic data. A variety of technologies have resulted such as computer cartography, image processing, spatial modeling, computer graphics, data base management, and geographic information systems. These tools have been used in a wide range of spatial applications involving facility siting, transportation routing, coal resource analysis, environmental impacts, terrain modeling, inventory development, demographic studies, water resource analyses, etc. The report discusses a number of topics dealing with geographic data bases and structures, software and processing techniques, hardware systems, models and analysis tools, data acquisition techniques, and graphical display methods. Numerous results from many different applications are shown to aid the reader interested in using geographic information systems for environmental analyses. 15 refs., 64 figs., 2 tabs.

  13. Analysis of Dynamic Road Traffic Congestion Control (DRTCC) Techniques

    Directory of Open Access Journals (Sweden)

    Pardeep Mittal

    2015-10-01

    Full Text Available Dynamic traffic light control at intersections has become one of the most active research areas in the development of intelligent transportation systems (ITS). Due to the consistent growth of urbanization and traffic congestion, a system is required which can control the timing of traffic lights dynamically, with accurate measurement of the traffic on the road. In this paper, an analysis of the techniques that have been developed to automate traffic lights is carried out. The efficacy of the techniques has been evaluated using MATLAB software. After comparison of the artificial intelligence techniques, it is found that the image mosaicking technique is quite effective (in terms of improving moving time and reducing waiting time) for the control of traffic signals to reduce congestion on the road.

  14. Neutron noise analysis techniques in nuclear power reactors

    International Nuclear Information System (INIS)

    The main techniques used in neutron noise analysis of BWR and PWR nuclear reactors are reviewed. Several applications, such as the control of vibrations in both reactor types, the determination of two-phase flow parameters in BWRs and stability control in BWRs, are discussed in some detail. The paper contains many experimental results obtained by the first author. (author)

  15. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Council. The planned...

  16. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    Science.gov (United States)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.

  17. Identifying a core set of medical informatics serials: an analysis using the MEDLINE database.

    OpenAIRE

    Sittig, D. F.

    1996-01-01

    A study was undertaken to test the hypothesis that a core set of medical informatics serials could be identified by using standard bibliometric techniques. All journal articles indexed by the National Library of Medicine between 1990 and 1994 were included. Articles were identified by using the "MEDICAL INFORMATICS" Medical Subject Heading (MeSH) term. Each serial title containing articles was then ranked according to (1) the total number of medical informatics journal articles indexed and (2...

  18. X-rays and other associated techniques in identifying organic compounds for the production of synthetic graphite

    International Nuclear Information System (INIS)

    Graphite-making organic compounds, such as polynuclear aromatics and high-rank coals, always pass through a liquid or plastic state: a structural transition of optical anisotropy called the carbonaceous mesophase, whose lifetime is limited by its hardening to semicoke. X-ray analysis shows that the interlayer spacing of graphitic carbons decreases with increasing temperature and becomes 3.354 Å, or nearly so, in the graphitization temperature range of 2500 °C to 3000 °C. The sensitive-tint technique of polarized-light microscopy has been found most suitable for studying the initial formation of spherules, their coalescence, and the growth of mosaic texture during the mesophase period. A differential thermal analysis (DTA) trace, having an initial large endotherm with an activation energy of the order of 60 kcal/mole or above, has proved to be another effective tool for detecting graphitizable organic materials and for determining the mesophase intervals. A sharp fall in resistivity with temperature is another indicator of graphitizable organic materials exhibiting semiconducting behaviour. (Author)

  19. Role of nuclear analytical techniques in environmental analysis

    International Nuclear Information System (INIS)

    Nuclear analytical techniques play an important role in the protection of human health from biological, chemical and radiological hazards in the environment. They are highly useful in areas of the environmental sciences such as air pollution, environmental chemistry, environmental management, environmental toxicology, industrial hygiene, marine pollution and water quality. The socio-economic needs of all nations are closely linked to their industrial activities, with the energy sector as the driving force, which must be kept sustainable without unduly constraining nature's ability to meet present and future needs. At the same time, it should be realized that most industrial production options carry some degree of environmental impact in the form of increased concentrations in air, soil, vegetation, surface and ground water resources, marine coastal zones, sediment etc., which inter alia can lead to potential health and environmental risks through inhalation and ingestion pathways. In nuclear facilities, most of the harmful effects have been minimized by control measures like shielding, engineered safety systems, environmental surveillance etc., and remain under check due to the strict implementation of a comprehensive, continuous and stringent regulatory system. The development of nuclear analytical techniques is based on the utilization of certain properties of the nucleus associated with the phenomena of ionizing radiation. Analytical techniques that use nuclear instrumentation are also called nuclear analytical techniques. Nuclear analytical techniques are basically divided into two categories: those based on direct methods and those based on indirect methods. Direct methods include beta counting, gamma spectrometry, alpha spectrometry, emanometry etc., while indirect methods include Instrumental Neutron Activation Analysis, Radiochemical Neutron Activation Analysis, Prompt Gamma Analysis, Charged Particle Activation Analysis

  20. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production.

    Science.gov (United States)

    Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás

    2015-01-01

    Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode coupled to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical data treatment then enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their respective combinations. The obtained prediction models are robust and reliable and are considered statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for enzymatic hydrolysis monitoring. PMID:26378537

  1. Profile likelihood ratio analysis techniques for rare event signals

    CERN Document Server

    Billard, J

    2013-01-01

    The Cryogenic Dark Matter Search (CDMS) II uses crystals operated at millikelvin temperatures to search for dark matter. We present the details of the profile likelihood analysis of the 140.2 kg-day exposure from the final data set of the CDMS II Si detectors that revealed three WIMP-candidate events. We found that this result favors a WIMP+background hypothesis over the known-background-only hypothesis at the 99.81% confidence level. This paper describes the profile likelihood analysis of the CDMS II Si data and discusses such analysis techniques in the scope of rare event searches.
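
    As a toy illustration of the method described above, the Python sketch below profiles a constrained background rate out of a Poisson counting likelihood and forms the likelihood-ratio statistic for the background-only hypothesis. The event count, background estimate and its uncertainty are invented placeholders, not CDMS II numbers.

        import numpy as np
        from scipy.optimize import minimize_scalar

        n_obs = 3                  # observed events (assumed)
        b0, sigma_b = 0.6, 0.2     # background estimate and uncertainty (assumed)

        def nll(mu, b):
            # Poisson(n | mu + b) with a Gaussian constraint on b, up to constants
            lam = mu + b
            return lam - n_obs * np.log(lam) + 0.5 * ((b - b0) / sigma_b) ** 2

        def profiled_nll(mu):
            # minimise over the nuisance parameter b at fixed signal strength mu
            return minimize_scalar(lambda b: nll(mu, b), bounds=(1e-9, 10.0),
                                   method='bounded').fun

        best = minimize_scalar(profiled_nll, bounds=(0.0, 10.0), method='bounded')
        q0 = 2.0 * (profiled_nll(0.0) - best.fun)   # LR statistic vs. mu = 0
        print(f"mu_hat = {best.x:.2f}, q0 = {q0:.2f}, ~{np.sqrt(q0):.1f} sigma")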

  2. Investigation of total reflection X-ray fluorescence analysis technique

    International Nuclear Information System (INIS)

    Total-Reflection X-ray Fluorescence spectrometry (TRXF) is known for its high sensitivity, down to the pg or sub-ppb level. The technique is therefore considered a most competitive tool for trace element analysis. The TRXF technique was investigated in the laboratory, but with a small isotopic X/γ source chosen as the excitation source instead of the usual X-ray tube. The preliminary experiments showed that the condition of total reflection can be established and that the analytical sensitivity of TRXF is higher than that of normal X-ray analysis

  3. RCAUSE – A ROOT CAUSE ANALYSIS MODEL TO IDENTIFY THE ROOT CAUSES OF SOFTWARE REENGINEERING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Er. Anand Rajavat

    2011-01-01

    Full Text Available Organizations that wish to modernize their legacy systems must adopt a financially viable evolution strategy to satisfy the needs of the modern business environment. There are various options available to modernize a legacy system into a more contemporary system. Over the last few years, legacy system reengineering has emerged as a popular system modernization technique. Reengineering generally focuses on increasing the productivity and quality of the system. However, many of these efforts are often less than successful because they concentrate only on the symptoms of software reengineering risks without targeting the root causes of those risks. A subjective assessment (diagnosis) of software reengineering risks from the different domains of a legacy system is required to identify their root causes. The goal of this paper is to highlight the root causes of software reengineering risks. We propose a root cause analysis model, RCause, that classifies the root causes of software reengineering risks into three distinct but connected areas of interest: the system domain, the managerial domain and the technical domain.

  4. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, dairies, fruits...

  5. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the reference of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response at dose values in this interval were investigated and compared. These techniques include thermal pre-treatment and different methods of glow curve analysis. The results showed the necessity of developing specific software that permits automatic background subtraction from the glow curve of each dosemeter. This software was developed and has been tested. Preliminary results showed that the software increases the reproducibility of the response. (author)

  6. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
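
    A minimal sketch of the idea behind the first of these smoothing procedures: each input is LOESS-smoothed against the output, and the variance explained by the smooth serves as a nonparametric sensitivity indicator. The test function, sample size and smoothing fraction are illustrative assumptions.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(0)
        x = rng.uniform(-np.pi, np.pi, size=(500, 3))
        # Ishigami-like test model: strongly nonlinear and non-additive
        y = np.sin(x[:, 0]) + 7 * np.sin(x[:, 1]) ** 2 \
            + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

        for j in range(x.shape[1]):
            smooth = lowess(y, x[:, j], frac=0.4, return_sorted=False)
            r2 = 1.0 - np.var(y - smooth) / np.var(y)
            print(f"input {j}: variance explained by LOESS = {r2:.2f}")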

  7. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    Science.gov (United States)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. The analysis of these phosphorylation cascades will provide new insights into their physiological roles in many biological processes. Unfortunately, existing methods are limited in their ability to analyze cascade activity. We therefore suggest a microfluidic isoelectric focusing technique (μIEF) for the analysis of cascade activity. Using this technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, an inhibition assay for kinase activity and the analysis of a real sample have also been conducted. The results indicate that μIEF is an excellent means for studies on phosphorylation cascade activity.

  8. Nondestructive analysis of oil shales with PGNAA technique

    International Nuclear Information System (INIS)

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by an LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K generally show good linear proportionality of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used, where the response curve was not linear but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improved neutron self-moderation in sample boxes of rich oil shales, as compared to the poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by a microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water

  9. Nondestructive analysis of oil shales with PGNAA technique

    Energy Technology Data Exchange (ETDEWEB)

    Maly, J.; Bozorgmanesh, H.

    1984-02-01

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by an LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K generally show good linear proportionality of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used, where the response curve was not linear but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improved neutron self-moderation in sample boxes of rich oil shales, as compared to the poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by a microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water.

  10. A Review on Clustering and Outlier Analysis Techniques in Datamining

    Directory of Open Access Journals (Sweden)

    S. Koteeswaran

    2012-01-01

    Full Text Available Problem statement: The modern world is based on using physical, biological and social systems more effectively, using advanced computerized techniques. A great amount of data is being generated by such systems; this leads to a paradigm shift from classical modeling and analyses based on basic principles to developing models and the corresponding analyses directly from data. The ability to extract useful hidden knowledge from these data and to act on that knowledge is becoming increasingly important in today's competitive world. Approach: The entire process of applying a computer-based methodology, including new techniques, for discovering knowledge from data is called data mining. There are two primary goals in data mining: prediction and classification. The large data sets involved in data mining require clustering and outlier analysis, both for data reduction and for retaining only the useful data. Results: This study reviews implementation techniques and recent research on clustering and outlier analysis. Conclusion: The study provides a review of clustering and outlier analysis techniques, and the discussion will guide researchers in improving their research direction.
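
    As a small illustration of the workflow the review covers, the sketch below clusters synthetic data with k-means and flags points far from their centroid as outliers; the data and the 3-sigma distance rule are illustrative assumptions.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        data = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
                          rng.normal(6.0, 1.0, (100, 2)),
                          [[20.0, 20.0]]])        # one gross outlier

        km = KMeans(n_clusters=2, n_init=10, random_state=1).fit(data)
        dist = np.linalg.norm(data - km.cluster_centers_[km.labels_], axis=1)
        cut = dist.mean() + 3.0 * dist.std()      # simple 3-sigma distance rule
        print("outlier indices:", np.where(dist > cut)[0])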

  11. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  12. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced X-ray emission and synchrotron-produced X-ray fluorescence, are also briefly discussed

  13. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    OpenAIRE

    Nanhay Singh; Achin Jain; Ram Shringar Raw

    2013-01-01

    Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and taking an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper...

  14. Rain Attenuation Analysis using Synthetic Storm Technique in Malaysia

    Science.gov (United States)

    Lwas, A. K.; Islam, Md R.; Chebil, J.; Habaebi, M. H.; Ismail, A. F.; Zyoud, A.; Dao, H.

    2013-12-01

    A generated rain attenuation time series plays an important role in investigating rain fade characteristics in the absence of real fade measurements. A suitable conversion technique can be applied to a measured rain rate time series to produce rain attenuation data that can be used to understand rain fade characteristics. This paper focuses on the applicability of the synthetic storm technique (SST) for converting measured rain rate data into rain attenuation time series. Its performance is assessed for time series generation over a tropical location, Kuala Lumpur, Malaysia. A preliminary analysis finds that the SST gives satisfactory estimates of the rain attenuation time series from the rain rate measurements over this region.
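
    A rough sketch of the conversion at the heart of the SST, under its usual assumptions: the rain-rate record is advected along the link at a constant storm translation speed v, and specific attenuation follows the power law gamma = k*R^alpha. The rain series, speed, path length and (k, alpha) values below are illustrative placeholders, not the Kuala Lumpur measurements.

        import numpy as np

        def sst_attenuation(rain_rate, dt, v, path_km, k, alpha):
            """Convert a rain-rate series (mm/h, sampled every dt seconds) into
            a path-attenuation series (dB) for a link of length path_km."""
            dx = v * dt / 1000.0                       # km of path swept per sample
            n_path = max(1, int(round(path_km / dx)))  # samples spanning the link
            gamma = k * rain_rate ** alpha             # specific attenuation, dB/km
            kernel = np.full(n_path, dx)               # integrate gamma along the path
            return np.convolve(gamma, kernel, mode='same')

        rng = np.random.default_rng(2)
        rain = rng.gamma(0.3, 20.0, 3600)   # synthetic 1-h event, 1-s samples
        atten = sst_attenuation(rain, dt=1.0, v=10.0, path_km=5.0, k=0.075, alpha=1.1)
        print(f"peak attenuation ~ {atten.max():.1f} dB")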

  15. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structural analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  16. Identifying Patients Who Are Unsuitable for Accelerated Partial Breast Irradiation Using Three-dimensional External Beam Conformal Techniques

    International Nuclear Information System (INIS)

    Purpose: Several recent studies reported that severe late toxicities, including soft-tissue fibrosis and fat necrosis, are present in patients treated with accelerated partial breast irradiation (APBI) and that these toxicities are associated with the large volume of tissue targeted by high-dose irradiation. The present study was performed to clarify which patients are unsuitable for APBI to avoid late severe toxicities. Methods and Materials: Study subjects comprised 50 consecutive patients with Stage 0−II unilateral breast cancer who underwent breast-conserving surgery, and in whom five or six surgical clips were placed during surgery. All patients were subsequently replanned using three-dimensional conformal radiotherapy (3D-CRT) APBI techniques according to the National Surgical Adjuvant Breast and Bowel Project (NSABP) B-39 and Radiation Therapy Oncology Group (RTOG) 0413 protocol. The beam arrangements included mainly noncoplanar four- or five-field beams using 6-MV photons alone. Results: Dose-volume histogram (DVH) constraints for normal tissues according to the NSABP/RTOG protocol were satisfied in 39 patients (78%). Multivariate analysis revealed that only a long craniocaudal clip distance (CCD) was correlated with nonoptimal DVH constraints (p = 0.02), whereas pathological T stage, anteroposterior clip distance (APD), side of the ipsilateral breast (IB) (right/left), location of the tumor (medial/lateral), and IB reference volume were not. DVH constraints were satisfied in 20% of patients with a long CCD (≥5.5 cm) and in 92% of those with a short CCD. The mean IB-V50 of all patients was 49.0% (range, 31.4–68.6). Multivariate analysis revealed that only a long CCD was correlated with a large IB-V50.

  17. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  18. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Charlton, William S

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
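
    The Bayesian comparison step can be caricatured in a few lines: a measured isotopic ratio is matched against a precomputed ratio-versus-burnup curve under Gaussian measurement noise, yielding a posterior over burnup. The database curve, measurement and noise level below are invented placeholders, not Monteburns output.

        import numpy as np

        burnup = np.linspace(1.0, 50.0, 500)     # GWd/tU grid (assumed range)
        db_ratio = 0.2 + 0.012 * burnup          # placeholder database curve
        measured, sigma = 0.50, 0.02             # measured ratio and 1-sigma (assumed)

        log_like = -0.5 * ((measured - db_ratio) / sigma) ** 2
        post = np.exp(log_like - log_like.max())  # flat prior over the grid
        post /= post.sum()

        mean_bu = (burnup * post).sum()           # posterior-mean burnup
        print(f"inferred burnup ~ {mean_bu:.1f} GWd/tU")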

  19. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels

  20. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
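
    The contrast with a deterministic run can be seen in a toy Monte Carlo sketch: uncertain inputs are sampled and pushed through a (drastically simplified) power-capability model, so capability comes out as a distribution rather than a single number. The model and all distributions are illustrative assumptions, not the SPACE model.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000
        efficiency = rng.normal(0.14, 0.005, n)    # array efficiency (assumed)
        area = 2500.0                              # array area, m^2 (fixed)
        insolation = rng.normal(1367.0, 10.0, n)   # W/m^2
        degradation = rng.uniform(0.90, 0.98, n)   # lifetime degradation factor
        losses = rng.normal(0.85, 0.02, n)         # distribution/conversion losses

        power_kw = efficiency * area * insolation * degradation * losses / 1000.0
        p5, p50 = np.percentile(power_kw, [5, 50])
        print(f"median capability {p50:.0f} kW; 5th percentile {p5:.0f} kW")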

  1. Large areas elemental mapping by ion beam analysis techniques

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) computer-controlled XYZ sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the long-range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of samples. Due to its particular characteristics, this is a unique device for multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion-beam analysis with the advantage of automatic rastering.

  2. Neural network technique for identifying prognostic anomalies from low-frequency electromagnetic signals in the Kuril-Kamchatka region

    Science.gov (United States)

    Popova, I.; Rozhnoi, A.; Solovieva, M.; Levin, B.; Chebrov, V.

    2016-03-01

    In this paper, we suggest a technique for forecasting seismic events based on very low and low frequency (VLF and LF) signals, in the 10 to 50 kHz band, using the neural network approach, specifically the error back-propagation method (EBPM). In this method, the solution of the problem has two main stages: training and recognition (forecasting). The training set is constructed from the combined data, including the amplitudes and phases of the VLF/LF signals measured in the monitoring of the Kuril-Kamchatka region and the corresponding parameters of regional seismicity. Training the neural network establishes the internal relationship between the characteristic changes in the VLF/LF signals a few days before a seismic event and the corresponding level of seismicity. The trained neural network is then applied in a prognostic mode for automated detection of the anomalous changes in the signal which are associated with seismic activity exceeding the assumed threshold level. Using several time intervals in 2004, 2005, 2006, and 2007 as examples, we demonstrate the efficiency of the neural network approach in the short-term forecasting of earthquakes with magnitudes M ≥ 5.5 from the nighttime variations in the amplitudes and phases of the LF signals on one radio path. We also discuss the results of the simultaneous analysis of the VLF/LF data measured on two partially overlapping paths, aimed at revealing the correlations between the nighttime variations in the amplitude of the signal and seismic activity.

  3. Pattern recognition and data mining techniques to identify factors in wafer processing and control determining overlay error

    Science.gov (United States)

    Lam, Auguste; Ypma, Alexander; Gatefait, Maxime; Deckers, David; Koopman, Arne; van Haren, Richard; Beltman, Jan

    2015-03-01

    On-product overlay can be improved through the use of context data from the fab and the scanner. Continuous improvements in lithography and processing performance over the past years have resulted in corresponding overlay performance improvements for critical layers. Identification of the remaining factors causing systematic disturbances and inefficiencies will further reduce overlay. By building a context database, mappings between context, fingerprints and alignment & overlay metrology can be learned through techniques from pattern recognition and data mining. We relate structure ('patterns') in the metrology data to relevant contextual factors. Once understood, these factors can be moved to the known effects (e.g. the presence of systematic fingerprints from reticle writing error or lens and reticle heating). Hence, we build up a knowledge base of known effects based on data. Outcomes from such an integral ('holistic') approach to lithography data analysis may be exploited in a model-based predictive overlay controller that combines feedback and feedforward control [1]. In this way, the available measurements from scanner, fab and metrology equipment are combined to reveal opportunities for further overlay improvement which would otherwise go unnoticed.

  4. Path Analysis and Causal Analysis: Variations of a Multivariate Technique of Measuring Student Politics

    Science.gov (United States)

    Braungart, Richard G.

    1975-01-01

    This paper tests a multivariate theory of family status, socialization, and student politics employing two different methodological techniques: (1) the popular path analysis method as compared with (2) a modified causal analysis approach. Results reveal that both techniques appear to be a reliable check on one another. (Editor/PG)

  5. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist;

    2015-01-01

    structure evaluation by assessing the local identifiability characteristics of the parameters. Moreover, such a procedure should be generic to make sure it can be applied independent of the structure of the model. We hereby apply a numerical identifiability approach which is based on the work of Walter...... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring....... In contrast, the practical identifiability analysis revealed that high values of the forward rate parameter Vf led to identifiability problems. These problems were even more pronounced at higher substrate concentrations, which illustrates the importance of a proper experimental design to avoid...
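
    A small numerical sketch in the spirit of such a local identifiability check: output sensitivities to each parameter are computed by finite differences and their collinearity is inspected, since near-parallel sensitivity columns signal poorly identifiable parameters. The rate law below is a generic Michaelis-Menten placeholder, not the exact Shin and Kim (1998) kinetics.

        import numpy as np

        def rate(params, s):
            # generic Michaelis-Menten forward rate (placeholder model)
            vf, km = params
            return vf * s / (km + s)

        s = np.linspace(0.1, 10.0, 50)    # substrate concentrations
        p0 = np.array([1.0, 2.0])         # nominal (Vf, Km)

        # Scaled finite-difference sensitivities, one column per parameter
        S = np.column_stack([
            (rate(p0 + np.eye(2)[j] * 1e-6 * p0[j], s) - rate(p0, s))
            / (1e-6 * p0[j]) * p0[j]
            for j in range(2)
        ])

        corr = np.corrcoef(S[:, 0], S[:, 1])[0, 1]
        print(f"condition number {np.linalg.cond(S):.1f}, "
              f"sensitivity correlation {corr:.3f}")  # |corr| near 1 => poorly identifiable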

  6. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of the models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be met by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
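
    For orientation, the sketch below shows static (Guyan) condensation, the baseline against which the dynamic condensation methods in the book are compared: slave degrees of freedom are eliminated through the stiffness matrix alone. The matrices are a toy example.

        import numpy as np

        K = np.array([[ 2.0, -1.0,  0.0],
                      [-1.0,  2.0, -1.0],
                      [ 0.0, -1.0,  1.0]])   # toy stiffness matrix
        M = np.eye(3)                        # toy mass matrix

        masters, slaves = [0], [1, 2]
        Kms = K[np.ix_(masters, slaves)]
        Kss = K[np.ix_(slaves, slaves)]

        # Transformation u = T u_m, slaves following statically: u_s = -Kss^-1 Ksm u_m
        T = np.vstack([np.eye(len(masters)), -np.linalg.solve(Kss, Kms.T)])
        K_red = T.T @ K @ T
        M_red = T.T @ M @ T
        print("reduced stiffness:", K_red, "reduced mass:", M_red)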

  7. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    The paper presents some results of a case study on 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was done in the frame of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches to identifying component ageing from operational data. Engineering considerations are outside the scope of the present study
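
    One of the simplest parametric checks of this kind is the Laplace trend test, sketched below: under a constant failure rate the statistic is approximately standard normal, and significantly positive values indicate an increasing rate, i.e. ageing. The failure times are invented for illustration.

        import numpy as np
        from scipy.stats import norm

        failure_times = np.array([0.8, 2.1, 3.9, 5.2, 6.0, 6.8, 7.3, 7.9])  # years (assumed)
        T = 8.0                                  # end of the observation window

        n = len(failure_times)
        u = (failure_times.mean() - T / 2) / (T * np.sqrt(1.0 / (12 * n)))
        p = 1 - norm.cdf(u)                      # one-sided p-value for ageing
        print(f"Laplace statistic u = {u:.2f}, p = {p:.3f}")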

  8. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis. PMID:24114889

  9. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to detect the occurrence of any crack-growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to the above phenomena. With various filtering/thresholding techniques, it was found that the original signals were filtered out along with the noise. The wavelet transformation technique is found to be more appropriate for analysing AE signals under such situations, and is used here to de-noise the AE data. The de-noised signal is classified to identify a signature based on the type of phenomenon. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
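
    A minimal sketch of wavelet de-noising as applied to an AE-like burst: decompose, soft-threshold the detail coefficients, reconstruct. The synthetic signal, wavelet family and universal threshold are common illustrative choices, not the settings used in the paper.

        import numpy as np
        import pywt

        rng = np.random.default_rng(4)
        t = np.linspace(0, 1, 2048)
        # AE-like burst: short Gaussian-windowed oscillation
        burst = np.exp(-((t - 0.5) / 0.01) ** 2) * np.sin(2 * np.pi * 150 * t)
        signal = burst + 0.2 * rng.standard_normal(t.size)

        coeffs = pywt.wavedec(signal, 'db4', level=5)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale estimate
        thr = sigma * np.sqrt(2 * np.log(signal.size))      # universal threshold
        coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, 'db4')
        print(f"residual noise std ~ {np.std(denoised - burst):.3f}")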

  10. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  11. Network analysis of translocated Takahe populations to identify disease surveillance targets.

    Science.gov (United States)

    Grange, Zoë L; van Andel, Mary; French, Nigel P; Gartrell, Brett D

    2014-04-01

    Social network analysis is being increasingly used in epidemiology and disease modeling in humans, domestic animals, and wildlife. We investigated this tool for describing a translocation network (an area that allows movement of animals between geographically isolated locations) used for the conservation of an endangered flightless rail, the Takahe (Porphyrio hochstetteri). We collated records of Takahe translocations within New Zealand and used social network principles to describe the connectivity of the translocation network. That is, networks were constructed and analyzed using adjacency matrices with values based on the tie weights between nodes. Five annual network matrices were created using the Takahe data set; each incremental year included the records of previous years. Weights of movements between connected locations were assigned by the number of Takahe moved. We calculated the number of nodes (i(total)) and the number of ties (t(total)) between the nodes. To quantify the small-world character of the networks, we compared the real networks to random graphs of equivalent size, weighting, and node strength. Descriptive analysis of the cumulative annual Takahe movement networks involved determination of node-level characteristics, including centrality descriptors of relevance to disease modeling such as weighted measures of in-degree (k(i)(in)), out-degree (k(i)(out)), and betweenness (B(i)). Key players were assigned according to the highest node measures of k(i)(in), k(i)(out), and B(i) per network. Networks increased in size throughout the time frame considered. The network had some degree of small-world character. The nodes with the highest cumulative tie weights connecting them were the captive breeding center, the Murchison Mountains, and two offshore islands. The key player fluctuated between the captive breeding center and the Murchison Mountains. The cumulative networks identified the captive breeding center every year as the hub of the network until the final
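
    The node-level measures used in the study can be reproduced on a toy network with standard tools, as in the sketch below; the node names and tie weights are invented placeholders, not the Takahe data set.

        import networkx as nx

        G = nx.DiGraph()
        G.add_weighted_edges_from([
            ("captive_centre", "island_A", 12),
            ("captive_centre", "island_B", 7),
            ("murchison_mtns", "captive_centre", 9),
            ("island_A", "island_B", 3),
            ("island_B", "murchison_mtns", 2),
        ])

        k_in = dict(G.in_degree(weight="weight"))     # weighted in-degree
        k_out = dict(G.out_degree(weight="weight"))   # weighted out-degree
        # For betweenness, strong ties should mean short paths, so invert weights
        nx.set_edge_attributes(G, {e: 1.0 / w for e, w in
                                   nx.get_edge_attributes(G, "weight").items()}, "dist")
        b = nx.betweenness_centrality(G, weight="dist")
        print("key player by betweenness:", max(b, key=b.get))
        print("weighted in/out degree:", k_in, k_out)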

  12. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  13. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques...... generally focus on two things: Obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group the...... methods in two: Those which assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those which assume dependence between the variables and thus use an estimate of the within-class covariance matrix, which also estimates the correlations between...

  14. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    Science.gov (United States)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from - 125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to - 170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
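
    The conductivity calculation mentioned above is a one-line combination of the three measured quantities, lambda = a*rho*c_p; the numbers below are typical room-temperature PTFE values, used purely for illustration.

        # Combine laser-flash diffusivity, dilatometry density and DSC specific
        # heat into a thermal conductivity (all values illustrative).
        diffusivity = 0.11e-6    # m^2/s, from laser flash
        density = 2160.0         # kg/m^3, from dilatometry
        specific_heat = 1000.0   # J/(kg K), from DSC

        conductivity = diffusivity * density * specific_heat   # W/(m K)
        print(f"thermal conductivity ~ {conductivity:.2f} W/(m K)")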

  15. Consolidating metabolite identifiers to enable contextual and multi-platform metabolomics data analysis

    Directory of Open Access Journals (Sweden)

    Saito Kazuki

    2010-04-01

    Full Text Available Abstract Background Analysis of data from high-throughput experiments depends on the availability of well-structured data that describe the assayed biomolecules. Procedures for obtaining and organizing such meta-data on genes, transcripts and proteins have been streamlined in many data analysis packages, but are still lacking for metabolites. Chemical identifiers are notoriously incoherent, encompassing a wide range of different referencing schemes with varying scope and coverage. Online chemical databases use multiple types of identifiers in parallel but lack a common primary key for reliable database consolidation. Connecting identifiers of analytes found in experimental data with the identifiers of their parent metabolites in public databases can therefore be very laborious. Results Here we present a strategy and a software tool for integrating metabolite identifiers from local reference libraries and public databases that do not depend on a single common primary identifier. The program constructs groups of interconnected identifiers of analytes and metabolites to obtain a local metabolite-centric SQLite database. The created database can be used to map in-house identifiers and synonyms to external resources such as the KEGG database. New identifiers can be imported and directly integrated with existing data. Queries can be performed in a flexible way, both from the command line and from the statistical programming environment R, to obtain data set tailored identifier mappings. Conclusions Efficient cross-referencing of metabolite identifiers is a key technology for metabolomics data analysis. We provide a practical and flexible solution to this task and an open-source program, the metabolite masking tool (MetMask, available at http://metmask.sourceforge.net, that implements our ideas.
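
    A stripped-down sketch of the consolidation idea: identifiers sharing a group key resolve to the same metabolite and can be cross-mapped with a self-join. The table layout and identifiers are invented for illustration and do not reproduce MetMask's actual schema.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE idmap (grp INTEGER, scheme TEXT, ident TEXT)")
        con.executemany("INSERT INTO idmap VALUES (?, ?, ?)", [
            (1, "name",    "citrate"),
            (1, "kegg",    "C00158"),
            (1, "inhouse", "LIB_0042"),
            (2, "name",    "glucose"),
            (2, "kegg",    "C00031"),
        ])

        # Map an in-house identifier to its KEGG counterpart via the shared group
        row = con.execute("""
            SELECT b.ident FROM idmap a JOIN idmap b ON a.grp = b.grp
            WHERE a.ident = 'LIB_0042' AND b.scheme = 'kegg'
        """).fetchone()
        print("KEGG id:", row[0])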

  16. Analysis of dynamic conflicts by techniques of artificial intelligence

    OpenAIRE

    Shinar, Josef

    1989-01-01

    Dynamic conflicts exhibit differential game characteristics, and their analysis by any method which disregards this feature may be, by definition, futile. Unfortunately, realistic conflicts may have an intricate information structure and a complex hierarchy which do not fit the classical differential game formulation. Moreover, in many cases even well formulated differential games are not solvable. In recent years great progress has been made in artificial intelligence techniques, put in...

  17. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    OpenAIRE

    Demyanenko DV; Demyanenko VG; Breusova SV

    2016-01-01

    Introduction. The article is devoted to the development of techniques for the quantitative analysis of lime flowers in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and...

  18. Analysis of Dynamic Road Traffic Congestion Control (DRTCC) Techniques

    OpenAIRE

    Pardeep Mittal; Yashpal Singh; Yogesh Sharma

    2015-01-01

    Dynamic traffic light control at intersections has become one of the most active research areas in the development of intelligent transportation systems (ITS). Due to the consistent growth in urbanization and traffic congestion, a system is required that can control the timings of traffic lights dynamically with accurate measurement of traffic on the road. In this paper, the techniques that have been developed to automate traffic lights are analysed. The efficacy...

  19. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Full Text Available Calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-Ray Fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found to lie in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.

  20. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
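
    The essence of the procedure can be sketched as a regularised deconvolution in Fourier space: the rear-side temperature record is divided by the transfer function, with a small floor added to suppress noise amplification. The signals and the exponential transfer function below are illustrative assumptions, not the RFX calorimeter data.

        import numpy as np

        n, dt = 4096, 1e-3
        t = np.arange(n) * dt
        flux = np.where((t > 1.0) & (t < 1.2), 1.0, 0.0)   # "true" energy flux pulse

        tau = 0.05                                          # thermal time constant (assumed)
        h = np.exp(-t / tau); h /= h.sum()                  # transfer-function kernel
        rear_temp = np.real(np.fft.ifft(np.fft.fft(flux) * np.fft.fft(h)))
        rear_temp += 0.002 * np.random.default_rng(5).standard_normal(n)

        H = np.fft.fft(h)
        eps = 1e-3 * np.abs(H).max()                        # Tikhonov-style floor
        flux_rec = np.real(np.fft.ifft(np.fft.fft(rear_temp) * np.conj(H) /
                                       (np.abs(H) ** 2 + eps ** 2)))
        rms = np.sqrt(np.mean((flux_rec - flux) ** 2))
        print(f"reconstruction error (rms) = {rms:.3f}")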

  1. Nuclear Spectral Analysis with Nonlinear Robust Fitting Techniques

    International Nuclear Information System (INIS)

    A new approach to nuclear spectral analysis based on nonlinear robust fitting techniques is described. Some of the fundamental differences in this approach from the more common methods of nuclear spectral analysis involving local peak searches are discussed. Although it requires knowledgeable interactive operation for best results and is computationally intensive, nuclear spectral analysis with nonlinear robust fitting has been shown to be capable of exceptional sensitivity in detecting weak radionuclides in the presence of strong interference and in noisy spectra, sparse spectra, and low-resolution spectra. This increased sensitivity is due to the simultaneous optimization of all the data for all the free variables of the analysis and the iterative construction of a well-determined continuum spanning the entire spectrum

  2. Nuclear Spectral Analysis with Nonlinear Robust Fitting Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lasche, G.P.; Coldwell, R.L.

    2001-06-17

    A new approach to nuclear spectral analysis based on nonlinear robust fitting techniques is described. Some of the fundamental differences in this approach from the more common methods of nuclear spectral analysis involving local peak searches are discussed. Although it requires knowledgeable interactive operation for best results and is computationally intensive, nuclear spectral analysis with nonlinear robust fitting has been shown to be capable of exceptional sensitivity in detecting weak radionuclides in the presence of strong interference and in noisy spectra, sparse spectra, and low-resolution spectra. This increased sensitivity is due to the simultaneous optimization of all the data for all the free variables of the analysis and the iterative construction of a well-determined continuum spanning the entire spectrum.
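
    A toy illustration of the robust-fitting idea (not the authors' code): all channels of a spectrum are fitted simultaneously for a peak and a continuum using a robust loss, here scipy's soft_l1; the spectrum and starting values are synthetic.

      import numpy as np
      from scipy.optimize import least_squares

      # Synthetic spectrum: a weak Gaussian peak on a sloping continuum.
      rng = np.random.default_rng(1)
      chan = np.arange(512, dtype=float)
      truth = (40.0 * np.exp(-0.5 * ((chan - 300.0) / 4.0) ** 2)
               + 200.0 - 0.1 * chan)
      counts = rng.poisson(truth).astype(float)

      def residuals(p, x, y):
          """Peak plus linear continuum, fitted over the whole spectrum."""
          amp, mu, sigma, c0, c1 = p
          model = amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + c0 + c1 * x
          return (model - y) / np.sqrt(np.maximum(y, 1.0))   # rough weights

      p0 = [10.0, 290.0, 5.0, float(counts.mean()), 0.0]
      fit = least_squares(residuals, p0, loss="soft_l1", args=(chan, counts))
      print("amplitude, centroid, width:", fit.x[:3])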

  3. History of activation analysis technique with charged particles in Uzbekistan

    International Nuclear Information System (INIS)

Full text: Research on activation analysis with charged particles (CPAA) started immediately after construction of the 150-cm cyclotron U-150 began in the 1960s. The CPAA laboratory, organized in 1971 on the basis of the cyclotron and the neutron generator NG-200 (later I-150), existed until the end of 1985. Ion beams from these devices were used to develop two types of nuclear analysis techniques: 1. Delayed Nuclear Analysis (DNA), involving Charged Particle Activation Analysis (CPAA) and Fast Neutron Activation Analysis (FNAA); 2. Prompt Nuclear Analysis (PNA), involving the spectrometry of particle-induced X-ray emission (PIXE). DNA using accelerators has the following subdivisions: 1. Proton Activation Analysis (PAA); 2. Deuteron Activation Analysis (DAA); 3. 3He Activation Analysis (3HeAA); 4. 4He Activation Analysis (4HeAA or α-AA); 5. Fast Neutron Activation Analysis (FNAA). PAA and DAA found wide application and provided good sensitivity in determining the content of more than 20 chemical elements in materials of high purity. For example, we applied these techniques to the determination of Li, B, C, N, O and F at levels of 10^-8 to 10^-10 g/g in different high-purity semiconductors (Si, SiC, Ge, AsGa, InP et al.), nonferrous metals (Li, Be, Zr, Nb, Mo, Ta, W, Re, Al, Ti, etc.), nonconductive materials (various glasses, optical materials, diamonds et al.) and environmental objects (soil, plants, water). The techniques provided good results in the determination of B, C and N contents, among others. 3HeAA and 4HeAA were generally used to determine O and C contents in high-purity semiconductors and metals. We elaborated rapid radiochemical techniques for the separation of short-lived positron emitters. For example, for the separation of 15O, formed by the nuclear reaction 16O(3He,α)15O, the reducing fusion technique was used. Radionuclide 11C was separated chemically by the oxidisation of samples in the

  4. New laser spectroscopic technique for stable-isotope ratio analysis

    International Nuclear Information System (INIS)

A new approach to stable-isotope ratio analysis based on atomic hyperfine structure is demonstrated. This laser spectroscopic scheme is virtually interference-free. A minor constituent in a complex matrix can be selectively analyzed without extensive sample preparation. A single-frequency tunable cw ring dye laser is used as the excitation source, and a demountable cathode discharge is used as the atomizer and detector. Samples are electrodeposited on the demountable cathode, and hyperfine profiles are collected by optogalvanic detection. By spectral deconvolution, the relative abundances of all isotopes present can be determined with good accuracy and precision. The technique is demonstrated for copper contents as low as 1.6 ppm, using the atomic hyperfine structure of the Cu I 578.2 nm non-resonance transition. It is also successfully tested for the analysis of copper isotopes in human blood. The sensitivity of Doppler-free polarization spectroscopy in atomic flames is shown to be competitive with other sensitive laser techniques such as the fluorescence spectrometric methods. Improved detectability of polarization rotation and excellent suppression of flame background noise enable this method to achieve detection limits at parts-per-trillion levels for sodium and 37 ppb for barium. The spectral resolution is suitable for isotopic analysis, and the technique offers excellent selectivity and minimal spectral interference

  5. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing these data new knowledge can be gained. Since these data are dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the amount of data available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  6. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing these data new knowledge can be gained. Since these data are dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the amount of data available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  7. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for the identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of the obtained peptide-spectrum matches (PSMs) also needs to be evaluated by algorithms, as manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance of retrieving more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. Conclusion Our approach not only enhances the computational performance, and
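
    A hedged sketch of the general idea, not the MUMAL implementation: a small neural network learns a nonlinear target/decoy decision boundary over synthetic PSM-like features, and PSMs are then accepted at a decoy-estimated error rate. The feature values, network size and 1% threshold are all illustrative.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      # Synthetic stand-ins for PSM feature vectors (e.g. search scores, mass
      # error, charge); label 1 marks target hits, label 0 marks decoy hits.
      rng = np.random.default_rng(2)
      X = np.vstack([rng.normal(1.0, 1.0, (500, 4)),     # "targets"
                     rng.normal(-1.0, 1.0, (500, 4))])   # "decoys"
      y = np.array([1] * 500 + [0] * 500)

      # A small MLP provides the nonlinear decision boundary.
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                          random_state=0).fit(X, y)
      scores = clf.predict_proba(X)[:, 1]

      # Accept the best-scoring PSMs subject to a decoy-estimated error rate.
      order = np.argsort(-scores)
      error = np.cumsum(y[order] == 0) / np.arange(1, len(y) + 1)
      print("PSMs accepted at 1% estimated error:", int((error <= 0.01).sum()))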

  8. Identifying Patients Who Are Unsuitable for Accelerated Partial Breast Irradiation Using Three-dimensional External Beam Conformal Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Shikama, Naoto, E-mail: nshikama0525@gmail.com [Department of Radiation Oncology, Saitama Medical University International Medical Center, Saitama (Japan); Nakamura, Naoki; Kunishima, Naoaki; Hatanaka, Shogo; Sekiguchi, Kenji [Department of Radiation Oncology, St. Luke' s International Hospital, Tokyo (Japan)

    2012-07-01

Purpose: Several recent studies reported that severe late toxicities including soft-tissue fibrosis and fat necrosis are present in patients treated with accelerated partial breast irradiation (APBI) and that these toxicities are associated with the large volume of tissue targeted by high-dose irradiation. The present study was performed to clarify which patients are unsuitable for APBI to avoid late severe toxicities. Methods and Materials: Study subjects comprised 50 consecutive patients with Stage 0-II unilateral breast cancer who underwent breast-conserving surgery, and in whom five or six surgical clips were placed during surgery. All patients were subsequently replanned using three-dimensional conformal radiotherapy (3D-CRT) APBI techniques according to the National Surgical Adjuvant Breast and Bowel Project (NSABP) B-39 and Radiation Therapy Oncology Group (RTOG) 0413 protocol. The beam arrangements included mainly noncoplanar four- or five-field beams using 6-MV photons alone. Results: Dose-volume histogram (DVH) constraints for normal tissues according to the NSABP/RTOG protocol were satisfied in 39 patients (78%). Multivariate analysis revealed that only long craniocaudal clip distance (CCD) was correlated with nonoptimal DVH constraints (p = 0.02), but that pathological T stage, anteroposterior clip distance (APD), site of ipsilateral breast (IB) (right/left), location of the tumor (medial/lateral), and IB reference volume were not. DVH constraints were satisfied in 20% of patients with a long CCD (≥5.5 cm) and 92% of those with a short CCD (p < 0.0001). The median IB reference volume receiving ≥50% of the prescribed dose (IB-V50) of all patients was 49.0% (range, 31.4-68.6). Multivariate analysis revealed that only a long CCD was correlated with large IB-V50 (p < 0.0001), but other factors were not. Conclusion: Patients with long CCDs (≥5.5 cm) might be unsuitable for 3D-CRT APBI because of nonoptimal DVH constraints and large IB

  9. Statistical techniques to construct assays for identifying likely responders to a treatment under evaluation from cell line genomic data

    Directory of Open Access Journals (Sweden)

    Shi Xiaoyan

    2010-10-01

Full Text Available Abstract Background Developing the right drugs for the right patients has become a mantra of drug development. In practice, it is very difficult to identify subsets of patients who will respond to a drug under evaluation. Most of the time, no single diagnostic will be available, and more complex decision rules will be required to define a sensitive population, using, for instance, mRNA expression, protein expression or DNA copy number. Moreover, diagnostic development will often begin with in-vitro cell-line data and a high-dimensional exploratory platform, only later to be transferred to a diagnostic assay for use with patient samples. In this manuscript, we present a novel approach to developing robust genomic predictors that are not only capable of generalizing from in-vitro to patient, but are also amenable to clinically validated assays such as qRT-PCR. Methods Using our approach, we constructed a predictor of sensitivity to dacetuzumab, an investigational drug for CD40-expressing malignancies such as lymphoma, using genomic measurements of cell lines treated with dacetuzumab. Additionally, we evaluated several state-of-the-art prediction methods by independently pairing the feature selection and classification components of the predictor. In this way, we constructed several predictors that we validated on an independent DLBCL patient dataset. Similar analyses were performed on genomic measurements of breast cancer cell lines and patients to construct a predictor of estrogen receptor (ER) status. Results The best dacetuzumab sensitivity predictors involved ten or fewer genes and accurately classified lymphoma patients by their survival and known prognostic subtypes. The best ER status classifiers involved one or two genes and led to accurate ER status predictions more than 85% of the time. The novel method we proposed performed as well or better than other methods evaluated. Conclusions We demonstrated the feasibility of combining feature

  10. Statistical techniques to construct assays for identifying likely responders to a treatment under evaluation from cell line genomic data

    International Nuclear Information System (INIS)

    Developing the right drugs for the right patients has become a mantra of drug development. In practice, it is very difficult to identify subsets of patients who will respond to a drug under evaluation. Most of the time, no single diagnostic will be available, and more complex decision rules will be required to define a sensitive population, using, for instance, mRNA expression, protein expression or DNA copy number. Moreover, diagnostic development will often begin with in-vitro cell-line data and a high-dimensional exploratory platform, only later to be transferred to a diagnostic assay for use with patient samples. In this manuscript, we present a novel approach to developing robust genomic predictors that are not only capable of generalizing from in-vitro to patient, but are also amenable to clinically validated assays such as qRT-PCR. Using our approach, we constructed a predictor of sensitivity to dacetuzumab, an investigational drug for CD40-expressing malignancies such as lymphoma using genomic measurements of cell lines treated with dacetuzumab. Additionally, we evaluated several state-of-the-art prediction methods by independently pairing the feature selection and classification components of the predictor. In this way, we constructed several predictors that we validated on an independent DLBCL patient dataset. Similar analyses were performed on genomic measurements of breast cancer cell lines and patients to construct a predictor of estrogen receptor (ER) status. The best dacetuzumab sensitivity predictors involved ten or fewer genes and accurately classified lymphoma patients by their survival and known prognostic subtypes. The best ER status classifiers involved one or two genes and led to accurate ER status predictions more than 85% of the time. The novel method we proposed performed as well or better than other methods evaluated. We demonstrated the feasibility of combining feature selection techniques with classification methods to develop assays
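
    The following sketch mirrors, in spirit only, the strategy of independently pairing feature-selection and classification components: a univariate filter keeping ten features (echoing the 'ten or fewer genes' finding) is combined with interchangeable classifiers and scored by cross-validation. The data are synthetic stand-ins for cell-line measurements.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import Pipeline
      from sklearn.svm import SVC

      # Synthetic stand-in: 100 "cell lines" by 500 "genes".
      X, y = make_classification(n_samples=100, n_features=500,
                                 n_informative=10, random_state=0)

      # Pair one feature-selection component with interchangeable classifiers.
      for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                        ("linear SVM", SVC(kernel="linear"))]:
          pipe = Pipeline([("select", SelectKBest(f_classif, k=10)),
                           ("classify", clf)])
          acc = cross_val_score(pipe, X, y, cv=5).mean()
          print(f"{name}: {acc:.2f} cross-validated accuracy")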

  11. Crime Analysis Using Geoinformatics Technique and Hotspot Detection for Akola City, Maharashtra State, India

    OpenAIRE

    Khadri; S.F.R; Chaitanya Pande; Kanak Moharir

    2013-01-01

Effective utilization of geoinformatics techniques provides city-safety agencies with tools to analyze and interpret crime-related relations through GIS software. Recently, there has been an increase in crimes of various types in Akola city. Maps offer crime analysis and graphic representations of crime-related issues, and an understanding of where crimes occur can improve attempts to fight crime. The present study identified various crime patterns in Akola city and covers aspect of ...

  12. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of the target individual compounds (>50 μgC). Yields of target compounds covered n-alkanes from C14 to C40, with approximately 80% recovery for higher-molecular-weight compounds above C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, like compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in marine systems. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, which showed its potential as a chronology tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where sufficient planktonic foraminifera for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain owing to dissolution of calcium carbonate. (author)

  13. Using Quantitative Data Analysis Techniques for Bankruptcy Risk Estimation for Corporations

    Directory of Open Access Journals (Sweden)

    Ştefan Daniel ARMEANU

    2012-01-01

Full Text Available Diversification of methods and techniques for the quantification and management of risk has led to the development of many mathematical models, a large part of which focus on measuring bankruptcy risk for businesses. In financial analysis there are many indicators that can be used to assess the risk of bankruptcy of enterprises, but to make an assessment it is necessary to reduce the number of indicators, and this can be achieved through principal component, cluster and discriminant analysis techniques. In this context, the article aims to build a scoring function used to identify bankrupt companies, using a sample of companies listed on the Bucharest Stock Exchange.

  14. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

Full Text Available Sentiment analysis is concerned with the analysis of emotions and opinions in text; it is also referred to as opinion mining. Sentiment analysis finds and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this large volume of generated data is very useful for expressing the opinion of the masses. Twitter sentiment analysis is tricky compared with broad sentiment analysis because of slang words, misspellings and repeated characters, and because the maximum length of a tweet is 140 characters, it is very important to identify the correct sentiment of each word. In this project we propose a highly accurate model for sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of feature vectors and classifiers such as support vector machines and Naïve Bayes, these tweets are correctly classified as positive, negative or neutral to give the sentiment of each tweet.
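
    A minimal sketch of the classification step described above, using TF-IDF features with Naïve Bayes and a support vector machine via scikit-learn; the four training tweets and labels are invented, and a real system would train on a large labelled corpus and handle slang and misspellings explicitly.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import LinearSVC

      # Tiny invented corpus; a real system trains on many labelled tweets.
      tweets = ["loved the movie, brilliant acting!",
                "worst film ever, so boring",
                "the trailer looks amazing",
                "two hours of my life wasted"]
      labels = ["positive", "negative", "positive", "negative"]

      for clf in (MultinomialNB(), LinearSVC()):
          model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
          model.fit(tweets, labels)
          print(type(clf).__name__, model.predict(["what an amazing movie"]))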

  15. Transcriptome Analysis of Syringa oblata Lindl. Inflorescence Identifies Genes Associated with Pigment Biosynthesis and Scent Metabolism.

    Directory of Open Access Journals (Sweden)

    Jian Zheng

Full Text Available Syringa oblata Lindl. is a woody ornamental plant with high economic value and characteristics that include early flowering, multiple flower colors, and strong fragrance. Despite a long history of cultivation, the genetics and molecular biology of S. oblata are poorly understood. Transcriptome and expression profiling data are needed to identify genes and to better understand the biological mechanisms of floral pigments and scents in this species. Nine cDNA libraries were obtained from three replicates of three developmental stages: inflorescence with enlarged flower buds not protruded, inflorescence with corolla lobes not displayed, and inflorescence with flowers fully opened and emitting strong fragrance. Using the Illumina RNA-Seq technique, 319,425,972 clean reads were obtained and were assembled into 104,691 final unigenes (average length of 853 bp), 41.75% of which were annotated in the NCBI non-redundant protein database. Among the annotated unigenes, 36,967 were assigned to gene ontology categories and 19,956 were assigned to eukaryotic orthologous groups. Using the Kyoto Encyclopedia of Genes and Genomes pathway database, 12,388 unigenes were sorted into 286 pathways. Based on these transcriptomic data, we obtained a large number of candidate genes that were differentially expressed at different flower stages and that were related to floral pigment biosynthesis and fragrance metabolism. This comprehensive transcriptomic analysis provides fundamental information on the genes and pathways involved in flower secondary metabolism and development in S. oblata, providing a useful database for further research on S. oblata and other plants of genus Syringa.

  16. Potential Coastal Pumped Hydroelectric Energy Storage Locations Identified using GIS-based Topographic Analysis

    Science.gov (United States)

    Parsons, R.; Barnhart, C. J.; Benson, S. M.

    2013-12-01

Large-scale electrical energy storage could accommodate variable, weather-dependent energy resources such as wind and solar. Pumped hydroelectric energy storage (PHS) and compressed air energy storage (CAES) have life-cycle energy and financial costs that are an order of magnitude lower than those of conventional electrochemical storage technologies. However, PHS and CAES storage technologies require specific geologic conditions. Conventional PHS requires an upper and a lower reservoir separated by at least 100 m of head but no more than 10 km in horizontal distance. Conventional PHS also impacts fresh water supplies, riparian ecosystems, and hydrologic environments. A PHS facility that uses the ocean as the lower reservoir benefits from a smaller footprint, minimal freshwater impact, and the potential to be located near offshore wind resources and population centers. Although technologically nascent, one coastal PHS facility exists today. The storage potential for coastal PHS is unknown: can coastal PHS play a significant role in augmenting future power grids with a high fraction of renewable energy supply? In this study we employ GIS-based topographic analysis to quantify the coastal PHS potential of several geographic locations, including California, Chile and Peru. We developed automated techniques that seek local topographic minima in 90 m spatial resolution Shuttle Radar Topography Mission (SRTM) digital elevation models (DEMs) that satisfy the following criteria conducive to PHS: within 10 km of the sea; minimum elevation 150 m; maximum elevation 1000 m. Preliminary results suggest the global potential for coastal PHS could be very significant. For example, in northern Chile we have identified over 60 locations that satisfy the above criteria. Two of these locations could store over 10 million cubic meters of water, or several GWh of energy. We plan to report a global database of candidate coastal PHS locations and to estimate their energy storage capacity.
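
    A small sketch of the screening logic the abstract states (within 10 km of the sea, upper-reservoir elevation between 150 m and 1000 m), applied to a synthetic 90 m DEM tile; the real study additionally seeks local topographic minima, which this sketch omits.

      import numpy as np
      from scipy.ndimage import distance_transform_edt

      # Synthetic 90 m DEM tile (SRTM-like); cells at or below 0 m are sea.
      rng = np.random.default_rng(3)
      dem = rng.uniform(-50.0, 1200.0, size=(400, 400))
      cell = 90.0                                   # metres per pixel

      # Distance (m) from every land cell to the nearest sea cell.
      sea = dem <= 0.0
      dist_to_sea = distance_transform_edt(~sea) * cell

      # Screening criteria taken from the abstract.
      candidates = ((~sea) & (dist_to_sea <= 10_000.0)
                    & (dem >= 150.0) & (dem <= 1000.0))
      print("candidate cells:", int(candidates.sum()))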

  17. Privacy-Preserving Data Analysis Techniques by using different modules

    Directory of Open Access Journals (Sweden)

    Payal P. Wasankar

    2013-11-01

Full Text Available Competing parties who hold private data may collaboratively conduct privacy-preserving distributed data analysis (PPDA) tasks to learn beneficial data models or analysis results. For example, different credit card companies may try to build better models for credit card fraud detection through PPDA tasks. Similarly, competing companies in the same industry may try to combine their sales data to build models that can predict future sales. In many of these cases, the competing parties have different incentives. Although certain PPDA techniques guarantee that nothing other than the final analysis result is revealed, it is impossible to verify whether or not the participating parties are truthful about their private input data.

  18. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

One of the difficulties that goal-oriented requirements analysis encounters is that the efficiency of goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systemized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integrating goal and problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  19. Summary of recent AAEC research on noise analysis techniques

    International Nuclear Information System (INIS)

The research establishment of the AAEC has, over the last decade, developed a comprehensive data analysis facility capable of dealing with random signals in the range 0 to 300 kHz. All the conventional spectral and correlation functions can be estimated from either analogue or digital signals. This facility, together with a wide range of available sensors, has been used to detect, record and analyze data derived from real and simulated nuclear plant. Although the main emphasis of the work has been to develop an experimental capability and acquire basic skills in noise analysis techniques, some application has been made to real-life practical problems. The following is a brief summary of work carried out at Lucas Heights during the period 1977 to 1980. The comments are intentionally concise, as references to detailed papers are given

  20. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
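
    A minimal hand-worked example, not the paper's technique or software, of the 'uncertain evidence' setting: when the state of an evidence node E is given only as a user-specified distribution, the hypothesis distribution follows by averaging the conditional table over that distribution (Jeffrey's rule). All probabilities are invented.

      # Conditional table P(H = true | E = state) and a user-specified
      # distribution over E's states (all numbers invented).
      p_h_given_e = {"present": 0.90, "absent": 0.15}
      q_e = {"present": 0.7, "absent": 0.3}

      # Jeffrey's rule: average the conditional over the soft evidence.
      p_h = sum(q_e[s] * p_h_given_e[s] for s in q_e)
      print(f"P(H = true | uncertain evidence) = {p_h:.3f}")   # 0.675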

  1. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  2. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  3. Soil Data Analysis Using Classification Techniques and Soil Attribute Prediction

    Directory of Open Access Journals (Sweden)

    Jay Gholap

    2012-05-01

Full Text Available Agricultural research has profited from technical advances such as automation and data mining. Today, data mining is used in a vast range of areas, and many off-the-shelf data mining products and domain-specific data mining applications are available, but data mining in agricultural soil datasets is a relatively young research field. The large amounts of data that are nowadays virtually harvested along with the crops have to be analyzed and should be used to their full extent. This research aims at the analysis of soil datasets using data mining techniques. It focuses on the classification of soil using the various algorithms available. Another important purpose is to predict untested attributes using regression techniques and to implement automated soil sample classification.

  4. Comparative Analysis of Partial Occlusion Using Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    N.Nallammal

    2013-04-01

Full Text Available This paper presents a comparison of face recognition techniques under partial occlusion, showing which technique produces the better result in terms of total success rate. Recognition under partial occlusion is especially useful for people part of whose face is scarred or defective and thus needs to be covered; hence, either the top part/eye region or the bottom part of the face is recognized, respectively. The partial face information is tested with Principal Component Analysis (PCA), Non-negative Matrix Factorization (NMF), Local NMF (LNMF) and Spatially Confined NMF (SFNMF). The comparative results show a recognition rate of 95.17% with r = 80 using SFNMF for the bottom face region. On the other hand, the eye region achieves 95.12% with r = 10 using LNMF.

  5. Gamma absorption technique in elemental analysis of composite materials

    International Nuclear Information System (INIS)

Highlights: Application of the gamma-ray absorption technique in elemental analysis. Determination of the elemental composition of some bronze and gold alloys. Determination of some heavy elements in water. Abstract: Expressions for calculating the elemental concentrations of composite materials based on a gamma absorption technique are derived. These expressions provide quantitative information about the elemental concentrations of materials. Calculations are carried out for estimating the concentrations of copper and gold in some alloys of bronze and gold. The method was also applied for estimating the concentrations of some heavy elements in a water matrix, highlighting the differences with photon attenuation measurements. Theoretical mass attenuation coefficient values were obtained using the WinXCom program. High-resolution gamma-ray spectrometry based on a high-purity germanium (HPGe) detector was employed to measure the attenuation of a strongly collimated monoenergetic gamma beam through samples.

  6. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...
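
    A one-dimensional toy analogue of the merging rule described above (the published algorithm operates on spatial IFS data): neighbouring elements are merged as long as their mean signals agree within the propagated errors. The data, errors and 3-sigma threshold are illustrative.

      import numpy as np

      # Synthetic signal: two flat regions of 30 elements with known errors.
      rng = np.random.default_rng(4)
      signal = np.concatenate([np.full(30, 5.0), np.full(30, 9.0)])
      err = np.full(60, 0.8)
      data = signal + rng.normal(0.0, err)

      # Greedily merge adjacent segments whose means agree within errors.
      segments = [[i] for i in range(len(data))]
      merged = True
      while merged:
          merged = False
          for k in range(len(segments) - 1):
              a, b = segments[k], segments[k + 1]
              ma, mb = data[a].mean(), data[b].mean()
              sa = err[a[0]] / len(a) ** 0.5        # standard error of mean
              sb = err[b[0]] / len(b) ** 0.5
              if abs(ma - mb) < 3.0 * (sa ** 2 + sb ** 2) ** 0.5:
                  segments[k:k + 2] = [a + b]       # merge and rescan
                  merged = True
                  break
      print(len(segments), "segments recovered")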

  7. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

The implementation of experimental techniques for the characterisation of cultural heritage materials has to take several requirements into account. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, owing to the precious nature of artworks it is also necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry, as well as ion beam analysis (IBA), are analytical tools at the service of cultural heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composites or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  8. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  9. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 Photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliq. by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab

  10. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout to microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  11. Identifying disease candidate genes via large-scale gene network analysis.

    Science.gov (United States)

    Kim, Haseong; Park, Taesung; Gelenbe, Erol

    2014-01-01

Gene regulatory networks (GRNs) provide systematic views of complex living systems, and reliable, large-scale GRNs can be used to identify disease candidate genes. We introduce a reverse-engineering technique, Bayesian Model Averaging-based Networks (BMAnet), which ensembles all appropriate linear models to tackle uncertainty in model selection and integrates heterogeneous biological data sets. Using network evaluation metrics, we compare the networks that are thus identified. The metric 'random walk with restart (Rwr)' is utilised to search for disease genes. In a simulation our method shows better performance than elastic-net and Gaussian graphical models, but topological quantities vary among the three methods. Using real data, brain tumour gene expression samples consisting of non-tumour, grade III and grade IV samples were analysed to estimate networks with a total of 4422 genes. Based on these networks, 169 brain tumour-related candidate genes were identified, some of which were found to relate to 'wound', 'apoptosis', and 'cell death' processes. PMID:25796737
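
    A brief sketch of the random-walk-with-restart scoring step, p ← (1−r)·W·p + r·p0, on a toy network; the adjacency matrix, seed genes and restart probability are placeholders rather than anything estimated by BMAnet.

      import numpy as np

      # Toy undirected network standing in for an estimated GRN.
      rng = np.random.default_rng(5)
      A = (rng.random((50, 50)) < 0.1).astype(float)
      A = np.maximum(A, A.T)
      np.fill_diagonal(A, 0.0)
      W = A / np.maximum(A.sum(axis=0), 1.0)        # column-normalised

      p0 = np.zeros(50)
      p0[[0, 1, 2]] = 1.0 / 3.0                     # seed (known) disease genes
      r = 0.3                                       # restart probability

      p = p0.copy()
      for _ in range(200):                          # iterate to convergence
          p_next = (1.0 - r) * W @ p + r * p0
          if np.abs(p_next - p).sum() < 1e-12:
              break
          p = p_next
      print("top-ranked candidates:", np.argsort(-p)[:10])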

  12. Identifying Effective Spelling Interventions Using a Brief Experimental Analysis and Extended Analysis

    Science.gov (United States)

    McCurdy, Merilee; Clure, Lynne F.; Bleck, Amanda A.; Schmitz, Stephanie L.

    2016-01-01

    Spelling is an important skill that is crucial to effective written communication. In this study, brief experimental analysis procedures were used to examine spelling instruction strategies (e.g., whole word correction; word study strategy; positive practice; and cover, copy, and compare) for four students. In addition, an extended analysis was…

  13. Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse

    Science.gov (United States)

    Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa

    1994-01-01

    Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) can be successfully employed in urinalysis for drugs of abuse with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence of the data that infrared analysis can offer has been helpful in identifying ambiguous results, particularly, in the case of amphetamines where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been a question. Functional group mapping can further assist the analysis and track drug use versus time.

  14. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...
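
    A hedged sketch of the per-pixel calibration idea: reference frames captured at known dye concentrations define, for each pixel, a grey-level-to-concentration curve onto which an arbitrary frame is interpolated. The frames here are synthetic, and the real procedure's light control and dye-to-density conversion are omitted.

      import numpy as np

      # Synthetic reference frames recorded at known dye concentrations; grey
      # level is assumed to fall linearly with concentration here.
      h, w = 120, 160
      known_conc = np.array([0.0, 0.5, 1.0])
      refs = np.stack([np.full((h, w), 200.0 - 80.0 * c) for c in known_conc])

      def concentration_field(frame):
          """Interpolate each pixel's grey level onto its calibration curve."""
          out = np.empty((h, w))
          for i in range(h):
              for j in range(w):
                  # np.interp needs ascending x; grey level falls with conc.
                  out[i, j] = np.interp(frame[i, j],
                                        refs[::-1, i, j], known_conc[::-1])
          return out

      frame = refs[1] + np.random.default_rng(6).normal(0.0, 2.0, (h, w))
      field = concentration_field(frame)            # close to 0.5 everywhere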

  15. Elimination and adsorptive transfer techniques in an oligonucleotide analysis

    Czech Academy of Sciences Publication Activity Database

    Jelen, František; Trnková, L.; Kouřilová, Alena; Kejnovská, Iva; Vorlíčková, Michaela

Xi'an, 2009. P12. [International Symposium on Frontiers of Electrochemical Science and Technology. 12.08.2009-15.08.2009, Xi'an] R&D Projects: GA AV ČR(CZ) IAA400040804; GA AV ČR(CZ) IAA100040701; GA AV ČR(CZ) KAN200040651; GA MŠk(CZ) LC06035 Institutional research plan: CEZ:AV0Z50040507; CEZ:AV0Z50040702 Keywords: elimination voltammetry * transfer techniques * analysis of oligonucleotides Subject RIV: BO - Biophysics

  16. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  17. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  18. Acceleration of multivariate analysis techniques in TMVA using GPUs

    CERN Document Server

    Hoecker, A; Therhaag, J; Washbrook, A

    2012-01-01

    A feasibility study into the acceleration of multivariate analysis techniques using Graphics Processing Units (GPUs) will be presented. The MLP-based Artificial Neural Network method contained in the TMVA framework has been chosen as a focus for investigation. It was found that the network training time on a GPU was lower than for CPU execution as the complexity of the network was increased. In addition, multiple neural networks can be trained simultaneously on a GPU within the same time taken for single network training on a CPU. This could be potentially leveraged to provide a qualitative performance gain in data classification.

  19. Multi-element study in aluminium by activation analysis technique

    International Nuclear Information System (INIS)

Instrumental activation analysis is a relatively quick technique that helps determine the elemental composition of materials. It is used mainly for trace element determination, but in the case of major elements some considerations are necessary, since different nuclear reactions take place because the neutron flux is a mixture of thermal and fast neutrons. This could lead to the erroneous identification or quantification of some elements. This work describes the way in which a container piece with approximately 85% aluminium was analyzed. The elements Zn, Mn, Sb, Ga, Cu, Cl and Sm were determined. (Author)

  20. Techniques for Improving Filters in Power Grid Contingency Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Adolf, Robert D.; Haglin, David J.; Halappanavar, Mahantesh; Chen, Yousu; Huang, Zhenyu

    2011-12-31

In large-scale power transmission systems, predicting faults and preemptively taking corrective action to avoid them is essential to preventing rolling blackouts. The computational study of the constantly shifting state of the power grid and its weaknesses is called contingency analysis. Multiple-contingency planning in the electrical grid is one example of a complex monitoring system where a full computational solution is operationally infeasible. We present a general framework for building and evaluating resource-aware models of filtering techniques for this type of monitoring.

  1. Determination of minor and trace elements concentration in kidney stones using elemental analysis techniques

    Science.gov (United States)

    Srivastava, Anjali

The determination of the accurate material composition of a kidney stone is crucial for understanding its formation as well as for preventive therapeutic strategies. Radiation-probing instrumental activation analysis techniques are excellent tools for identifying the materials present in a kidney stone. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different material detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. Experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stones. The open-source program Python Multi-Channel Analyzer was used to unfold the XRF spectra. A new type of experimental set-up was developed and utilized for the XRF and NAA analysis of the kidney stones. To verify the experimental results against analytical calculation, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y and Zr, and those identified by neutron activation analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb and Zn. This thesis presents a new approach for the accurate detection of the material composition of kidney stones using the XRF and NAA instrumental activation analysis techniques.
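
    As a hedged illustration of one step in such an analysis (not the thesis code, and not the Python Multi-Channel Analyzer program it cites), the sketch below locates candidate peaks in a synthetic pulse-height spectrum and maps channels to energies with an assumed linear calibration.

      import numpy as np
      from scipy.signal import find_peaks

      # Synthetic pulse-height spectrum: flat background plus two lines.
      rng = np.random.default_rng(7)
      chan = np.arange(2048)
      expected = (50.0
                  + 40.0 * np.exp(-0.5 * ((chan - 800) / 6.0) ** 2)
                  + 25.0 * np.exp(-0.5 * ((chan - 1250) / 7.0) ** 2))
      spectrum = rng.poisson(expected).astype(float)

      peaks, _ = find_peaks(spectrum, prominence=15.0, width=3.0)

      gain, offset = 0.01, 0.0      # keV/channel and keV offset (assumed)
      for p in peaks:
          print(f"peak at channel {p} -> {gain * p + offset:.2f} keV")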

  2. Using Latent Class Analysis to Identify Academic and Behavioral Risk Status in Elementary Students

    Science.gov (United States)

    King, Kathleen R.; Lembke, Erica S.; Reinke, Wendy M.

    2016-01-01

    Identifying classes of children on the basis of academic and behavior risk may have important implications for the allocation of intervention resources within Response to Intervention (RTI) and Multi-Tiered System of Support (MTSS) models. Latent class analysis (LCA) was conducted with a sample of 517 third grade students. Fall screening scores in…

  3. Identifying Skill Requirements for GIS Positions: A Content Analysis of Job Advertisements

    Science.gov (United States)

    Hong, Jung Eun

    2016-01-01

    This study identifies the skill requirements for geographic information system (GIS) positions, including GIS analysts, programmers/developers/engineers, specialists, and technicians, through a content analysis of 946 GIS job advertisements from 2007-2014. The results indicated that GIS job applicants need to possess high levels of GIS analysis…

  4. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    DEFF Research Database (Denmark)

    Voight, Benjamin F; Scott, Laura J; Steinthorsdottir, Valgerdur;

    2010-01-01

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals with combi...

  5. Identifying sustainability issues using participatory SWOT analysis - A case study of egg production in the Netherlands

    NARCIS (Netherlands)

    Mollenhorst, H.; Boer, de I.J.M.

    2004-01-01

    The aim of this paper was to demonstrate how participatory strengths, weaknesses, opportunities and threats (SWOT) analysis can be used to identify relevant economic, ecological and societal (EES) issues for the assessment of sustainable development. This is illustrated by the case of egg production

  6. Bioinformatics analysis identifies several intrinsically disordered human E3 ubiquitin-protein ligases

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Nielsen, Sofie V; Lindorff-Larsen, Kresten;

    2016-01-01

    conduct a bioinformatics analysis to examine >600 human and S. cerevisiae E3 ligases to identify enzymes that are similar to San1 in terms of function and/or mechanism of substrate recognition. An initial sequence-based database search was found to detect candidates primarily based on the homology of...

  7. Application of Genome-Wide Expression Analysis To Identify Molecular Markers Useful in Monitoring Industrial Fermentations

    OpenAIRE

    Higgins, Vincent J.; Rogers, Peter J.; Dawes, Ian W.

    2003-01-01

    Genome-wide expression analysis of an industrial strain of Saccharomyces cerevisiae identified the YOR387c and YGL258w homologues as highly inducible in zinc-depleted conditions. Induction was specific for zinc deficiency and was dependent on Zap1p. The results indicate that these sequences may be valuable molecular markers for detecting zinc deficiency in industrial fermentations.

  8. Genome-wide association scan meta-analysis identifies three loci influencing adiposity and fat distribution

    NARCIS (Netherlands)

    C.M. Lindgren (Cecilia); I.M. Heid (Iris); J.C. Randall (Joshua); C. Lamina (Claudia); V. Steinthorsdottir (Valgerdur); L. Qi (Lu); E.K. Speliotes (Elizabeth); G. Thorleifsson (Gudmar); C.J. Willer (Cristen); B.M. Herrera (Blanca); A.U. Jackson (Anne); N. Lim (Noha); P. Scheet (Paul); N. Soranzo (Nicole); N. Amin (Najaf); Y.S. Aulchenko (Yurii); J.C. Chambers (John); A. Drong (Alexander); J. Luan; H.N. Lyon (Helen); F. Rivadeneira Ramirez (Fernando); S. Sanna (Serena); N. Timpson (Nicholas); M.C. Zillikens (Carola); H.Z. Jing; P. Almgren (Peter); S. Bandinelli (Stefania); A.J. Bennett (Amanda); R.N. Bergman (Richard); L.L. Bonnycastle (Lori); S. Bumpstead (Suzannah); S.J. Chanock (Stephen); L. Cherkas (Lynn); P.S. Chines (Peter); L. Coin (Lachlan); C. Cooper (Charles); G. Crawford (Gabe); A. Doering (Angela); A. Dominiczak (Anna); A.S.F. Doney (Alex); S. Ebrahim (Shanil); P. Elliott (Paul); M.R. Erdos (Michael); K. Estrada Gil (Karol); L. Ferrucci (Luigi); G. Fischer (Guido); N.G. Forouhi (Nita); C. Gieger (Christian); H. Grallert (Harald); C.J. Groves (Christopher); S.M. Grundy (Scott); C. Guiducci (Candace); D. Hadley (David); A. Hamsten (Anders); A.S. Havulinna (Aki); A. Hofman (Albert); R. Holle (Rolf); J.W. Holloway (John); T. Illig (Thomas); B. Isomaa (Bo); L.C. Jacobs (Leonie); K. Jameson (Karen); P. Jousilahti (Pekka); F. Karpe (Fredrik); J. Kuusisto (Johanna); J. Laitinen (Jaana); G.M. Lathrop (Mark); D.A. Lawlor (Debbie); M. Mangino (Massimo); W.L. McArdle (Wendy); T. Meitinger (Thomas); M.A. Morken (Mario); A.P. Morris (Andrew); P. Munroe (Patricia); N. Narisu (Narisu); A. Nordström (Anna); B.A. Oostra (Ben); C.N.A. Palmer (Colin); F. Payne (Felicity); J. Peden (John); I. Prokopenko (Inga); F. Renström (Frida); A. Ruokonen (Aimo); V. Salomaa (Veikko); M.S. Sandhu (Manjinder); L.J. Scott (Laura); A. Scuteri (Angelo); K. Silander (Kaisa); K. Song (Kijoung); X. Yuan (Xin); H.M. Stringham (Heather); A.J. Swift (Amy); T. Tuomi (Tiinamaija); M. Uda (Manuela); P. Vollenweider (Peter); G. Waeber (Gérard); C. Wallace (Chris); G.B. Walters (Bragi); M.N. Weedon (Michael); J.C.M. Witteman (Jacqueline); C. Zhang (Cuilin); M. Caulfield (Mark); F.S. Collins (Francis); G.D. Smith; I.N.M. Day (Ian); P.W. Franks (Paul); A.T. Hattersley (Andrew); F.B. Hu (Frank); M.R. Jarvelin; A. Kong (Augustine); J.S. Kooner (Jaspal); M. Laakso (Markku); E. Lakatta (Edward); V. Mooser (Vincent); L. Peltonen (Leena Johanna); N.J. Samani (Nilesh); T.D. Spector (Timothy); D.P. Strachan (David); T. Tanaka (Toshiko); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); P. Tikka-Kleemola (Päivi); N.J. Wareham (Nick); H. Watkins (Hugh); D. Waterworth (Dawn); M. Boehnke (Michael); P. Deloukas (Panagiotis); L. Groop (Leif); D.J. Hunter (David); U. Thorsteinsdottir (Unnur); D. Schlessinger (David); H.E. Wichmann (Erich); T.M. Frayling (Timothy); G.R. Abecasis (Gonçalo); J.N. Hirschhorn (Joel); R.J.F. Loos (Ruth); J-A. Zwart (John-Anker); K.L. Mohlke (Karen); I. Barroso (Inês); M.I. McCarthy (Mark)

    2009-01-01

    To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evid

  9. Identifying Contingency Requirements using Obstacle Analysis on an Unpiloted Aerial Vehicle

    Science.gov (United States)

    Lutz, Robyn R.; Nelson, Stacy; Patterson-Hine, Ann; Frost, Chad R.; Tal, Doron

    2005-01-01

    This paper describes experience using Obstacle Analysis to identify contingency requirements on an unpiloted aerial vehicle. A contingency is an operational anomaly, and may or may not involve component failure. The challenges to this effort were: (1) rapid evolution of the system while operational, (2) incremental autonomy as capabilities were transferred from ground control to software control, and (3) the eventual safety-criticality of such systems as they begin to fly over populated areas. The results reported here are preliminary but show that Obstacle Analysis helped (1) identify new contingencies that appeared as autonomy increased; (2) identify new alternatives for handling both previously known and new contingencies; and (3) investigate the continued validity of existing software requirements for contingency handling. Since many mobile, intelligent systems are built using a development process that poses the same challenges, the results appear to have applicability to other similar systems.

  10. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, on both the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs, and Twitter provides a rich anthology of sentiments. This user-generated content can be of great benefit to the market if its semantic orientations are carefully analyzed. Opinion mining and sentiment analysis formalize the study and interpretation of opinions and sentiments. The digital ecosystem has itself paved the way for recording huge volumes of opinionated data. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.

  11. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods produce similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
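
    Global, variance-based methods such as Sobol indices are one way to capture the input interactions mentioned above. The sketch below uses the SALib library on an invented three-parameter stand-in model; the parameter names, bounds, and model are illustrative assumptions, not the Sandia model.

```python
# Minimal sketch of global (variance-based) sensitivity analysis with SALib.
# The toy model below stands in for a behavioral model; names and bounds are
# illustrative only. (Newer SALib versions also offer SALib.sample.sobol.)
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["learning_rate", "social_influence", "noise"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 0.5]],
}

def toy_model(x):
    # Nonlinear response with an interaction term, the kind of coupling
    # that local one-at-a-time methods miss.
    return np.sin(x[0]) + 0.5 * x[1] ** 2 + 2.0 * x[0] * x[2]

X = saltelli.sample(problem, 1024)           # N * (2D + 2) samples
Y = np.array([toy_model(x) for x in X])
Si = sobol.analyze(problem, Y)

# A gap between total and first-order indices signals interactions
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: first-order={s1:.3f}, total={st:.3f}")
```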

  12. Research fronts analysis: A bibliometric method to identify emerging fields of research

    Science.gov (United States)

    Miwa, Sayaka; Ando, Satoko

    Research fronts analysis identifies emerging areas of research by observing co-clustering in highly-cited papers. This article introduces the concept of research fronts analysis, explains its methodology and provides case examples. It also demonstrates developing research fronts in Japan by looking at past winners of the Thomson Reuters Research Fronts Awards. Research fronts analysis is currently being used by the Japanese government to determine new trends in science and technology. Information professionals can also utilize this bibliometric technique as a research evaluation tool.

  13. An Application of Intelligent Data Analysis Techniques to a Large Software Engineering Dataset

    Science.gov (United States)

    Cain, James; Counsell, Steve; Swift, Stephen; Tucker, Allan

    Within the development of large software systems, there is significant value in being able to predict changes. If we can predict the likely changes that a system will undergo, then we can estimate likely developer effort and allocate resources appropriately. Within object-oriented software development, these changes are often identified as refactorings. Very few studies have explored the prediction of refactorings on a wide scale. Within this paper we aim to do just this, by applying intelligent data analysis techniques to a uniquely large and comprehensive software engineering time series dataset. Our analysis shows extremely promising results, allowing us to predict the occurrence of future large changes.

  14. Application of the INAA technique for elemental analysis of metallic biomaterials used in dentistry

    Energy Technology Data Exchange (ETDEWEB)

    Cincu, Em [' Horia Hulubei' National Institute for Research and Development in Physics and Nuclear Engineering (IFIN-HH), Bucharest-Magurele, 407 Atomistilor Street, P. O. Box MG-6, Bucharest 077125 (Romania)], E-mail: cincue@nipne.ro; Craciun, L.; Manea-Grigore, Ioana; Cazan, I.L.; Manu, V. [' Horia Hulubei' National Institute for Research and Development in Physics and Nuclear Engineering (IFIN-HH), Bucharest-Magurele, 407 Atomistilor Street, P. O. Box MG-6, Bucharest 077125 (Romania); Barbos, D. [Institute for Nuclear Research (INR) Mioveni, 1Campului Street, P. O. Box 78, Bucharest 115400 (Romania); Cocis, A. [Dental Surgery Clinic PANA-DANIELA, Bucharest, 6 Intrarea Buzesti Street (Romania)

    2009-12-15

    The sensitive nuclear analytical technique of Instrumental Neutron Activation Analysis (INAA) has been applied to several types of metallic biomaterials (Heraenium CE, Ventura Nibon, Wiron 99 and Ducinox, which are currently used for restoration in dental clinics) to study its performance in elemental analysis and to identify possible limitations. The investigation was performed by two NAA laboratories and aimed to answer the question of how biomaterial compositions influence patients' health over the course of time, taking into account the EC Directive 94/27/EC recommendations concerning Ni toxicity.

  15. Metabolites production improvement by identifying minimal genomes and essential genes using flux balance analysis.

    Science.gov (United States)

    Salleh, Abdul Hakim Mohamed; Mohamad, Mohd Saberi; Deris, Safaai; Illias, Rosli Md

    2015-01-01

    With the advancement of metabolic engineering technologies, the genome of host organisms can be reconstructed to achieve desired phenotypes. However, due to the complexity and size of genome-scale metabolic networks, significant components are easily overlooked. We propose a two-step approach to improve metabolite production. First, we find the essential genes and identify the minimal genome through a single-gene-deletion process using Flux Balance Analysis (FBA); second, we identify the significant pathways for metabolite production using gene expression data. A genome-scale model of Saccharomyces cerevisiae for the production of vanillin and acetate is used to test this approach. The results show the reliability of this approach for finding essential genes, reducing genome size and identifying production pathways that can further optimise the production yield. The identified genes and pathways can be extended to other applications, especially in strain optimisation. PMID:26489144
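
    The single-gene-deletion step can be sketched with COBRApy, a standard FBA toolkit. The snippet below uses COBRApy's bundled "textbook" E. coli core model as a stand-in for the S. cerevisiae genome-scale model; the loading call and the 1% growth threshold for essentiality are assumptions that may vary by COBRApy version.

```python
# Sketch of essential-gene screening via FBA single-gene deletions.
# The bundled "textbook" model is a stand-in; loading conventions differ
# slightly across COBRApy versions.
from cobra.io import load_model
from cobra.flux_analysis import single_gene_deletion

model = load_model("textbook")
wild_type_growth = model.optimize().objective_value

deletions = single_gene_deletion(model)     # one FBA solve per knockout
# Call a gene essential if its knockout drops growth below 1% of wild type
essential = deletions[deletions["growth"].fillna(0.0)
                      < 0.01 * wild_type_growth]

print(f"{len(essential)} of {len(model.genes)} genes essential under FBA")
```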

  16. New families of human regulatory RNA structures identified by comparative analysis of vertebrate genomes

    DEFF Research Database (Denmark)

    Parker, Brian John; Moltke, Ida; Roth, Adam;

    2011-01-01

    …comparative method, EvoFam, for genome-wide identification of families of regulatory RNA structures, based on primary sequence and secondary structure similarity. We apply EvoFam to a 41-way genomic vertebrate alignment. Genome-wide, we identify 220 human, high-confidence families outside protein-coding regions, comprising 725 individual structures, including 48 families with known structural RNA elements. Known families identified include both noncoding RNAs, e.g., miRNAs and the recently identified MALAT1/MEN β lincRNA family; and cis-regulatory structures, e.g., iron-responsive elements. We also identify tens of new families supported by strong evolutionary evidence and other statistical evidence, such as GO term enrichments. For some of these, detailed analysis has led to the formulation of specific functional hypotheses. Examples include two hypothesized auto-regulatory feedback mechanisms…

  17. Validation of Design and Analysis Techniques of Tailored Composite Structures

    Science.gov (United States)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed-form analysis based on a theoretical model of a single-cell tailored box beam, and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed-form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed-form analysis. Differences between the closed-form code representation and the wing box specimen caused large errors in the twist prediction. The closed-form analysis prediction of twist has not been validated by this test.

  18. Envelopment technique and topographic overlays in bite mark analysis

    Science.gov (United States)

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    Aims and Objectives: The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Materials and Methods: Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. Results: Scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays, and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Conclusion: Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and that inter- and intraoperator reliability were statistically significant at the 5% level, i.e., the 95% confidence level (P < 0.05). PMID:26816458

  19. Friction force microscopy: a simple technique for identifying graphene on rough substrates and mapping the orientation of graphene grains on copper

    OpenAIRE

    Marsden, Alexander J.; Phillips, Mick; Wilson, Neil R.

    2013-01-01

    At a single atom thick, it is challenging to distinguish graphene from its substrate using conventional techniques. In this paper we show that friction force microscopy (FFM) is a simple and quick technique for identifying graphene on a range of samples, from growth substrates to rough insulators. We show that FFM is particularly effective for characterising graphene grown on copper where it can correlate the graphene growth to the three-dimensional surface topography and map the crystallogra...

  20. Gene expression meta-analysis identifies metastatic pathways and transcription factors in breast cancer

    International Nuclear Information System (INIS)

    Metastasis is believed to progress in several steps, including different pathways, but the determination and understanding of these mechanisms is still fragmentary. Microarray analysis of gene expression patterns in breast tumors has been used to predict outcome in recent studies. Besides classification of outcome, these global expression patterns may reflect biological mechanisms involved in metastasis of breast cancer. Our purpose has been to investigate pathways and transcription factors involved in metastasis by use of gene expression data sets. We have analyzed 8 publicly available gene expression data sets. A global approach, 'gene set enrichment analysis', as well as an approach focusing on a subset of significantly differentially regulated genes, GenMAPP, has been applied to rank pathway gene sets according to differential regulation in metastasizing tumors compared to non-metastasizing tumors. Meta-analysis has been used to determine the overrepresentation of pathways and transcription factor targets concordantly deregulated in metastasizing breast tumors across several data sets. The major findings are up-regulation of cell cycle pathways and a metabolic shift towards glucose metabolism, reflected in several pathways, in metastasizing tumors. Growth factor pathways seem to play dual roles: EGF and PDGF pathways are decreased, while VEGF and sex-hormone pathways are increased, in tumors that metastasize. Furthermore, migration, proteasome, immune system, angiogenesis, DNA repair and several signal transduction pathways are associated with metastasis. Finally, several transcription factors, e.g. E2F, NFY, and YY1, are identified as being involved in metastasis. By pathway meta-analysis many biological mechanisms beyond major characteristics such as proliferation are identified. Transcription factor analysis identifies a number of key factors that support central pathways. Several previously proposed treatment targets are identified and several new pathways that may
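
    The overrepresentation step in this kind of pathway meta-analysis typically reduces to a hypergeometric test of a pathway's gene set against the deregulated gene list; a minimal sketch, with invented counts, follows.

```python
# Minimal sketch of a pathway overrepresentation test of the kind used to
# rank gene sets against a list of deregulated genes. All counts are invented.
from scipy.stats import hypergeom

M = 20000   # genes on the array (background)
n = 150     # genes annotated to the pathway
N = 400     # genes deregulated in metastasizing vs non-metastasizing tumors
k = 12      # deregulated genes that fall in the pathway

# P(X >= k): probability of seeing at least k pathway genes by chance
p_value = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p = {p_value:.3g}")
```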

  1. Neutron activation analysis techniques for identifying elemental status in Alzheimer's disease

    International Nuclear Information System (INIS)

    Brain tissue (hippocampus and cerebral cortex) from Alzheimer's disease and control individuals sampled from Eastern Canada and the United Kingdom was analyzed for Ag, Al, As, B, Br, Ca, Cd, Co, Cr, Cs, Cu, Fe, Hg, I, K, La, Mg, Mn, Mo, Ni, Rb, S, Sb, Sc, Se, Si, Sn, Sr, Ti, V and Zn. NAA (thermal and prompt gamma-ray) methods were used. Highly significant differences (probability less than 0.005) between Alzheimer's disease and control individuals were shown for both study areas. No statistical evidence of aluminium accumulation with age was noted. Possible zinc deficiency was observed. (author) 21 refs.; 5 tables

  2. A cross-species genetic analysis identifies candidate genes for mouse anxiety and human bipolar disorder

    Directory of Open Access Journals (Sweden)

    David G Ashbrook

    2015-07-01

    Full Text Available Bipolar disorder (BD) is a significant neuropsychiatric disorder with a lifetime prevalence of ~1%. To identify genetic variants underlying BD, genome-wide association studies (GWAS) have been carried out. While many variants of small effect associated with BD have been identified, few have yet been confirmed, partly because of the low power of GWAS due to multiple comparisons being made. Complementary mapping studies using murine models have identified genetic variants for behavioral traits linked to BD, often with high power, but these identified regions often contain too many genes for clear identification of candidate genes. In the current study we have aligned human BD GWAS results and mouse linkage studies to help define and evaluate candidate genes linked to BD, seeking to use the power of the mouse mapping with the precision of GWAS. We use quantitative trait mapping for open field test and elevated zero maze data in the largest mammalian model system, the BXD recombinant inbred mouse population, to identify genomic regions associated with these BD-like phenotypes. We then investigate these regions in whole genome data from the Psychiatric Genomics Consortium's bipolar disorder GWAS to identify candidate genes associated with BD. Finally, we establish the biological relevance and pathways of these genes in a comprehensive systems genetics analysis. We identify four genes associated with both mouse anxiety and human BD. While TNR is a novel candidate for BD, we can confirm previously suggested associations with CMYA5, MCTP1 and RXRG. A cross-species, systems genetics analysis shows that MCTP1, RXRG and TNR coexpress with genes linked to psychiatric disorders and identifies the striatum as a potential site of action. CMYA5, MCTP1, RXRG and TNR are associated with mouse anxiety and human BD. We hypothesize that MCTP1, RXRG and TNR influence intercellular signaling in the striatum.

  3. A cross-species genetic analysis identifies candidate genes for mouse anxiety and human bipolar disorder.

    Science.gov (United States)

    Ashbrook, David G; Williams, Robert W; Lu, Lu; Hager, Reinmar

    2015-01-01

    Bipolar disorder (BD) is a significant neuropsychiatric disorder with a lifetime prevalence of ~1%. To identify genetic variants underlying BD, genome-wide association studies (GWAS) have been carried out. While many variants of small effect associated with BD have been identified, few have yet been confirmed, partly because of the low power of GWAS due to multiple comparisons being made. Complementary mapping studies using murine models have identified genetic variants for behavioral traits linked to BD, often with high power, but these identified regions often contain too many genes for clear identification of candidate genes. In the current study we have aligned human BD GWAS results and mouse linkage studies to help define and evaluate candidate genes linked to BD, seeking to use the power of the mouse mapping with the precision of GWAS. We use quantitative trait mapping for open field test and elevated zero maze data in the largest mammalian model system, the BXD recombinant inbred mouse population, to identify genomic regions associated with these BD-like phenotypes. We then investigate these regions in whole genome data from the Psychiatric Genomics Consortium's bipolar disorder GWAS to identify candidate genes associated with BD. Finally, we establish the biological relevance and pathways of these genes in a comprehensive systems genetics analysis. We identify four genes associated with both mouse anxiety and human BD. While TNR is a novel candidate for BD, we can confirm previously suggested associations with CMYA5, MCTP1, and RXRG. A cross-species, systems genetics analysis shows that MCTP1, RXRG, and TNR coexpress with genes linked to psychiatric disorders and identifies the striatum as a potential site of action. CMYA5, MCTP1, RXRG, and TNR are associated with mouse anxiety and human BD. We hypothesize that MCTP1, RXRG, and TNR influence intercellular signaling in the striatum. PMID:26190982
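
    The core cross-species step, once mouse QTL intervals are mapped to human coordinates, amounts to intersecting candidate gene lists. A deliberately simplified sketch follows; real pipelines intersect syntenic genomic intervals rather than matching symbols, and all gene symbols except the four reported hits are placeholders.

```python
# Toy sketch of the cross-species step: intersect genes under mouse QTL
# intervals with genes at GWAS-associated loci. Only the four reported hits
# (TNR, MCTP1, RXRG, CMYA5) come from the study; the rest are placeholders.
qtl_genes = {"Tnr", "Mctp1", "Rxrg", "Cmya5", "Gria1", "Htr1a"}
gwas_genes = {"TNR", "MCTP1", "RXRG", "CMYA5", "CACNA1C", "ANK3"}

# Case-normalize mouse symbols for comparison with human symbols
candidates = {g.upper() for g in qtl_genes} & gwas_genes
print(sorted(candidates))   # ['CMYA5', 'MCTP1', 'RXRG', 'TNR']
```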

  4. Analysis of promoter regions of co-expressed genes identified by microarray analysis

    Directory of Open Access Journals (Sweden)

    Höglund Mattias

    2006-08-01

    Full Text Available Abstract Background The use of global gene expression profiling to identify sets of genes with similar expression patterns is rapidly becoming a widespread approach for understanding biological processes. A logical and systematic approach to studying co-expressed genes is to analyze their promoter sequences to identify transcription factors that may be involved in establishing specific profiles and that may be experimentally investigated. Results We introduce promoter clustering, i.e. grouping of promoters with respect to their high-scoring motif content, and show that this approach greatly enhances the identification of common and significant transcription factor binding sites (TFBS) in co-expressed genes. We apply this method to two different datasets, one consisting of microarray data from 108 leukemias (AMLs) and a second from a time series experiment, and show that biologically relevant promoter patterns may be obtained using phylogenetic footprinting methodology. In addition, we found that 15% of the analyzed promoter regions contained transcription start sites for additional genes transcribed in the opposite direction. Conclusion Promoter clustering based on global promoter features greatly improves the identification of shared TFBS in co-expressed genes. We believe that the outlined approach may be a useful first step to identify transcription factors that contribute to specific features of gene expression profiles.
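
    A minimal sketch of promoter clustering by motif content: each promoter becomes a vector of high-scoring motif hits, and promoters are grouped hierarchically. The score matrix below is random placeholder data standing in for actual PWM scans; the threshold and cut height are illustrative.

```python
# Sketch of promoter clustering by motif content: promoters are represented
# as vectors of best PWM match scores and grouped hierarchically.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n_promoters, n_motifs = 100, 20
scores = rng.random((n_promoters, n_motifs))  # promoter x motif score matrix

# Binarize: a motif is "present" if its score clears a high threshold
present = scores > 0.8

# Cluster promoters that share high-scoring motif content
Z = linkage(present, method="average", metric="jaccard")
clusters = fcluster(Z, t=0.7, criterion="distance")
print(f"{clusters.max()} promoter clusters")
```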

  5. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Directory of Open Access Journals (Sweden)

    Amin Torabipour

    2014-11-01

    Full Text Available This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study using panel data over the 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technologic, scale and managerial efficiency change was 0.989, 1.008, 1.028, and 0.996 respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. Hospital productivity rates generally showed an increasing trend; however, the average total productivity decreased. Among the components of total productivity, variation in technological efficiency had the greatest impact on the reduction of average total productivity.
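
    The building block behind such Malmquist productivity indices is a DEA efficiency score per hospital and period. The sketch below solves the input-oriented CCR envelopment linear program with scipy; the hospital data are invented, and a Malmquist index would combine four such scores computed across two periods.

```python
# Sketch of input-oriented CCR DEA: for each hospital (DMU) solve
# min theta s.t. a nonnegative combination of peers uses no more input
# and produces no less output. All data are invented placeholders.
import numpy as np
from scipy.optimize import linprog

X = np.array([[5.0, 8.0, 7.0, 4.0],          # input 1, e.g. beds
              [30.0, 55.0, 50.0, 20.0]])     # input 2, e.g. staff
Y = np.array([[140.0, 220.0, 150.0, 90.0]])  # output, e.g. admissions

n_dmu = X.shape[1]
for j in range(n_dmu):
    # decision variables: [theta, lambda_1 .. lambda_n]
    c = np.r_[1.0, np.zeros(n_dmu)]
    A_in = np.hstack([-X[:, [j]], X])              # X@lam - theta*x_j <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])  # -Y@lam <= -y_j
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, j]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (1 + n_dmu))
    print(f"hospital {j}: efficiency = {res.x[0]:.3f}")
```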

  6. Homogenization techniques for the analysis of porous SMA

    Science.gov (United States)

    Sepe, V.; Auricchio, F.; Marfia, S.; Sacco, E.

    2016-05-01

    In this paper the mechanical response of porous Shape Memory Alloy (SMA) is modeled. The porous SMA is considered as a composite medium made of a dense SMA matrix with voids treated as inclusions. The overall response of this very special composite is deduced by performing a micromechanical and homogenization analysis. In particular, the incremental Mori-Tanaka averaging scheme is provided; then, the Transformation Field Analysis procedure, in its uniform and nonuniform approaches (UTFA and NUTFA, respectively), is presented. In particular, the extension of the NUTFA technique proposed by Sepe et al. (Int J Solids Struct 50:725-742, 2013) is presented to investigate the response of porous SMA characterized by closed and open porosity. A detailed comparison between the outcomes provided by the Mori-Tanaka, the UTFA and the proposed NUTFA procedures for porous SMA is presented, through numerical examples for two- and three-dimensional problems. In particular, several values of porosity and different loading conditions, inducing the pseudoelastic effect in the SMA matrix, are investigated. The predictions assessed by the Mori-Tanaka, the UTFA and the NUTFA techniques are compared with the results obtained by nonlinear finite element analyses. A comparison with experimental data available in the literature is also presented.

  7. Use of statistical techniques in analysis of biological data

    Directory of Open Access Journals (Sweden)

    Farzana Perveen

    2012-07-01

    Full Text Available From ancient times to the modern day, there is hardly an area in which statistics does not play a vital role. Statistics is now recognized and universally accepted as an essential component of research in every branch of science. From agriculture, biology, education, economics, business, management, medicine, engineering and psychology to the environment and space, statistics plays a significant role. Statistics is used extensively in the biological sciences; specifically, biostatistics is the branch of applied statistics that concerns the application of statistical methods to medical, genetic and biological problems. In this process, one important step is the appropriate and careful analysis of statistical data to obtain precise results. It is pertinent to mention that the majority of statistical tests and techniques are applied under certain mathematical assumptions, so it is necessary to appreciate the importance of the relevant assumptions. Among these, the assumptions of normality (normal distribution of the population(s)) and variance homogeneity are the most important. If these assumptions are not satisfied, the results may be potentially misleading. It is therefore suggested to check the relevant assumption(s) about the data before applying statistical test(s), in order to obtain valid results. In this study, a few techniques/tests are described for checking the normality of a given set of data. Since analysis of variance (ANOVA) models are extensively used in biological research, the assumptions underlying ANOVA are also discussed. Non-parametric statistics are also described to some extent.
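
    As a concrete illustration of the workflow advocated above, a minimal sketch: test normality and variance homogeneity first, then choose between ANOVA and a distribution-free alternative. The three simulated treatment groups are placeholders.

```python
# Minimal sketch of checking ANOVA assumptions before testing: Shapiro-Wilk
# for normality, Levene for variance homogeneity, and a nonparametric
# fallback. The three groups are simulated placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = [rng.normal(10, 2, 30), rng.normal(12, 2, 30), rng.normal(11, 2, 30)]

normal = all(stats.shapiro(g).pvalue > 0.05 for g in groups)
equal_var = stats.levene(*groups).pvalue > 0.05

if normal and equal_var:
    stat, p = stats.f_oneway(*groups)      # one-way ANOVA
    print(f"ANOVA: F={stat:.2f}, p={p:.4f}")
else:
    stat, p = stats.kruskal(*groups)       # distribution-free alternative
    print(f"Kruskal-Wallis: H={stat:.2f}, p={p:.4f}")
```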

  8. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the discrete cosine transform (DCT) is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest-neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory, so the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that the proposed approach of applying PCA and LDA to the normalized dataset improves the performance significantly for face images with large illumination variations.
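
    A hedged sketch of the first technique: zero out the low-frequency DCT coefficients of the log image, which carry most of the illumination variation. The 64x64 size and the cutoff are illustrative choices, not the paper's exact settings.

```python
# Sketch of DCT-based illumination normalization: discard the
# illumination-dominated low-frequency band of the log image.
# The cutoff (u + v < 8) and image size are illustrative assumptions.
import numpy as np
from scipy.fft import dctn, idctn

def normalize_illumination(img, cutoff=8):
    log_img = np.log1p(img.astype(float))
    coeffs = dctn(log_img, norm="ortho")
    u, v = np.indices(coeffs.shape)
    mask = (u + v) < cutoff
    mask[0, 0] = False        # keep the DC term so mean brightness survives
    coeffs[mask] = 0.0        # zero the illumination-dominated band
    return idctn(coeffs, norm="ortho")

face = np.random.rand(64, 64)   # placeholder for an aligned face image
print(normalize_illumination(face).shape)
```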

  9. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.
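
    For context, PVUSA-style performance reporting regresses measured power on meteorological drivers; a commonly cited form is P = I(a + bI + cW + dT), with plane-of-array irradiance I, wind speed W, and ambient temperature T. Treat that functional form, and all numbers below, as assumptions for illustration.

```python
# Hedged sketch of a PVUSA-style rating regression, assuming the model
# P = I*(a + b*I + c*W + d*T). The data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
I = rng.uniform(200, 1000, 500)          # irradiance, W/m^2
W = rng.uniform(0, 10, 500)              # wind speed, m/s
T = rng.uniform(5, 35, 500)              # ambient temperature, deg C
P = I * (0.07 - 1e-5 * I + 1e-4 * W - 2e-4 * T) + rng.normal(0, 0.5, 500)

# The model is linear in (a, b, c, d) after factoring out I
A = np.column_stack([I, I * I, I * W, I * T])
coef, *_ = np.linalg.lstsq(A, P, rcond=None)
a, b, c, d = coef

# Evaluate at PVUSA Test Conditions: 1000 W/m^2, 1 m/s wind, 20 deg C
ptc = 1000 * (a + b * 1000 + c * 1.0 + d * 20.0)
print(f"PTC rating ~ {ptc:.1f} (same power units as the synthetic P)")
```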

  10. Temporal analysis of elite men’s discus throwing technique.

    Directory of Open Access Journals (Sweden)

    Vassilios Panoutsakopoulos

    2013-02-01

    Full Text Available The purpose of this study was to investigate the relationship between the duration of the throw and the official throwing distance in a group of elite male discus throwers. The time analysis of the technique phases (i.e. preparation, entry, flight, transition, delivery, release) of the participants in a top international athletics competition was used to conduct the study. Data were retrieved after recording seven right-handed throwers (age: 28.8 ± 4.1 years, body height: 1.94 ± 0.09 m, body mass: 119.4 ± 11.6 kg) with a Casio EX-FX1 (Casio Computer Co. Ltd) digital video camera (sampling frequency: 300 fps) and analyzing the captured throws with the V1 Home 2.02.54 software (Interactive Frontiers Inc.). The relationships among the durations of the technique phases of the throw and the official throwing distance were examined with Pearson correlation analysis using the SPSS 10.0.1 software (SPSS Inc.). Results revealed that no significant correlation (p > 0.05) existed between the average official throwing distance (63.04 ± 6.09 m) and the duration of the discus throw or the duration of each technique phase. The temporal and correlation analyses were in agreement with previous studies. The dominant style of release was the release with no support on the ground. The majority of the throwers spent a larger percentage of the delivery turn (transition, delivery and release phases) in single than in double support. It was noted that a short duration of the transition phase, combined with lower values of the ratio of the time spent on the starting turn to the time spent on the delivery turn, might be favorable for achieving a larger throwing distance.

  11. Independent component analysis of high-resolution imaging data identifies distinct functional domains

    DEFF Research Database (Denmark)

    Reidl, Juergen; Starke, Jens; Omer, David; Grinvald, Amiram; Spors, Hartwig

    2007-01-01

    In the vertebrate brain external stimuli are often represented in distinct functional domains distributed across the cortical surface. Fast imaging techniques used to measure patterns of population activity record movies with many pixels and many frames, i.e. data sets with high dimensionality… Here we demonstrate that principal component analysis (PCA) followed by spatial independent component analysis (sICA) can be exploited to reduce the dimensionality of data sets recorded in the olfactory bulb and the somatosensory cortex of mice, as well as the visual cortex of monkeys, without losing… automatically detected. In the visual cortex orientation columns can be extracted. In all cases artifacts due to movement, heartbeat or respiration were separated from the functional signal by sICA and could be removed from the data set. sICA is therefore a powerful technique for data compression, unbiased…
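
    A compact sketch of the PCA/sICA pipeline on a synthetic imaging movie: treating pixels as samples makes the recovered components independent spatial maps, and FastICA's internal whitening plays the role of the PCA compression step described above.

```python
# Sketch of PCA -> spatial ICA for imaging movies. Pixels are treated as
# samples, so the recovered components are independent *spatial* maps.
# The movie here is random placeholder data.
import numpy as np
from sklearn.decomposition import FastICA

n_frames, h, w = 200, 32, 32
movie = np.random.rand(n_frames, h, w)
X = movie.reshape(n_frames, h * w)           # frames x pixels

ica = FastICA(n_components=10, random_state=0, max_iter=500)
spatial_maps = ica.fit_transform(X.T)        # (pixels, components)
time_courses = ica.mixing_                   # (frames, components)

maps = spatial_maps.T.reshape(-1, h, w)      # each component as an image
print(maps.shape, time_courses.shape)
```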

  12. Practice patterns in FNA technique: A survey analysis

    Institute of Scientific and Technical Information of China (English)

    Christopher J DiMaio; Jonathan M Buscaglia; Seth A Gross; Harry R Aslanian; Adam J Goodman; Sammy Ho; Michelle K Kim; Shireen Pais; Felice Schnoll-Sussman; Amrita Sethi; Uzma D Siddiqui; David H Robbins; Douglas G Adler; Satish Nagula

    2014-01-01

    AIM: To ascertain fine needle aspiration (FNA) techniques used by endosonographers with varying levels of experience and in varying environments. METHODS: A survey study was performed on United States-based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers, and how these relate to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents identified themselves as high-volume endoscopic ultrasound (EUS) performers (> 150 EUS/year; 77.1%) and high-volume FNA performers (> 75 FNA/year; 73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low-volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low-volume physicians were more likely to refer patients for either surgical or percutaneous biopsy (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment.

  13. Comparison of gas chromatographic hyphenated techniques for mercury speciation analysis.

    Science.gov (United States)

    Nevado, J J Berzas; Martín-Doimeadios, R C Rodríguez; Krupp, E M; Bernardo, F J Guzmán; Fariñas, N Rodríguez; Moreno, M Jiménez; Wallace, D; Ropero, M J Patiño

    2011-07-15

    In this study, we evaluate the advantages and disadvantages of three hyphenated techniques for mercury speciation analysis in different sample matrices using gas chromatography (GC) with mass spectrometry (GC-MS), inductively coupled plasma mass spectrometry (GC-ICP-MS) and pyrolysis atomic fluorescence (GC-pyro-AFS) detection. Aqueous ethylation with NaBEt(4) was required in all cases. All systems were validated with respect to precision, with repeatability and reproducibility [...] TMAH). No statistically significant differences were found from the certified values (p = 0.05). The suitability for the analysis of water samples with different organic matter and chloride contents was evaluated by recovery experiments in synthetic spiked waters. Absolute detection and quantification limits were in the range of 2-6 pg for GC-pyro-AFS and 1-4 pg for GC-MS, with 0.05-0.21 pg for GC-ICP-MS showing the best limits of detection of the three systems employed. However, all systems are sufficiently sensitive for mercury speciation in environmental samples, with GC-MS and GC-ICP-MS offering isotope analysis capabilities for the use of species-specific isotope dilution analysis, and GC-pyro-AFS being the most cost-effective alternative. PMID:21641604

  14. Evaluation of an intrinsic error estimator for the data fusion of NDT techniques used to identify the material and damage properties of concrete structures

    OpenAIRE

    D. Martini; Garnier, V.; PLOIX, M.A.

    2012-01-01

    In this paper, we propose an intrinsic error estimator for the data fusion of NDT techniques used to identify the material and damage properties of concrete structures. This error estimator is chosen based on the global distribution of the data fusion in the space of the identified material properties. The main idea is to evaluate the accuracy of the result by estimating the gap between the most and least likely solutions. This error estimator is applied to synthetic data depending on the...

  15. Sensitivity of a Multi-view Bremsstrahlung Scanning Technique for Radiography of Intermodal Cargo Containers to Identify Fissionable Materials

    OpenAIRE

    Hansen, Araina

    2014-01-01

    Rapid scanning of intermodal cargo containers for clandestine fissionable material is a security concern of the United States government. This issue has been investigated over the past ten years with limited success. This dissertation builds on previous simulation work to investigate the limits of a multi-view scanning technique to determine the linear attenuation coefficients of small high-Z objects hidden within cargo containers. Monte Carlo modeling was used to estimate the uncertainties i...

  16. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program for nanoparticle analysis and, at the same time, compares it to more conventional nanoparticle analysis techniques. The techniques we concentrate on here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles, than the manual technique. However, particle shapes that are very different from spherical proved problematic for the novel program as well. Compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample. PMID:27030469
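
    The agglomerate-splitting problem mentioned above is classically attacked with marker-based watershed segmentation; a sketch on a synthetic image of two touching disks follows. Library calls are from scipy/scikit-image, and the parameters are illustrative, not Nanoannotator's actual method.

```python
# Sketch of marker-based watershed splitting of touching particles, the
# step where plain thresholding fails on agglomerates in TEM images.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Synthetic binary image: two overlapping disks standing in for particles
yy, xx = np.mgrid[0:80, 0:80]
binary = (((yy - 40) ** 2 + (xx - 30) ** 2 < 15 ** 2)
          | ((yy - 40) ** 2 + (xx - 52) ** 2 < 15 ** 2))

distance = ndi.distance_transform_edt(binary)
coords = peak_local_max(distance, min_distance=10, labels=binary.astype(int))
markers = np.zeros_like(distance, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

labels = watershed(-distance, markers, mask=binary)
print(f"{labels.max()} particles separated")   # expect 2
```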

  17. Probabilistic approach to identify sensitive parameter distributions in multimedia pathway analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kamboj, S.; Gnanapragasam, E.; LePoire, D.; Biwer, B. M.; Cheng, J.; Arnish, J.; Yu, C.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Environmental Assessment; NRC

    2002-01-01

    Sensitive parameter distributions were identified with the use of probabilistic analysis in the RESRAD computer code. RESRAD is a multimedia pathway analysis code designed to evaluate radiological exposures resulting from radiological contamination in soil. The dose distribution was obtained by using a set of default parameter distribution/values. Most of the variations in the output dose distribution could be attributed to uncertainty in a small set of input parameters that could be considered as sensitive parameter distributions. The identification of the sensitive parameters is a first step in the prioritization of future research and information gathering. When site-specific parameter distribution/values are available for an actual site, the same process should be used with these site-specific data. Regression analysis used to identify sensitive parameters indicated that the dominant pathways depended on the radionuclide and source configurations. However, two parameter distributions were sensitive for many radionuclides: the external shielding factor when external exposure was the dominant pathway and the plant transfer factor when plant ingestion was the dominant pathway. No single correlation or regression coefficient can be used alone to identify sensitive parameters in all the cases. The coefficients are useful guides, but they have to be used in conjunction with other aids, such as scatter plots, and should undergo further analysis.
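
    The regression-based screening described here can be sketched as ranking inputs by standardized regression coefficients (SRCs) over a Monte Carlo sample. The three-input "dose" model below is an invented stand-in for RESRAD; only the parameter names echo the abstract.

```python
# Sketch of SRC-based sensitivity screening: sample inputs, run the model,
# regress the standardized output on standardized inputs, rank by |SRC|.
# The linear "dose" model is an invented stand-in for RESRAD.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
shielding = rng.uniform(0.1, 1.0, n)     # external shielding factor
transfer = rng.lognormal(0.0, 0.5, n)    # plant transfer factor
density = rng.normal(1.5, 0.1, n)        # soil density (weak influence)
dose = (4.0 * shielding + 2.0 * transfer + 0.1 * density
        + rng.normal(0, 0.2, n))

X = np.column_stack([shielding, transfer, density])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (dose - dose.mean()) / dose.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, b in zip(["shielding", "transfer", "density"], src):
    print(f"{name}: SRC = {b:+.2f}")
```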

  18. Use of nuclear techniques for coal analysis in exploration, mining and processing

    International Nuclear Information System (INIS)

    Nuclear techniques have a long history of application in the coal industry, during exploration and especially during coal preparation, for the measurement of ash content. The preferred techniques are based on X- and gamma-ray scattering and borehole logging, and on-line equipment incorporating these techniques are now in world-wide routine use. However, gamma-ray techniques are mainly restricted to density measurement and X-ray techniques are principally used for ash determinations. They have a limited range and when used on-line some size reduction of the coal is usually required and a full elemental analysis is not possible. In particular, X- and gamma-ray techniques are insensitive to the principal elements in the combustible component and to many of the important elements in the mineral fraction. Neutron techniques on the other hand have a range which is compatible with on-line requirements and all elements in the combustible component and virtually all elements in the mineral component can be observed. A complete elemental analysis of coal then allows the ash content and the calorific value to be determined on-line. This paper surveys the various nuclear techniques now in use and gives particular attention to the present state of development of neutron methods and to their advantages and limitations. Although it is shown that considerable further development and operational experience are still required, equipment now being introduced has a performance which matches many of the identified requirements and an early improvement in specification can be anticipated
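
    As one illustration of the final point, a full elemental analysis can be converted to an approximate calorific value with Dulong's classical formula; the coefficients vary slightly between sources, and the sample composition below is invented.

```python
# Illustrative link from elemental analysis to calorific value via Dulong's
# approximation, HHV [MJ/kg] ~ 0.338*C + 1.428*(H - O/8) + 0.095*S with
# C, H, O, S in weight percent. Coefficients differ slightly by source;
# treat them, and the sample composition, as indicative only.
def dulong_hhv(c, h, o, s):
    return 0.338 * c + 1.428 * (h - o / 8.0) + 0.095 * s

print(f"HHV ~ {dulong_hhv(c=75.0, h=5.0, o=8.0, s=1.0):.1f} MJ/kg")
```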

  19. Identifying the Drivers and Occurrence of Historical and Future Extreme Air-quality Events in the United States Using Advanced Statistical Techniques

    Science.gov (United States)

    Porter, W. C.; Heald, C. L.; Cooley, D. S.; Russell, B. T.

    2013-12-01

    Episodes of air-quality extremes are known to be heavily influenced by meteorological conditions, but traditional statistical analysis techniques focused on means and standard deviations may not capture important relationships at the tails of these two respective distributions. Using quantile regression (QR) and extreme value theory (EVT), methodologies specifically developed to examine the behavior of heavy-tailed phenomena, we analyze extremes in the multi-decadal record of ozone (O3) and fine particulate matter (PM2.5) in the United States. We investigate observations from the Air Quality System (AQS) and Interagency Monitoring of Protected Visual Environments (IMPROVE) networks for connections to meteorological drivers, as provided by the National Center for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) product. Through regional characterization by quantile behavior and EVT modeling of the meteorological covariates most responsible for extreme levels of O3 and PM2.5, we estimate pollutant exceedance frequencies and uncertainties in the United States under current and projected future climates, highlighting those meteorological covariates and interactions whose influence on air-quality extremes differs most significantly from the behavior of the bulk of the distribution. As current policy may be influenced by air-quality projections, we then compare these estimated frequencies to those produced by NCAR's Community Earth System Model (CESM) identifying regions, covariates, and species whose extreme behavior may not be adequately captured by current models.
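
    Both named tools are available off the shelf; a sketch on synthetic ozone-temperature data follows, with quantile regression from statsmodels and a GEV block-maxima fit from scipy. Variable names, block sizes, and thresholds are illustrative.

```python
# Sketch of the two tail-focused tools named above, on synthetic data:
# quantile regression for the upper tail of ozone vs temperature, and a
# GEV fit to block (e.g., seasonal) maxima.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import genextreme

rng = np.random.default_rng(4)
temp = rng.uniform(10, 40, 1000)
ozone = 20 + 1.2 * temp + rng.gamma(2.0, 2.0 + 0.2 * temp)  # heavy upper tail
df = pd.DataFrame({"ozone": ozone, "temp": temp})

# The 95th-percentile response can differ from the mean response
q95 = smf.quantreg("ozone ~ temp", df).fit(q=0.95)
print(f"slope at 95th percentile: {q95.params['temp']:.2f}")

# EVT: fit a GEV distribution to block maxima and estimate a return level
maxima = ozone.reshape(50, 20).max(axis=1)
shape, loc, scale = genextreme.fit(maxima)
ret_level = genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"100-block return level: {ret_level:.1f}")
```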

  20. Analysis of Protein in Soybean by Neutron Activation Technique

    International Nuclear Information System (INIS)

    Nitrogen content in soybean was studied using the neutron activation analysis technique with fast neutrons at a flux of 2.5 × 10^11 n/cm^2·s in the CA-3 out-core irradiation tube of the Thai Research Reactor-1/Modification 1 (TRR-1/M1, Triga Mark 3 type). The 511 keV gamma rays from the annihilation of positrons emitted in the decay of 13N, produced by the 14N(n,2n)13N reaction, were measured with a semiconductor detector (HPGe) connected to a multi-channel analyzer (MCA) and a monitor to display the spectrum. NH4NO3 was used as the standard for the analysis. Inaccuracy caused by other radioisotopes, i.e. potassium and phosphorus, and by reactions from recoil-proton scattering in the soybean, was corrected for. The data for the 27 samples analyzed by neutron activation showed no significant difference in nitrogen content. The average nitrogen content of all the soybean samples was 7.02%, equivalent to a protein content of 43.88%.
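
    The record's numbers imply the standard Kjeldahl nitrogen-to-protein conversion factor of 6.25 (7.02% N × 6.25 ≈ 43.88% protein); a tiny check follows, with invented count rates for the relative-method step.

```python
# Nitrogen-to-protein conversion implied by the record's numbers
# (43.88 / 7.02 ~ 6.25, the standard Kjeldahl factor).
nitrogen_pct = 7.02
protein_pct = nitrogen_pct * 6.25
print(f"protein = {protein_pct:.2f}%")       # 43.88%, matching the record

# Relative method against the NH4NO3 standard: C_sample ~ C_std * A_s/A_std,
# with 511 keV peak areas corrected for decay and mass (counts invented)
peak_sample, peak_std, n_in_std = 1520.0, 2710.0, 12.5   # counts, counts, %N
print(f"N ~ {n_in_std * peak_sample / peak_std:.2f}%")
```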

  1. Leadership Styles and Management Techniques: An Analysis of Malaysian Women Entrepreneurs

    Directory of Open Access Journals (Sweden)

    Ayanty Kuppusamy

    2010-02-01

    Full Text Available Leaders of organizations need to lead and manage effectively to succeed. Thus, women entrepreneurs, who are becoming more prominent in the business arena, have to be good leaders and managers. This study aims to identify the relationship between leadership styles, namely charismatic and transformational, and management techniques, and the organizational performance of women entrepreneurs in Malaysia. Questionnaires were sent to women entrepreneurs registered under NAWEM (National Association of Women Entrepreneurs of Malaysia). Correlation analysis and regression analysis were used to test the data. The results showed that although both charismatic and transformational leadership and management techniques are utilized by the women entrepreneurs, the only significant predictor of organizational performance is charismatic leadership.

  2. Meta-Analysis of Placental Transcriptome Data Identifies a Novel Molecular Pathway Related to Preeclampsia.

    Science.gov (United States)

    van Uitert, Miranda; Moerland, Perry D; Enquobahrie, Daniel A; Laivuori, Hannele; van der Post, Joris A M; Ris-Stalpers, Carrie; Afink, Gijs B

    2015-01-01

    Studies using the placental transcriptome to identify key molecules relevant for preeclampsia are hampered by a relatively small sample size. In addition, they use a variety of bioinformatics and statistical methods, making comparison of findings challenging. To generate a more robust preeclampsia gene expression signature, we performed a meta-analysis on the original data of 11 placenta RNA microarray experiments, representing 139 normotensive and 116 preeclamptic pregnancies. Microarray data were pre-processed and analyzed using standardized bioinformatics and statistical procedures and the effect sizes were combined using an inverse-variance random-effects model. Interactions between genes in the resulting gene expression signature were identified by pathway analysis (Ingenuity Pathway Analysis, Gene Set Enrichment Analysis, Graphite) and protein-protein associations (STRING). This approach has resulted in a comprehensive list of differentially expressed genes that led to a 388-gene meta-signature of preeclamptic placenta. Pathway analysis highlights the involvement of the previously identified hypoxia/HIF1A pathway in the establishment of the preeclamptic gene expression profile, while analysis of protein interaction networks indicates CREBBP/EP300 as a novel element central to the preeclamptic placental transcriptome. In addition, there is an apparent high incidence of preeclampsia in women carrying a child with a mutation in CREBBP/EP300 (Rubinstein-Taybi Syndrome). The 388-gene preeclampsia meta-signature offers a vital starting point for further studies into the relevance of these genes (in particular CREBBP/EP300) and their concomitant pathways as biomarkers or functional molecules in preeclampsia. This will result in a better understanding of the molecular basis of this disease and opens up the opportunity to develop rational therapies targeting the placental dysfunction causal to preeclampsia. PMID:26171964
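
    The inverse-variance random-effects combination step is typically the DerSimonian-Laird estimator; a minimal sketch for one gene's effect sizes across studies follows, with all numbers invented.

```python
# Sketch of inverse-variance random-effects pooling (DerSimonian-Laird)
# for one gene's log-fold-changes across studies. Values are invented.
import numpy as np

y = np.array([0.8, 0.5, 1.1, 0.7, 0.9])       # per-study effect (log2 FC)
v = np.array([0.04, 0.09, 0.05, 0.12, 0.06])  # per-study variance

w = 1.0 / v                                   # fixed-effect weights
fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - fe) ** 2)                 # heterogeneity statistic
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (v + tau2)                       # random-effects weights
re = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect = {re:.2f} +/- {1.96 * se:.2f} (tau^2 = {tau2:.3f})")
```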

  3. Meta-Analysis of Placental Transcriptome Data Identifies a Novel Molecular Pathway Related to Preeclampsia.

    Directory of Open Access Journals (Sweden)

    Miranda van Uitert

    Full Text Available Studies using the placental transcriptome to identify key molecules relevant for preeclampsia are hampered by a relatively small sample size. In addition, they use a variety of bioinformatics and statistical methods, making comparison of findings challenging. To generate a more robust preeclampsia gene expression signature, we performed a meta-analysis on the original data of 11 placenta RNA microarray experiments, representing 139 normotensive and 116 preeclamptic pregnancies. Microarray data were pre-processed and analyzed using standardized bioinformatics and statistical procedures and the effect sizes were combined using an inverse-variance random-effects model. Interactions between genes in the resulting gene expression signature were identified by pathway analysis (Ingenuity Pathway Analysis, Gene Set Enrichment Analysis, Graphite) and protein-protein associations (STRING). This approach has resulted in a comprehensive list of differentially expressed genes that led to a 388-gene meta-signature of preeclamptic placenta. Pathway analysis highlights the involvement of the previously identified hypoxia/HIF1A pathway in the establishment of the preeclamptic gene expression profile, while analysis of protein interaction networks indicates CREBBP/EP300 as a novel element central to the preeclamptic placental transcriptome. In addition, there is an apparent high incidence of preeclampsia in women carrying a child with a mutation in CREBBP/EP300 (Rubinstein-Taybi Syndrome). The 388-gene preeclampsia meta-signature offers a vital starting point for further studies into the relevance of these genes (in particular CREBBP/EP300) and their concomitant pathways as biomarkers or functional molecules in preeclampsia. This will result in a better understanding of the molecular basis of this disease and opens up the opportunity to develop rational therapies targeting the placental dysfunction causal to preeclampsia.

  4. Transcriptome analysis of recurrently deregulated genes across multiple cancers identifies new pan-cancer biomarkers

    DEFF Research Database (Denmark)

    Kaczkowski, Bogumil; Tanaka, Yuji; Kawaji, Hideya; Sandelin, Albin; Andersson, Robin; Itoh, Masayoshi; Lassmann, Timo; Hayashizaki, Yoshihide; Carninci, Piero; Forrest, Alistair R

    2015-01-01

    Genes that are commonly deregulated in cancer are clinically attractive as candidate pan-diagnostic markers and therapeutic targets. To globally identify such targets, we compared Cap Analysis of Gene Expression (CAGE) profiles from 225 different cancer cell lines and 339 corresponding primary cell samples to identify transcripts that are deregulated recurrently in a broad range of cancer types. Comparing RNA-seq data from 4,055 tumors and 563 normal tissues profiled in the TCGA and FANTOM5 datasets, we identified a core transcript set with theranostic potential. Our analyses also revealed enhancer RNAs which are upregulated in cancer, defining promoters which overlap with repetitive elements (especially SINE/Alu and LTR/ERV1 elements) that are often upregulated in cancer. Lastly, we documented for the first time upregulation of multiple copies of the REP522 interspersed repeat in cancer. Overall…

  5. Identifying Population Groups with Low Palliative Care Program Enrolment Using Classification and Regression Tree Analysis

    Science.gov (United States)

    Gao, Jun; Lavergne, M. Ruth; McIntyre, Paul

    2013-01-01

    Classification and regression tree (CART) analysis was used to identify subpopulations with lower palliative care program (PCP) enrolment rates. CART analysis uses recursive partitioning to group predictors. The PCP enrolment rate was 72 percent for the 6,892 adults who died of cancer from 2000 to 2005 in two counties in Nova Scotia, Canada. The lowest PCP enrolment rates were for nursing home residents over 82 years (27 percent), a group residing more than 43 kilometres from the PCP (31 percent), and another group living less than two weeks after their cancer diagnosis (37 percent). The highest rate (86 percent) was for the 2,118 persons who received palliative radiation. Findings from multiple logistic regression (MLR) were provided for comparison. CART findings identified low PCP enrolment subpopulations that were defined by interactions among demographic, social, medical, and health system predictors. PMID:21805944
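
    A sketch of the CART step with scikit-learn follows, on synthetic data engineered to mimic the reported splits (age over 82, distance over 43 km, survival under two weeks); the thresholds the tree recovers will approximate those used to generate the data, not the study's actual dataset.

```python
# Sketch of CART subgroup discovery: recursive partitioning of enrolment
# (yes/no) on mixed predictors. Synthetic data mimic the reported splits.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)
n = 2000
age = rng.uniform(30, 100, n)
distance_km = rng.uniform(0, 100, n)
survival_days = rng.exponential(120, n)
# Enrolment probabilities echoing the abstract's subgroups (27/31/37%)
p = np.where(age > 82, 0.27,
             np.where(distance_km > 43, 0.31,
                      np.where(survival_days < 14, 0.37, 0.80)))
enrolled = rng.random(n) < p

X = np.column_stack([age, distance_km, survival_days])
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=100)
tree.fit(X, enrolled)
print(export_text(tree, feature_names=["age", "distance_km", "survival_days"]))
```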

  6. Analysis of regulatory protease sequences identified through bioinformatic data mining of the Schistosoma mansoni genome

    Directory of Open Access Journals (Sweden)

    Minchella Dennis J

    2009-10-01

    Full Text Available Abstract Background New chemotherapeutic agents against Schistosoma mansoni, an etiological agent of human schistosomiasis, are a priority due to emerging drug resistance and the inability of current drug treatments to prevent reinfection. Proteases have been under scrutiny as targets of immunological or chemotherapeutic anti-Schistosoma agents because of their vital role in many stages of the parasitic life cycle. Function has been established for only a handful of identified S. mansoni proteases, and the vast majority of these are the digestive proteases; very few of the conserved classes of regulatory proteases have been identified from Schistosoma species, despite their vital role in numerous cellular processes. To that end, we identified protease protein-coding genes from the S. mansoni genome project and EST library. Results We identified 255 protease sequences from five catalytic classes using predicted proteins of the S. mansoni genome. The vast majority of these show significant similarity to proteins in KEGG and the Conserved Domain Database. Proteases include calpains, caspases, cytosolic and mitochondrial signal peptidases, proteases that interact with ubiquitin and ubiquitin-like molecules, and proteases that perform regulated intramembrane proteolysis. Comparative analysis of classes of important regulatory proteases finds conserved active-site domains and, where appropriate, signal peptides and transmembrane helices. Phylogenetic analysis provides support for inferring functional divergence among regulatory aspartic, cysteine, and serine proteases. Conclusion Numerous proteases are identified for the first time in S. mansoni. We characterized important regulatory proteases and focus analysis on these proteases to complement the growing knowledge base of digestive proteases. This work provides a foundation for expanding knowledge of proteases in Schistosoma species and examining their diverse function and potential as targets

  7. Techniques of production and analysis of polarized synchrotron radiation

    International Nuclear Information System (INIS)

    The use of the unique polarization properties of synchrotron radiation in the hard x-ray spectral region (E>3 KeV) is becoming increasingly important to many synchrotron radiation researchers. The radiation emitted from bending magnets and conventional (planar) insertion devices (IDs) is highly linearly polarized in the plane of the particle's orbit. Elliptically polarized x-rays can also be obtained by going off axis on a bending magnet source, albeit with considerable loss of flux. The polarization properties of synchrotron radiation can be further tailored to the researcher's specific needs through the use of specialized insertion devices such as helical and crossed undulators and asymmetrical wigglers. Even with the possibility of producing a specific polarization, there is still the need to develop x-ray optical components which can manipulate the polarization for both analysis and further modification of the polarization state. A survey of techniques for producing and analyzing both linear and circular polarized x-rays will be presented with emphasis on those techniques which rely on single crystal optical components

  8. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general, above that size, uncertainties in rate-constant and thermodynamic data do not allow a priori prediction of products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  9. Emergent team roles in organizational meetings: Identifying communication patterns via cluster analysis.

    OpenAIRE

    Lehmann-Willenbrock, N.K.; Beck, S.J.; Kauffeld, S.

    2016-01-01

    Previous team role taxonomies have largely relied on self-report data, focused on functional roles, and described individual predispositions or personality traits. Instead, this study takes a communicative approach and proposes that team roles are produced, shaped, and sustained in communicative behaviors. To identify team roles communicatively, 59 regular organizational meetings were videotaped and analyzed. Cluster analysis revealed five emergent roles: the solution seeker, the problem anal...

  10. Identifying Gender-Preferred Communication Styles within Online Cancer Communities: A Retrospective, Longitudinal Analysis

    OpenAIRE

    Durant, Kathleen T.; McCray, Alexa T.; Charles Safran

    2012-01-01

    BACKGROUND: The goal of this research is to determine if different gender-preferred social styles can be observed within the user interactions at an online cancer community. To achieve this goal, we identify and measure variables that pertain to each gender-specific social style. METHODS AND FINDINGS: We perform social network and statistical analysis on the communication flow of 8,388 members at six different cancer forums over eight years. Kruskal-Wallis tests were conducted to measure the ...
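
    The Kruskal-Wallis comparison mentioned above can be reproduced in a few lines; the sketch below uses hypothetical per-forum activity scores, not the study's data.

```python
# Kruskal-Wallis H-test across groups, as used to compare communication
# measures between forums. Scores below are illustrative placeholders.
from scipy.stats import kruskal

forum_scores = {
    "breast":   [12, 5, 8, 21, 3, 9],
    "prostate": [2, 4, 7, 1, 9, 5],
    "ovarian":  [6, 6, 10, 2, 4, 8],
}

h_stat, p_value = kruskal(*forum_scores.values())
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
# A small p suggests at least one forum's distribution differs; pairwise
# follow-up tests then locate where the gender-preferred styles diverge.
```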

  11. Network analysis identifies protein clusters of functional importance in juvenile idiopathic arthritis

    OpenAIRE

    Stevens, Adam; Meyer, Stefan; Hanson, Daniel; Clayton, Peter; Donn, Rachelle

    2014-01-01

    Introduction Our objective was to utilise network analysis to identify protein clusters of greatest potential functional relevance in the pathogenesis of oligoarticular and rheumatoid factor negative (RF-ve) polyarticular juvenile idiopathic arthritis (JIA). Methods JIA genetic association data were used to build an interactome network model in BioGRID 3.2.99. The top 10% of this protein:protein JIA Interactome was used to generate a minimal essential network (MEN). Reactome FI Cytoscape 2.83...
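
    A rough sketch of the "top 10% of the interactome, then cluster" pipeline follows. It assumes ranking by node degree, which the abstract does not specify, and uses a random graph in place of the BioGRID interactome.

```python
# Stand-in pipeline: rank proteins, keep the top 10% as a minimal essential
# network, then detect clusters. The graph here is random, not BioGRID data.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

g = nx.gnp_random_graph(500, 0.02, seed=1)   # stand-in protein network

# Rank proteins by degree and keep the top 10% plus the edges among them.
ranked = sorted(g.degree, key=lambda kv: kv[1], reverse=True)
keep = [node for node, _ in ranked[: len(ranked) // 10]]
men = g.subgraph(keep)

# Community detection exposes candidate clusters of functional importance.
for i, cluster in enumerate(greedy_modularity_communities(men)):
    print(f"cluster {i}: {len(cluster)} proteins")
```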

  12. Integrative Omics Analysis of Rheumatoid Arthritis Identifies Non-Obvious Therapeutic Targets

    OpenAIRE

    Whitaker, John W.; Boyle, David L.; Bartok, Beatrix; Ball, Scott T.; Gay, Steffen; Wang, Wei; Firestein, Gary S.

    2015-01-01

    Identifying novel therapeutic targets for the treatment of disease is challenging. To this end, we developed a genome-wide approach of candidate gene prioritization. We independently collocated sets of genes that were implicated in rheumatoid arthritis (RA) pathogenicity through three genome-wide assays: (i) genome-wide association studies (GWAS), (ii) differential expression in RA fibroblast-like synoviocytes (FLS), and (iii) differential methylation in RA FLS. Integrated analysis of the...

  14. Robust Microarray Meta-Analysis Identifies Differentially Expressed Genes for Clinical Prediction

    OpenAIRE

    Phan, John H.; Andrew N. Young; Wang, May D.

    2012-01-01

    Combining multiple microarray datasets increases sample size and leads to improved reproducibility in identification of informative genes and subsequent clinical prediction. Although microarrays have increased the rate of genomic data collection, sample size is still a major issue when identifying informative genetic biomarkers. Because of this, feature selection methods often suffer from false discoveries, resulting in poorly performing predictive models. We develop a simple meta-analysis-ba...
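
    The abstract is truncated before the method details, so the sketch below shows only the generic core of such a meta-analysis: combining one p-value per dataset for each gene with Fisher's method. Gene names and values are illustrative.

```python
# Per-gene meta-analysis across independent microarray datasets by
# combining p-values with Fisher's method. All numbers are placeholders.
import numpy as np
from scipy.stats import combine_pvalues

# Rows: genes; columns: independent datasets.
pvals = np.array([
    [0.001, 0.03, 0.01, 0.20],   # gene A
    [0.40,  0.55, 0.12, 0.70],   # gene B
])

for gene, row in zip(["A", "B"], pvals):
    stat, p_meta = combine_pvalues(row, method="fisher")
    print(f"gene {gene}: combined p = {p_meta:.2e}")
# Combined p-values are then corrected for multiple testing before a gene
# is declared differentially expressed and passed to a predictive model.
```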

  15. Identifying patterns in treatment response profiles in acute bipolar mania: a cluster analysis approach

    OpenAIRE

    Houston John P; Lipkovich Ilya A; Ahl Jonna

    2008-01-01

    Abstract Background Patients with acute mania respond differentially to treatment and, in many cases, fail to obtain or sustain symptom remission. The objective of this exploratory analysis was to characterize response in bipolar disorder by identifying groups of patients with similar manic symptom response profiles. Methods Patients (n = 222) were selected from a randomized, double-blind study of treatment with olanzapine or divalproex in bipolar I disorder, manic or mixed episode, with or w...

  16. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  17. Image analysis technique applied to lock-exchange gravity currents

    Science.gov (United States)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
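
    The per-pixel calibration can be sketched as follows, assuming a linear greyscale-concentration relation at each pixel; the array shapes and stand-in frames are illustrative, not the experimental images.

```python
# Per-pixel calibration: fit greyscale -> concentration at every pixel from
# calibration frames recorded with known, uniform dye amounts.
import numpy as np

# grey[k] is the image (H x W) recorded at known concentration conc[k].
conc = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])   # kg/m^3, known amounts
grey = np.random.rand(len(conc), 480, 640)         # stand-in frames

# Fit a first-order polynomial independently at every pixel.
h, w = grey.shape[1:]
coeffs = np.polyfit(conc, grey.reshape(len(conc), -1), deg=1)  # 2 x (H*W)

def to_concentration(frame):
    """Invert the per-pixel linear model for one experiment frame."""
    slope, intercept = coeffs
    return ((frame.ravel() - intercept) / slope).reshape(h, w)

density_field = to_concentration(grey[3])          # demo on a stand-in frame
```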

  18. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0nbb-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  19. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  20. Expert rowers’ motion analysis for synthesis and technique digitalization

    Directory of Open Access Journals (Sweden)

    Filippeschi Alessandro

    2011-12-01

    Full Text Available Four expert rowers’ gestures were gathered on the SPRINT rowing platform with the aid of an optic motion tracking system. Data were analyzed in order to get a digital representation of the features involved in rowing. Moreover, these data provide a dataset for developing digital models for rowing motion synthesis. Rowers were modeled as kinematic chains, data were processed in order to get position and orientation of upper body limbs. This representation was combined with SPRINT data in order to evaluate features found in the literature, to find new ones and to build models for the generation of rowing motion. The analysis shows the effectiveness of the motion reconstruction and two examples of technique features: stroke timing and upper limbs orientation during the finish phase.

  1. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and looks much the same, but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
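
    A toy version of the underlying idea, not the authors' algorithm: represent the radial velocity series in an over-complete dictionary of sinusoids and ask for a sparse solution, here with an L1 (Lasso) solver.

```python
# Sparse recovery of a periodic signal from unevenly sampled data: build an
# over-complete sine/cosine dictionary and solve a basis-pursuit problem.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 200, 80))                 # observation epochs (d)
rv = 3.0 * np.sin(2 * np.pi * t / 12.7) + 0.3 * rng.standard_normal(t.size)

periods = np.linspace(2, 100, 2000)
dictionary = np.hstack([np.sin(2 * np.pi * t[:, None] / periods),
                        np.cos(2 * np.pi * t[:, None] / periods)])

# The L1 penalty keeps only a handful of active frequencies, which is why
# the resulting "periodogram" shows far fewer aliasing peaks.
fit = Lasso(alpha=0.05, max_iter=50_000).fit(dictionary, rv)
amp = np.hypot(fit.coef_[:2000], fit.coef_[2000:])
print("recovered period ~", periods[amp.argmax()], "days")
```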

  2. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  3. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    Energy Technology Data Exchange (ETDEWEB)

    Villavicencio, A.L.C.H.; Mancini-Filho, J.; Delincee, H

    1998-06-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macacar, were irradiated using a {sup 60}Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans as compared to non-irradiated beans. Secondly, DNA fragmentation was studied using a microgel electrophoresis. Irradiated cells produced typical comets with DNA fragments migrating towards the anode. DNA of non-irradiated cells exhibited a limited migration. Thirdly, electron spin resonance for detection of cellulose radicals was tested, since it was expected that these free radicals are quite stable in solid and dry foods. However, only in beans irradiated with 10 kGy a small signal could be detected. Fourthly, thermoluminescence, a method to analyze mineral debris adhering to food, turned out to be a good choice to detect irradiation effects in beans, even after 6 months of storage. The results indicate that three of these four techniques proposed, can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy)

  4. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    Science.gov (United States)

    Villavicencio, A. L. C. H.; Mancini-Filho, J.; Delincée, H.

    1998-06-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macaçar, were irradiated using a 60Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans as compared to non-irradiated beans. Secondly, DNA fragmentation was studied using a microgel electrophoresis. Irradiated cells produced typical comets with DNA fragments migrating towards the anode. DNA of non-irradiated cells exhibited a limited migration. Thirdly, electron spin resonance for detection of cellulose radicals was tested, since it was expected that these free radicals are quite stable in solid and dry foods. However, only in beans irradiated with 10 kGy could a small signal be detected. Fourthly, thermoluminescence, a method to analyze mineral debris adhering to food, turned out to be a good choice to detect irradiation effects in beans, even after 6 months of storage. The results indicate that three of the four techniques proposed can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy).

  5. Hot spot analysis applied to identify ecosystem services potential in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Depellegrin, Daniel; Misiune, Ieva

    2016-04-01

    Hot spot analysis is very useful for identifying areas with similar characteristics. This is important for a sustainable use of the territory, since we can identify areas that need to be protected or restored. This is a great advantage in terms of land use planning and management, since we can allocate resources, reduce economic costs and intervene better in the landscape. Ecosystem services (ES) differ according to land use. Since the landscape is very heterogeneous, it is of major importance to understand their spatial pattern, where the areas that provide better ES are located, and where the areas that provide fewer services lie. The objective of this work is to use hot spot analysis to identify areas with the most valuable ES in Lithuania. CORINE land-cover (CLC) of 2006 was used as the main spatial information. This classification uses a grid of 100 m resolution, from which a total of 31 land use types were extracted. ES ranking was carried out based on expert knowledge: experts were asked to evaluate the ES potential of each CLC class from 0 (no potential) to 5 (very high potential). Hot spots were evaluated using the Getis-Ord test, a cluster-identification method available in the ArcGIS toolbox, which flags areas with significantly low values and significantly high values at a p level of 0.05. In this work we used hot spot analysis to assess the distribution of providing, regulating, cultural and total (the sum of the previous three) ES. The Z value calculated from Getis-Ord was used in the statistical analysis to assess the clusters of providing, regulating, cultural and total ES. A high Z value for an ES indicates a high number of clustered areas with high potential for that ES. The results showed that the Z-score was significantly different among services (Kruskal-Wallis ANOVA = 834.607, p < 0.05), with providing services scoring higher than cultural (0.080±1.979) and regulating (0.076±1.961) services. These results suggested that providing services are more clustered than the remaining. Ecosystem Services Z score were
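
    The Getis-Ord statistic itself is compact enough to sketch directly; the version below implements the Gi* z-score from its textbook definition on a toy one-dimensional example, rather than calling the ArcGIS tool.

```python
# Getis-Ord Gi* from its standard definition, with binary contiguity
# weights that include the focal cell (the "star" variant).
import numpy as np

def gi_star(values, weights):
    """values: (n,) attribute vector; weights: (n, n) binary W incl. self."""
    n = values.size
    xbar, s = values.mean(), values.std(ddof=0)
    wx = weights @ values
    wsum = weights.sum(axis=1)
    w2sum = (weights ** 2).sum(axis=1)
    denom = s * np.sqrt((n * w2sum - wsum ** 2) / (n - 1))
    return (wx - xbar * wsum) / denom   # z-scores; |z| > 1.96 ~ p < 0.05

# Toy example: 5 locations on a line, each neighbouring its adjacent cells.
vals = np.array([1.0, 1.2, 5.0, 5.5, 1.1])
W = np.eye(5)
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1
print(gi_star(vals, W))   # large positive z where the high values cluster
```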

  6. Combination of meta-analysis and graph clustering to identify prognostic markers of ESCC

    Directory of Open Access Journals (Sweden)

    Hongyun Gao

    2012-01-01

    Full Text Available Esophageal squamous cell carcinoma (ESCC) is one of the most malignant gastrointestinal cancers and occurs at high frequency in China and other Asian countries. Recently, several molecular markers were identified for predicting ESCC. Notwithstanding, additional prognostic markers, with a clear understanding of their underlying roles, are still required. Through bioinformatics, a graph-clustering method using DPClus was applied to detect co-expressed modules. The aim was to identify a set of discriminating genes that could be used for predicting ESCC through graph clustering and GO-term analysis. The results showed that CXCL12, CYP2C9, TGM3, MAL, S100A9, EMP-1 and SPRR3 were highly associated with ESCC development. In our study, all their predicted roles were in line with previous reports, supporting the assumption that a combination of meta-analysis, graph clustering and GO-term analysis is effective both for identifying differentially expressed genes and for reflecting their functions in ESCC.

  7. Identifying Critical Factors of Sale Failure on Commercial Property Types, Shop Houses by Using Multi Attribute Variable Technique

    OpenAIRE

    N.I. Mohamad; N. M. Tawil; I.M. Usman; M. M. Tahir

    2014-01-01

    The focus of this research is to identify the critical factors behind the sale failure of shop houses in Bandar Baru Nilai, and more broadly behind the sale failure of commercial properties of the shop-house type in new townships, as a report by the Valuation and Property Services Department (JPPH) showed that 5,931 shop house units in Malaysia are currently completed but remain unsold, with Johor recording the highest number of unsold units followed by Negeri Sembilan. Bandar Baru Nilai (a dis...

  8. A predictive cognitive error analysis technique for emergency tasks

    International Nuclear Information System (INIS)

    This paper introduces an analysis framework and procedure for the support of cognitive error analysis of emergency tasks in nuclear power plants. The framework provides a new perspective in the utilization of error factors for error prediction, and can be characterized by two features. First, error factors that affect the occurrence of human error are classified into three groups, 'task characteristics factors (TCF)', 'situation factors (SF)', and 'performance assisting factors (PAF)', and are utilized in the error prediction. This classification aims to support error prediction from the viewpoint of assessing the adequacy of PAF under given TCF and SF. Second, the assessment of error factors is made from the perspective of the performance of each cognitive function. Through this, error factor assessment is made in an integrative way, not independently. Furthermore, it enables analysts to identify vulnerable cognitive functions and error factors, and to obtain specific error reduction strategies. Finally, the framework and procedure were applied to the error analysis of the 'bleed and feed operation' of emergency tasks

  9. Evaluation of geophysical techniques for identifying fractures in program wells in Deaf Smith County, Texas: Revision 1, Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, R.P.; Siminitz, P.C.

    1987-08-01

    Quantitative information about the presence and orientation of fractures is essential for the understanding of the geomechanical and geohydrological behavior of rocks. This report evaluates various borehole geophysical techniques for characterizing fractures in three Civilian Radioactive Waste Management (CRWM) Program test wells in the Palo Duro Basin in Deaf Smith County, Texas. Emphasis has been placed on the Schlumberger Fracture Identification Log (FIL) which detects vertical fractures and provides data for calculation of orientation. Depths of FIL anomalies were compared to available core. It was found that the application of FIL results to characterize fracture frequency or orientation is inappropriate at this time. The uncertainties associated with the FIL information render the information unreliable. No geophysical logging tool appears to unequivocally determine the location and orientation of fractures in a borehole. Geologic mapping of the exploratory shafts will ultimately provide the best data on fracture frequency and orientation at the proposed repository site. 22 refs., 6 figs., 3 tabs.

  10. A portable system for identifying urinary tract infection in primary care using a PC-based chromatic technique

    International Nuclear Information System (INIS)

    An approach is described for monitoring urine samples using a portable system based on chromatic techniques and for predicting urinary tract infection (UTI) from the results. The system uses a webcam–computer combination with the screen of a computer visual display unit as a tuneable illumination source. It is shown that the system can operate in a robust manner under ambient lighting conditions and with potential for use as a point of care test in primary care. The present approach combines information on urine liquid concentration and turbidity. Its performance in an exploratory study is compared with microbiological culture of 200 urine samples, of which 79 had bacterial growth >10^5 colony forming units/millilitre (cfu ml^-1) indicative of UTI. It is shown that both a sensitivity and a negative predictive value of 0.92 could be achieved. (paper)
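
    The reported performance figures follow from a standard 2x2 confusion table; the counts below are a hypothetical split of the 200 samples chosen to reproduce the quoted 0.92 values, not the study's actual table.

```python
# Sensitivity and negative predictive value from a 2x2 confusion table.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def npv(tn, fn):
    return tn / (tn + fn)

# Hypothetical split of the 200 samples (79 culture-positive).
tp, fn = 73, 6          # infected samples flagged / missed by the system
tn, fp = 75, 46         # sterile samples cleared / flagged
print(f"sensitivity = {sensitivity(tp, fn):.2f}, NPV = {npv(tn, fn):.2f}")
```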

  11. Evaluation of geophysical techniques for identifying fractures in program wells in Deaf Smith County, Texas: Revision 1, Topical report

    International Nuclear Information System (INIS)

    Quantitative information about the presence and orientation of fractures is essential for the understanding of the geomechanical and geohydrological behavior of rocks. This report evaluates various borehole geophysical techniques for characterizing fractures in three Civilian Radioactive Waste Management (CRWM) Program test wells in the Palo Duro Basin in Deaf Smith County, Texas. Emphasis has been placed on the Schlumberger Fracture Identification Log (FIL) which detects vertical fractures and provides data for calculation of orientation. Depths of FIL anomalies were compared to available core. It was found that the application of FIL results to characterize fracture frequency or orientation is inappropriate at this time. The uncertainties associated with the FIL information render the information unreliable. No geophysical logging tool appears to unequivocally determine the location and orientation of fractures in a borehole. Geologic mapping of the exploratory shafts will ultimately provide the best data on fracture frequency and orientation at the proposed repository site. 22 refs., 6 figs., 3 tabs

  12. Identifying Critical Factors of Sale Failure on Commercial Property Types, Shop Houses by Using Multi Attribute Variable Technique

    Directory of Open Access Journals (Sweden)

    N.I. Mohamad

    2014-04-01

    Full Text Available The focus of this research is to identify the critical factors behind the sale failure of shop houses in Bandar Baru Nilai, and more broadly behind the sale failure of commercial properties of the shop-house type in new townships, as a report by the Valuation and Property Services Department (JPPH) showed that 5,931 shop house units in Malaysia are currently completed but remain unsold, with Johor recording the highest number of unsold units followed by Negeri Sembilan. Bandar Baru Nilai (a district of Negeri Sembilan) is chosen as the research sample for unsold shop house units due to its strategic location near KLIA, the Sepang International Circuit and educational institutions, and because it is surrounded by housing schemes yet still has a number of unsold units. Data for the research were obtained from a literature review and survey questionnaires among developers, the local authority, purchasers/tenants and local residents. The Relative Importance Index (RII) method is applied to identify the critical factors of shop house sale failure. Generally, the factors of sale failure are economy, demography, politics, location and access, public and basic facilities, financial loans, physical characteristics of the product, current stock of shop houses upon completion, future potential of subsale and rental, the developer's background, promotion and marketing, speculation and time.
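
    The Relative Importance Index used here has a standard closed form, RII = ΣW / (A × N), where W are the Likert ratings, A is the top of the rating scale, and N the number of respondents. A minimal sketch with made-up ratings:

```python
# Relative Importance Index: RII = sum(W) / (A * N), ranked high to low.
def rii(ratings, scale_max=5):
    """ratings: Likert scores for one factor across N respondents."""
    return sum(ratings) / (scale_max * len(ratings))

# Hypothetical respondent ratings for two candidate failure factors.
print(rii([5, 4, 5, 3, 4]))   # e.g. "financial loans" -> 0.84
print(rii([2, 3, 2, 1, 3]))   # e.g. "speculation"     -> 0.44
# Factors are then ranked by RII; the highest values are read as critical.
```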

  13. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines

  14. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  15. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability analysis, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, the cokriging method, an extension of kriging, is proposed for calculating structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve the accuracy and efficiency of structural reliability analyses, and is a viable alternative to kriging.
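
    A minimal kriging-surrogate reliability sketch follows (plain kriging via a Gaussian process; cokriging would additionally condition on gradient observations, which scikit-learn does not expose). The limit state function and sample sizes are toy choices, not the paper's examples.

```python
# Kriging surrogate for a limit state g(x); failure is g(x) <= 0. The
# surrogate replaces the expensive model inside a Monte Carlo loop.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):                       # toy limit state: failure when g <= 0
    return 3.0 - x[:, 0] ** 2 - x[:, 1]

rng = np.random.default_rng(1)
X_train = rng.normal(size=(40, 2))             # design-of-experiments points
gp = GaussianProcessRegressor(kernel=RBF(1.0)).fit(X_train, g(X_train))

# Monte Carlo on the cheap surrogate instead of the expensive model.
X_mc = rng.normal(size=(200_000, 2))
pf = float((gp.predict(X_mc) <= 0).mean())
print(f"estimated failure probability: {pf:.4f}")
```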

  16. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs and calculations, such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized, in order to be able to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during the inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest degree possible of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  17. New laser spectroscopic technique for stable-isotope ratio analysis

    International Nuclear Information System (INIS)

    Reliable and safe application of isotopes as tracers is important in many areas, including biomedical, environmental and geochronological sciences. A new approach to stable-isotope ratio analysis based on atomic hyperfine structure is demonstrated. This laser spectroscopic scheme is virtually interference-free because of the highly selective and specific nature of hyperfine structures. Hence, a minor constituent in a complex matrix can be selectively analyzed without extensive sample preparation. A single-frequency tunable cw ring dye laser is used as the excitation source and a specially designed and constructed demountable cathode discharge is used as the atomizer and detector. Samples are electrodeposited on the demountable cathode and hyperfine profiles are collected by optogalvanic detection. By spectral deconvolution, the relative abundances of all isotopes present can be determined with good accuracy and precision. The technique is demonstrated for copper concentrations as low as 1.6 ppm, using the atomic hyperfine structure of CuI 578.2 nm non-resonance transition. It is also successfully tested for analysis of copper isotopes in human blood

  18. Quick analysis techniques for assay of radioactivity in urine

    International Nuclear Information System (INIS)

    The need for bioassay has recently increased in radiation protection management at nuclear power plants. In the practical application of bioassay, it is desirable to simplify pre-treatment procedures and to omit chemical separation treatments before the radiation measurement with a low-background liquid scintillation system. This paper presents the results on accumulation of background data of radioactivities in urine and assessment of the availability of a quick analysis method for urine bioassay. The major results obtained are as follows: (1) The background concentration of 3H in human urine, which varies in both time and persons, is 50 - 240 pCi/l and equivalent to those in natural water. (2) A small quantity of non-treated urine, 2 ml, is sufficient to estimate a 3H concentration above the screening level in monitoring of internal radiation exposure. (3) The average concentrations of 40K and 137Cs in human urine measured with a Ge detector are 2500 and 8 pCi/l, respectively, and the ratio of 137Cs to K is 3.14 pCi/g, which is applicable to determination of abnormal intake of 137Cs. (4) A simplified bioassay method using the quick analysis technique for non-treated urine is proposed for monitoring internal radiation exposure at nuclear power plants. (author)

  19. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.
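
    The train-rather-than-tune idea is easy to demonstrate: the same generic classifier below learns a classification assay from labelled examples, with scikit-learn's digits set standing in for microscopy images.

```python
# Generic pattern-recognition assay: no task-specific algorithm, just a
# classifier trained on labelled images (digits stand in for micrographs).
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)            # images flattened to features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
# Swapping in microscopy images and class labels requires no new algorithm,
# which is the practical appeal of the pattern-recognition approach.
```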

  20. Enhanced Analysis Techniques for an Imaging Neutron and Gamma Ray Spectrometer

    Science.gov (United States)

    Madden, Amanda C.

    The presence of gamma rays and neutrons is a strong indicator of the presence of Special Nuclear Material (SNM). The imaging Neutron and gamma ray SPECTrometer (NSPECT), developed by the University of New Hampshire and Michigan Aerospace Corporation, detects the fast neutrons and prompt gamma rays from fissile material, and the gamma rays from radioactive material. The instrument operates as a double scatter device, requiring a neutron or a gamma ray to interact twice in the instrument. While this detection requirement decreases the efficiency of the instrument, it offers superior background rejection and the ability to measure the energy and momentum of the incident particle. These measurements create energy spectra and images of the emitting source for source identification and localization. The dual-species instrument provides superior detection compared to a single species alone. In realistic detection scenarios, few particles are detected from a potential threat due to source shielding, detection at a distance, high background, and weak sources. This contributes to a small signal-to-noise ratio, and threat detection becomes difficult. To address these difficulties, several enhanced data analysis tools were developed. A Receiver Operating Characteristic (ROC) curve helps set instrumental alarm thresholds as well as identify the presence of a source; analysis of a dual-species ROC curve provides superior detection capabilities. Bayesian analysis helps to detect and identify the presence of a source through model comparisons, and helps create a background-corrected count spectrum for enhanced spectroscopy. Development of an instrument response using simulations and numerical analyses will help perform spectra and image deconvolution. This thesis will outline the principles of operation of the NSPECT instrument using the double scatter technology, traditional analysis techniques, and enhanced analysis techniques as applied to data from the NSPECT instrument, and an
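
    Threshold selection from an ROC curve, as described above, can be sketched as follows; the detector scores are simulated stand-ins for background-only and source-present runs.

```python
# Set an alarm threshold from an ROC curve using Youden's J criterion.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(0.0, 1, 1000),   # background-only runs
                         rng.normal(1.5, 1, 200)])   # weak-source runs
labels = np.concatenate([np.zeros(1000), np.ones(200)])

fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)                  # maximize TPR - FPR (Youden's J)
print(f"alarm threshold: {thresholds[best]:.2f}, "
      f"TPR = {tpr[best]:.2f}, FPR = {fpr[best]:.2f}")
```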

  1. Protein functional links in Trypanosoma brucei, identified by gene fusion analysis

    Directory of Open Access Journals (Sweden)

    Trimpalis Philip

    2011-07-01

    Full Text Available Abstract Background Domain or gene fusion analysis is a bioinformatics method for detecting gene fusions in one organism by comparing its genome to that of other organisms. The occurrence of gene fusions suggests that the two original genes that participated in the fusion are functionally linked, i.e. their gene products interact either as part of a multi-subunit protein complex, or in a metabolic pathway. Gene fusion analysis has been used to identify protein functional links in prokaryotes as well as in eukaryotic model organisms, such as yeast and Drosophila. Results In this study we have extended this approach to include a number of recently sequenced protists, four of which are pathogenic, to identify fusion linked proteins in Trypanosoma brucei, the causative agent of African sleeping sickness. We have also examined the evolution of the gene fusion events identified, to determine whether they can be attributed to fusion or fission, by looking at the conservation of the fused genes and of the individual component genes across the major eukaryotic and prokaryotic lineages. We find relatively limited occurrence of gene fusions/fissions within the protist lineages examined. Our results point to two trypanosome-specific gene fissions, which have recently been experimentally confirmed, one fusion involving proteins involved in the same metabolic pathway, as well as two novel putative functional links between fusion-linked protein pairs. Conclusions This is the first study of protein functional links in T. brucei identified by gene fusion analysis. We have used strict thresholds and only discuss results which are highly likely to be genuine and which either have already been or can be experimentally verified. We discuss the possible impact of the identification of these novel putative protein-protein interactions, to the development of new trypanosome therapeutic drugs.

  2. Gene expression signature analysis identifies vorinostat as a candidate therapy for gastric cancer.

    Directory of Open Access Journals (Sweden)

    Sofie Claerhout

    Full Text Available BACKGROUND: Gastric cancer continues to be one of the deadliest cancers in the world and therefore identification of new drugs targeting this type of cancer is thus of significant importance. The purpose of this study was to identify and validate a therapeutic agent which might improve the outcomes for gastric cancer patients in the future. METHODOLOGY/PRINCIPAL FINDINGS: Using microarray technology, we generated a gene expression profile of human gastric cancer-specific genes from human gastric cancer tissue samples. We used this profile in the Broad Institute's Connectivity Map analysis to identify candidate therapeutic compounds for gastric cancer. We found the histone deacetylase inhibitor vorinostat as the lead compound and thus a potential therapeutic drug for gastric cancer. Vorinostat induced both apoptosis and autophagy in gastric cancer cell lines. Pharmacological and genetic inhibition of autophagy however, increased the therapeutic efficacy of vorinostat, indicating that a combination of vorinostat with autophagy inhibitors may therapeutically be more beneficial. Moreover, gene expression analysis of gastric cancer identified a collection of genes (ITGB5, TYMS, MYB, APOC1, CBX5, PLA2G2A, and KIF20A whose expression was elevated in gastric tumor tissue and downregulated more than 2-fold by vorinostat treatment in gastric cancer cell lines. In contrast, SCGB2A1, TCN1, CFD, APLP1, and NQO1 manifested a reversed pattern. CONCLUSIONS/SIGNIFICANCE: We showed that analysis of gene expression signature may represent an emerging approach to discover therapeutic agents for gastric cancer, such as vorinostat. The observation of altered gene expression after vorinostat treatment may provide the clue to identify the molecular mechanism of vorinostat and those patients likely to benefit from vorinostat treatment.

  3. Meta-analysis of microarray data using a pathway-based approach identifies a 37-gene expression signature for systemic lupus erythematosus in human peripheral blood mononuclear cells

    Directory of Open Access Journals (Sweden)

    Fang Hong

    2011-05-01

    Full Text Available Abstract Background A number of publications have reported the use of microarray technology to identify gene expression signatures to infer mechanisms and pathways associated with systemic lupus erythematosus (SLE in human peripheral blood mononuclear cells. However, meta-analysis approaches with microarray data have not been well-explored in SLE. Methods In this study, a pathway-based meta-analysis was applied to four independent gene expression oligonucleotide microarray data sets to identify gene expression signatures for SLE, and these data sets were confirmed by a fifth independent data set. Results Differentially expressed genes (DEGs were identified in each data set by comparing expression microarray data from control samples and SLE samples. Using Ingenuity Pathway Analysis software, pathways associated with the DEGs were identified in each of the four data sets. Using the leave one data set out pathway-based meta-analysis approach, a 37-gene metasignature was identified. This SLE metasignature clearly distinguished SLE patients from controls as observed by unsupervised learning methods. The final confirmation of the metasignature was achieved by applying the metasignature to a fifth independent data set. Conclusions The novel pathway-based meta-analysis approach proved to be a useful technique for grouping disparate microarray data sets. This technique allowed for validated conclusions to be drawn across four different data sets and confirmed by an independent fifth data set. The metasignature and pathways identified by using this approach may serve as a source for identifying therapeutic targets for SLE and may possibly be used for diagnostic and monitoring purposes. Moreover, the meta-analysis approach provides a simple, intuitive solution for combining disparate microarray data sets to identify a strong metasignature. Please see Research Highlight: http://genomemedicine.com/content/3/5/30

  4. Progress in identifying a human ionizing-radiation repair gene using DNA-mediated gene transfer techniques

    International Nuclear Information System (INIS)

    The authors are employing DNA-mediated gene transfer techniques to introduce human DNA into a DNA double-strand break (DSB) repair-deficient Chinese hamster ovary (CHO) cell mutant (xrs-6), which is hypersensitive to both X-rays (D0 = 0.39 Gy) and the antibiotic bleomycin (D0 = 0.01 μg/ml). High molecular weight DNA isolated from cultured human skin fibroblasts was partially digested with the restriction enzyme Sau 3A to average sizes of 20 or 40 kb, ligated with plasmid pSV2-gpt DNA, and transfected into xrs-6 cells. Colonies which developed under a bleomycin and MAX (mycophenolic acid/adenine/xanthine) double-selection procedure were isolated and further tested for X-ray sensitivity and DSB rejoining capacity. To date a total of six X-ray or bleomycin resistant transformants have been isolated. All express rejoining capacity for X-ray-induced DSB, similar to the rate observed for DSB repair in CHO wild type cells. DNA isolated from these primary transformants contains various copy numbers of pSV2-gpt DNA and also contains human DNA sequences as determined by Southern blot hybridization. Recently, a secondary transformant has been isolated using DNA from one of the primary transformants. Cellular and molecular characterization of this transformant is in progress. DNA from a genuine secondary transformant will be used in the construction of a DNA library to isolate human genomic DNA encoding this radiation repair gene

  5. Modified C-band technique for the analysis of chromosome abnormalities in irradiated human lymphocytes

    International Nuclear Information System (INIS)

    A modified C-band technique was developed in order to analyze more accurately dicentric, tricentric, and ring chromosomes in irradiated human peripheral lymphocytes. Instead of the original method relying on treatment with barium hydroxide Ba(OH)2, C-bands were obtained using a modified form of heat treatment in formamide followed by DAPI staining. This method was tentatively applied to the analysis of dicentric chromosomes in irradiated human lymphocytes to examine its suitability. The frequency of dicentric chromosomes was almost the same with conventional Giemsa staining and with the modified C-band technique. In the analysis using Giemsa staining, it is relatively difficult to identify the centromere on elongated chromosomes, over-condensed chromosomes, fragments, and acentric rings. However, the modified C-band method used in this study makes it easier to identify the centromere on such chromosomes than with the use of Giemsa staining alone, and may thus give more information about the location of the centromere. Therefore, this method may be more applicable and more useful for biological dose estimation based on the analysis of dicentric chromosomes in human lymphocytes exposed to radiation. Furthermore, this method is simpler and faster than the original C-band protocol and the fluorescence in situ hybridization (FISH) method with a centromeric DNA probe. - Highlights: → The dicentric (dic) assay is the most effective method for radiation biodosimetry. → It is important to recognize the centromere of the dic. → We improved a C-band technique based on heat denaturation. → This technique enables the accurate detection of a centromere. → This method may be applicable and more useful for biological dose estimation.

  6. Shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity.

    Directory of Open Access Journals (Sweden)

    J R Managbanag

    Full Text Available BACKGROUND: Identification of genes that modulate longevity is a major focus of aging-related research and an area of intense public interest. In addition to facilitating an improved understanding of the basic mechanisms of aging, such genes represent potential targets for therapeutic intervention in multiple age-associated diseases, including cancer, heart disease, diabetes, and neurodegenerative disorders. To date, however, targeted efforts at identifying longevity-associated genes have been limited by a lack of predictive power, and useful algorithms for candidate gene-identification have also been lacking. METHODOLOGY/PRINCIPAL FINDINGS: We have utilized a shortest-path network analysis to identify novel genes that modulate longevity in Saccharomyces cerevisiae. Based on a set of previously reported genes associated with increased life span, we applied a shortest-path network algorithm to a pre-existing protein-protein interaction dataset in order to construct a shortest-path longevity network. To validate this network, the replicative aging potential of 88 single-gene deletion strains corresponding to predicted components of the shortest-path longevity network was determined. Here we report that the single-gene deletion strains identified by our shortest-path longevity analysis are significantly enriched for mutations conferring either increased or decreased replicative life span, relative to a randomly selected set of 564 single-gene deletion strains or to the current data set available for the entire haploid deletion collection. Further, we report the identification of previously unknown longevity genes, several of which function in a conserved longevity pathway believed to mediate life span extension in response to dietary restriction. CONCLUSIONS/SIGNIFICANCE: This work demonstrates that shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity and represents the first application of
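
    A condensed sketch of the shortest-path idea follows: connect known longevity genes through a protein-protein interaction graph and collect the interior nodes of the paths as candidates. The edges and seed gene names below are placeholders, not the actual yeast interactome.

```python
# Shortest-path candidate discovery: interior nodes on paths between known
# longevity genes become candidates for life-span assays.
import itertools
import networkx as nx

ppi = nx.Graph([("SIR2", "X1"), ("X1", "TOR1"), ("TOR1", "X2"),
                ("X2", "SCH9"), ("SCH9", "X3"), ("X3", "RAS2"),
                ("SIR2", "X4"), ("X4", "X5"), ("X5", "RAS2")])
seeds = ["SIR2", "TOR1", "SCH9", "RAS2"]     # known life-span modulators

candidates = set()
for a, b in itertools.combinations(seeds, 2):
    if nx.has_path(ppi, a, b):
        candidates.update(nx.shortest_path(ppi, a, b)[1:-1])  # interior nodes

# Each candidate's deletion strain is then assayed for replicative life span.
print(sorted(candidates - set(seeds)))
```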

  7. Design Analysis Rules to Identify Proper Noun from Bengali Sentence for Universal Networking language

    Directory of Open Access Journals (Sweden)

    Md. Syeful Islam

    2014-08-01

    Full Text Available Nowadays hundreds of millions of people of almost all levels of education and attitudes from different countries communicate with each other for different purposes and perform their jobs on the internet or other communication media using various languages. Not all people know all languages; therefore it is very difficult to communicate or work across various languages. In this situation computer scientists have introduced various interlanguage translation programs (machine translation). UNL is one such interlanguage translation program. One of the major problems of UNL is identifying a name in a sentence, which is relatively simple in English because such entities start with a capital letter. In Bangla there is no concept of small or capital letters, so it is difficult to determine whether a word is a proper noun or not. Here we have proposed analysis rules to identify proper nouns in a sentence and established a post-converter which translates the name entity from Bangla to UNL. The goal is to make possible Bangla sentence conversion to UNL and vice versa. Evaluation with the UNL system shows that the theoretical analysis of our proposed system is able to identify proper nouns in Bangla sentences and produce the corresponding Universal Words for UNL.

  8. Space-Time Analysis to Identify Areas at Risk of Mortality from Cardiovascular Disease

    Directory of Open Access Journals (Sweden)

    Poliany C. O. Rodrigues

    2015-01-01

    Full Text Available This study aimed at identifying areas that were at risk of mortality due to cardiovascular disease in residents aged 45 years or older of the cities of Cuiabá and Várzea Grande between 2009 and 2011. We conducted an ecological study of mortality rates related to cardiovascular disease. Mortality rates were calculated for each census tract by the Local Empirical Bayes estimator. High- and low-risk clusters were identified by retrospective space-time scans for each year using the Poisson probability model. We defined the year and month as the temporal analysis unit and the census tracts as the spatial analysis units adjusted by age and sex. The Mann-Whitney U test was used to compare the socioeconomic and environmental variables by risk classification. High-risk clusters showed higher income ratios than low-risk clusters, as did temperature range and atmospheric particulate matter. Low-risk clusters showed higher humidity than high-risk clusters. The Eastern region of Várzea Grande and the central region of Cuiabá were identified as areas at risk of mortality due to cardiovascular disease in individuals aged 45 years or older. High mortality risk was associated with socioeconomic and environmental factors. More high-risk clusters were observed at the end of the dry season.

  9. A dynamic mechanical analysis technique for porous media

    Science.gov (United States)

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in

  10. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    A simple approach was described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., changes in the distribution types of the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used to convert cost inputs into the statistical parameters of assumed probability distributions. Monte Carlo simulation with Latin Hypercube Sampling, followed by sensitivity analyses, proved useful for examining the uncertainty propagation of fuel cycle costs, and can provide information to decision makers more efficiently than a deterministic method. Varying the distribution types of the input parameters showed that the values calculated by the deterministic method fall around the 40th-50th percentile of the output distribution obtained by probabilistic simulation; assuming lognormal distributions of inputs, however, they fall around the 85th percentile. The sensitivity analysis indicated that the front-end components are generally more sensitive than the back-end components, with the uranium purchase cost being the most important factor of all. The discount rate also contributes substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (direct use of spent PWR fuel in CANDU reactors) cycle, which carries high cost uncertainty
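
    As a rough illustration of the sampling step, the sketch below draws Latin Hypercube samples and pushes them through triangular three-estimate distributions for two hypothetical cost components. The component names, ranges, and the two-term cost model are all invented; the study's real model has many more components:

      # Sketch of a Latin Hypercube fuel-cycle cost simulation (illustrative
      # cost components and triangular three-estimate parameters only).
      import numpy as np
      from scipy.stats import qmc, triang

      engine = qmc.LatinHypercube(d=2, seed=42)
      u = engine.random(n=10_000)              # uniform samples in [0, 1)^2

      def triangular_ppf(u, low, mode, high):
          # Map uniforms through the triangular inverse CDF (three-estimate input).
          c = (mode - low) / (high - low)
          return triang.ppf(u, c, loc=low, scale=high - low)

      uranium_cost = triangular_ppf(u[:, 0], 80, 100, 150)     # $/kgU, hypothetical
      enrichment_cost = triangular_ppf(u[:, 1], 90, 110, 160)  # $/SWU, hypothetical
      total = uranium_cost + enrichment_cost   # toy stand-in for the full cost model

      det = 100 + 110                          # deterministic point estimate
      print("deterministic value sits at percentile",
            round(100 * np.mean(total <= det), 1))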

  11. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs time. Since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (≈0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.
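
    For orientation, the standard VISAR relation maps fringe phase to velocity as v = (λ/2τ)(Δφ/2π). The sketch below applies it to a synthetic phase map; the wavelength, delay, and phase values are invented, and the (1+δ) etalon dispersion correction is omitted:

      # Sketch: converting a 2d fringe-phase map to a velocity map with the
      # textbook VISAR relation, ignoring the (1 + delta) dispersion term.
      import numpy as np

      wavelength = 532e-9          # probe laser wavelength [m], hypothetical
      tau = 0.1e-9                 # interferometer delay [s] (~0.1 ns, as in text)

      phase_shift = np.random.default_rng(0).uniform(0, 4 * np.pi, size=(4, 4))
      velocity_map = wavelength / (2 * tau) * phase_shift / (2 * np.pi)
      print(velocity_map.round(1))  # velocities in m/s across the image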

  12. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: A technique that helps management to reduce costs and improve quality is 'lean supply chain management', which focuses on the elimination of all wastes at every stage of the supply chain and is derived from 'agile production'. This research aims to assess and rank the suppliers in an auto industry, based upon the concept of 'production leanness'. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We have examined the literature on leanness, and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA) technique.

    AFRIKAANSE OPSOMMING (translated): 'Lean supply chain management' is a technique that enables management to reduce costs and improve quality. It focuses on reducing waste at every stage of the supply chain and is derived from 'agile production'. This research attempts to assess suppliers in an auto industry using the concept of 'production leanness'. The research focuses on suppliers of a company called Touse-Omron Naein. A literature study on leanness led to the classification of criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the PCA technique.
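
    A minimal sketch of the ranking step, assuming suppliers are scored on a handful of leanness criteria and ranked by their first principal component (all scores invented; the study used 76 factors in ten dimensions):

      # Sketch of PCA-based supplier ranking (invented leanness scores for
      # five suppliers on four criteria).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      scores = np.array([[4, 3, 5, 4],    # rows: suppliers
                         [2, 2, 3, 1],    # cols: leanness criteria
                         [5, 4, 4, 5],
                         [3, 3, 2, 2],
                         [1, 2, 2, 3]])

      X = StandardScaler().fit_transform(scores)
      pca = PCA(n_components=1)
      leanness_index = pca.fit_transform(X).ravel()

      # Higher first-component score = leaner supplier (sign chosen so that
      # the loadings are positive on average).
      if pca.components_[0].mean() < 0:
          leanness_index = -leanness_index
      print(np.argsort(-leanness_index) + 1)   # supplier ranking, best first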

  13. Application of multivariate analysis techniques to safeguards of the electrochemical treatment of used nuclear fuel

    International Nuclear Information System (INIS)

    Highlights: • Multivariate analysis potentially provides better predictions of molten salt compositions. • The validity of concentration predictions could be evaluated using a residual analysis. • Changes in cell condition could be identified using features of multivariate analysis. - Abstract: Several countries have shown interest in developing the electrochemical treatment of used nuclear fuel (UNF), commonly termed pyroprocessing. From a proliferation perspective, an advantage of pyroprocessing is its inability to isolate pure plutonium. However, because plutonium is present in the process, it needs to be effectively safeguarded to protect against diversion of plutonium-containing materials that could be post-processed elsewhere. The most complicated unit to safeguard in the process is the electrorefiner where UNF is chemically separated into several material entities. Molten LiCl–KCl serves as the electrolyte into which actinides (including uranium and plutonium), rare earths, and other active metals from the fuel are partitioned after being oxidized to chloride salts. Various voltammetric methods are being developed to measure the concentration of actinides in the molten salt in near-real-time. However, these methods have mostly been applied to molten salt mixtures containing a single actinide. Unfortunately, the presence of multiple actinides will create interferences in the electrochemical responses which could make traditional analysis of voltammetric data inaccurate for determining individual species concentrations. It is proposed to use multivariate techniques to more accurately predict concentrations of multiple actinides from voltammetric data. Two techniques, principal component analysis and partial least squares, are demonstrated on experimental and simulated data for molten salt mixtures containing uranium and plutonium. These techniques consistently yielded more accurate predictions of uranium and plutonium concentrations than simply using
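
    A hedged sketch of the multivariate calibration idea: fit partial least squares to synthetic two-analyte voltammograms and predict the uranium and plutonium concentrations of held-out scans. The peak shapes, noise level, and concentration ranges are all invented:

      # Sketch of PLS calibration of voltammograms against actinide
      # concentrations (all data synthetic; shapes illustrative).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      n_scans, n_potentials = 40, 200
      concentrations = rng.uniform(0.5, 5.0, size=(n_scans, 2))  # [U, Pu] wt%

      # Synthetic voltammograms: two overlapping current peaks plus noise.
      grid = np.linspace(-2.0, 0.0, n_potentials)
      peaks = (np.exp(-((grid + 1.4) ** 2) / 0.01),
               np.exp(-((grid + 1.1) ** 2) / 0.01))
      X = concentrations @ np.vstack(peaks) \
          + 0.01 * rng.standard_normal((n_scans, n_potentials))

      pls = PLSRegression(n_components=4).fit(X[:30], concentrations[:30])
      print(np.abs(pls.predict(X[30:]) - concentrations[30:]).mean(axis=0))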

  14. Driving forces of change in environmental indicators an analysis based on divisia index decomposition techniques

    CERN Document Server

    González, Paula Fernández; Presno, Mª José

    2014-01-01

    This book addresses several index decomposition analysis methods to assess progress made by EU countries in the last decade in relation to energy and climate change concerns. Several applications of these techniques are carried out in order to decompose changes in both energy and environmental aggregates. In addition to this, a new methodology based on classical spline approximations is introduced, which provides useful mathematical and statistical properties. Once a suitable set of determinant factors has been identified, these decomposition methods allow the researcher to quantify the respec
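
    The core of an additive log-mean Divisia index (LMDI) decomposition can be shown in a few lines: the change in an aggregate is split exactly into factor effects through the logarithmic mean. This is a toy two-factor example, not taken from the book:

      # Sketch of an additive LMDI decomposition of an emissions change
      # into activity and intensity effects (toy numbers).
      import math

      def log_mean(a, b):
          return (a - b) / (math.log(a) - math.log(b)) if a != b else a

      # C = activity * intensity, at period 0 and period 1 (hypothetical).
      activity = (100.0, 120.0)
      intensity = (0.50, 0.42)
      C0, C1 = activity[0] * intensity[0], activity[1] * intensity[1]

      L = log_mean(C1, C0)
      effect_activity = L * math.log(activity[1] / activity[0])
      effect_intensity = L * math.log(intensity[1] / intensity[0])

      print(round(C1 - C0, 3))                              # total change
      print(round(effect_activity + effect_intensity, 3))   # exact additive split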

  15. A meta-analysis of 120 246 individuals identifies 18 new loci for fibrinogen concentration.

    Science.gov (United States)

    de Vries, Paul S; Chasman, Daniel I; Sabater-Lleal, Maria; Chen, Ming-Huei; Huffman, Jennifer E; Steri, Maristella; Tang, Weihong; Teumer, Alexander; Marioni, Riccardo E; Grossmann, Vera; Hottenga, Jouke J; Trompet, Stella; Müller-Nurasyid, Martina; Zhao, Jing Hua; Brody, Jennifer A; Kleber, Marcus E; Guo, Xiuqing; Wang, Jie Jin; Auer, Paul L; Attia, John R; Yanek, Lisa R; Ahluwalia, Tarunveer S; Lahti, Jari; Venturini, Cristina; Tanaka, Toshiko; Bielak, Lawrence F; Joshi, Peter K; Rocanin-Arjo, Ares; Kolcic, Ivana; Navarro, Pau; Rose, Lynda M; Oldmeadow, Christopher; Riess, Helene; Mazur, Johanna; Basu, Saonli; Goel, Anuj; Yang, Qiong; Ghanbari, Mohsen; Willemsen, Gonneke; Rumley, Ann; Fiorillo, Edoardo; de Craen, Anton J M; Grotevendt, Anne; Scott, Robert; Taylor, Kent D; Delgado, Graciela E; Yao, Jie; Kifley, Annette; Kooperberg, Charles; Qayyum, Rehan; Lopez, Lorna M; Berentzen, Tina L; Räikkönen, Katri; Mangino, Massimo; Bandinelli, Stefania; Peyser, Patricia A; Wild, Sarah; Trégouët, David-Alexandre; Wright, Alan F; Marten, Jonathan; Zemunik, Tatijana; Morrison, Alanna C; Sennblad, Bengt; Tofler, Geoffrey; de Maat, Moniek P M; de Geus, Eco J C; Lowe, Gordon D; Zoledziewska, Magdalena; Sattar, Naveed; Binder, Harald; Völker, Uwe; Waldenberger, Melanie; Khaw, Kay-Tee; Mcknight, Barbara; Huang, Jie; Jenny, Nancy S; Holliday, Elizabeth G; Qi, Lihong; Mcevoy, Mark G; Becker, Diane M; Starr, John M; Sarin, Antti-Pekka; Hysi, Pirro G; Hernandez, Dena G; Jhun, Min A; Campbell, Harry; Hamsten, Anders; Rivadeneira, Fernando; Mcardle, Wendy L; Slagboom, P Eline; Zeller, Tanja; Koenig, Wolfgang; Psaty, Bruce M; Haritunians, Talin; Liu, Jingmin; Palotie, Aarno; Uitterlinden, André G; Stott, David J; Hofman, Albert; Franco, Oscar H; Polasek, Ozren; Rudan, Igor; Morange, Pierre-Emmanuel; Wilson, James F; Kardia, Sharon L R; Ferrucci, Luigi; Spector, Tim D; Eriksson, Johan G; Hansen, Torben; Deary, Ian J; Becker, Lewis C; Scott, Rodney J; Mitchell, Paul; März, Winfried; Wareham, Nick J; Peters, Annette; Greinacher, Andreas; Wild, Philipp S; Jukema, J Wouter; Boomsma, Dorret I; Hayward, Caroline; Cucca, Francesco; Tracy, Russell; Watkins, Hugh; Reiner, Alex P; Folsom, Aaron R; Ridker, Paul M; O'Donnell, Christopher J; Smith, Nicholas L; Strachan, David P; Dehghan, Abbas

    2016-01-15

    Genome-wide association studies have previously identified 23 genetic loci associated with circulating fibrinogen concentration. These studies used HapMap imputation and did not examine the X-chromosome. 1000 Genomes imputation provides better coverage of uncommon variants, and includes indels. We conducted a genome-wide association analysis of 34 studies imputed to the 1000 Genomes Project reference panel and including ∼120 000 participants of European ancestry (95 806 participants with data on the X-chromosome). Approximately 10.7 million single-nucleotide polymorphisms and 1.2 million indels were examined. We identified 41 genome-wide significant fibrinogen loci; of which, 18 were newly identified. There were no genome-wide significant signals on the X-chromosome. The lead variants of five significant loci were indels. We further identified six additional independent signals, including three rare variants, at two previously characterized loci: FGB and IRF1. Together the 41 loci explain 3% of the variance in plasma fibrinogen concentration. PMID:26561523

  16. The systematic functional analysis of plasmodium protein kinases identifies essential regulators of mosquito transmission

    KAUST Repository

    Tewari, Rita

    2010-10-21

    Although eukaryotic protein kinases (ePKs) contribute to many cellular processes, only three Plasmodium falciparum ePKs have thus far been identified as essential for parasite asexual blood stage development. To identify pathways essential for parasite transmission between their mammalian host and mosquito vector, we undertook a systematic functional analysis of ePKs in the genetically tractable rodent parasite Plasmodium berghei. Modeling domain signatures of conventional ePKs identified 66 putative Plasmodium ePKs. Kinomes are highly conserved between Plasmodium species. Using reverse genetics, we show that 23 ePKs are redundant for asexual erythrocytic parasite development in mice. Phenotyping mutants at four life cycle stages in Anopheles stephensi mosquitoes revealed functional clusters of kinases required for sexual development and sporogony. Roles for a putative SR protein kinase (SRPK) in microgamete formation, a conserved regulator of clathrin uncoating (GAK) in ookinete formation, and a likely regulator of energy metabolism (SNF1/KIN) in sporozoite development were identified. 2010 Elsevier Inc.

  17. Integrating Stakeholder Preferences and GIS-Based Multicriteria Analysis to Identify Forest Landscape Restoration Priorities

    Directory of Open Access Journals (Sweden)

    David Uribe

    2014-02-01

    Full Text Available A pressing question that arises during the planning of an ecological restoration process is: where to restore first? Answering this question is a complex task; it requires a multidimensional approach to account for economic constraints and the preferences of stakeholders. The problem being spatial in nature, it may be explored effectively through Multicriteria Decision Analysis (MCDA) performed in a Geographical Information System (GIS) environment. The proposed approach is based on the definition and weighting of multiple criteria for evaluating land suitability. An MCDA-based methodology was used to identify priority areas for Forest Landscape Restoration in the Upper Mixtec region, Oaxaca (Mexico), one of the most degraded areas of Latin America. Socioeconomic and environmental criteria were selected and evaluated. The opinions of four different stakeholder groups were considered: general public, academics, non-governmental organizations (NGOs), and government officers. The preferences of these groups were spatially modeled to identify their priorities. The final result was a map that identifies the most preferable sites for restoration, where resources and efforts should be concentrated. MCDA proved to be a very useful tool in collective planning, when alternative sites have to be identified and prioritized to guide the restoration work.
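
    The core of such a GIS-based MCDA is typically a weighted linear combination over normalized criterion rasters; a minimal sketch under that assumption, with tiny invented rasters and weights:

      # Sketch of a weighted linear combination over criterion rasters
      # (toy 2x2 layers; values and weights invented).
      import numpy as np

      # Criterion layers normalized to [0, 1]: e.g. erosion severity and
      # proximity to settlements.
      erosion = np.array([[0.9, 0.4], [0.7, 0.1]])
      proximity = np.array([[0.2, 0.8], [0.5, 0.3]])

      # Stakeholder-derived weights (e.g. from pairwise comparison), sum to 1.
      weights = {"erosion": 0.6, "proximity": 0.4}

      suitability = weights["erosion"] * erosion + weights["proximity"] * proximity
      priority_cell = np.unravel_index(suitability.argmax(), suitability.shape)
      print(suitability, priority_cell)   # highest-scoring cell restored first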

  18. A new model to identify the productivity of theses in terms of articles using co-word analysis

    Directory of Open Access Journals (Sweden)

    Mery Piedad Zamudio Igami

    2014-01-01

    Full Text Available A thesis defense should be considered not the end but the starting point of the scientific communication flow. How many articles truly extend doctoral research? This article proposes a new model to automatically identify the productivity of theses in terms of article publications. We evaluate the use of the co-word analysis technique to establish relationships among 401 doctoral theses and 2,211 journal articles published by students in a graduate program at a Brazilian National Nuclear Research Institution (IPEN-CNEN/SP). To identify the relationship between a thesis and an article published by the same author, we used co-descriptor pairs from a controlled vocabulary. To validate the proposed model, a survey was applied to a random sample of thesis authors (n = 128, response rate of 79%), establishing a minimum threshold of three coincident co-descriptors to identify the relationship between theses and articles. The agreement level between the authors' opinions and the automatic method was 86.9%, with a sampling error of 7.36%, which indicates an acceptable level of accuracy. Differences between the distributions of related and non-related articles were also demonstrated, as were a reduction in the median lag time to publication and the supervisor's influence on student productivity.
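
    The matching rule itself reduces to a set intersection with a threshold; a minimal sketch with invented descriptor sets (only the threshold of three comes from the text):

      # Sketch of the co-descriptor matching rule: a thesis and an article by
      # the same author are linked when they share at least three controlled-
      # vocabulary descriptors (records below are invented).
      thesis_descriptors = {"neutron flux", "reactor physics", "dosimetry",
                            "monte carlo"}
      article_descriptors = {"dosimetry", "monte carlo", "neutron flux",
                             "shielding"}

      THRESHOLD = 3  # minimum coincident co-descriptors, as in the paper
      shared = thesis_descriptors & article_descriptors
      is_related = len(shared) >= THRESHOLD
      print(len(shared), is_related)   # -> 3 True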

  19. Identifying E-Business Model: A Value Chain-Based Analysis

    Institute of Scientific and Technical Information of China (English)

    ZENG Qingfeng; HUANG Lihua

    2004-01-01

    E-business will change the ways that all companies do business, and most traditional businesses will evolve from their current business model to a combination of place and space via an e-business model. Choosing the proper e-business model has become an important strategic concern for a company's success. The main objective of this paper is to investigate an analysis framework for identifying e-business models based on the e-business process, moving from the value chain to the value net perspective. The paper provides a theoretical framework for identifying e-business models, resulting in 11 e-business models. The strategic intent of each e-business model is discussed at the end of the paper. An enterprise's e-business model design and implementation can be specified by combining one or more of the 11 e-business models.

  20. Proteomic analysis of cell lines to identify the irinotecan resistance proteins

    Indian Academy of Sciences (India)

    Xing-Chen Peng; Feng-Ming Gong; Meng Wei; Xi Chen; Ye Chen; Ke Cheng; Feng Gao; Feng Xu; Feng Bi; Ji-Yan Liu

    2010-12-01

    Chemotherapeutic drug resistance is a frequent cause of treatment failure in colon cancer patients. Several mechanisms have been implicated in drug resistance. However, they are not sufficient to exhaustively account for this resistance emergence. In this study, two-dimensional gel electrophoresis (2-DE) and the PDQuest software analysis were applied to compare the differential expression of irinotecan-resistance-associated protein in human colon adenocarcinoma LoVo cells and irinotecan-resistant LoVo cells (LoVo/irinotecan). The differential protein dots were excised and analysed by ESI-Q-TOF mass spectrometry (MS). Fifteen proteins were identified, including eight proteins with decreased expression and seven proteins with increased expression. The identified known proteins included those that function in diverse biological processes such as cellular transcription, cell apoptosis, electron transport/redox regulation, cell proliferation/differentiation and retinol metabolism pathways. Identification of such proteins could allow improved understanding of the mechanisms leading to the acquisition of chemoresistance.

  1. Association of two techniques of frontal sinus radiographic analysis for human identification

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira da SILVA

    2009-09-01

    Full Text Available Introduction: The analysis of images for human identification purposes is a routine activity in departments of forensic medicine, especially when it is necessary to identify burned bodies, skeletal remains, or corpses in an advanced stage of decomposition. Case report: The feasibility and reliability of analyzing the morphoradiographic image of the frontal sinus is shown, as displayed in a posteroanterior (PA) skull radiograph produced in life and compared with another produced post-mortem. Conclusion: The results obtained in the radiographic comparison, through the association of two different techniques of frontal sinus analysis, allowed a positive match between the identity of the missing person and the body in an advanced stage of decomposition.

  2. Gene-network analysis identifies susceptibility genes related to glycobiology in autism.

    Directory of Open Access Journals (Sweden)

    Bert van der Zwaag

    Full Text Available The recent identification of copy-number variation in the human genome has opened up new avenues for the discovery of positional candidate genes underlying complex genetic disorders, especially in the field of psychiatric disease. One major challenge that remains is pinpointing the susceptibility genes in the multitude of disease-associated loci. This challenge may be tackled by reconstruction of functional gene-networks from the genes residing in these loci. We applied this approach to autism spectrum disorder (ASD), and identified the copy-number changes in the DNA of 105 ASD patients and 267 healthy individuals with Illumina HumanHap300 BeadChips. Subsequently, we used a human reconstructed gene-network, Prioritizer, to rank candidate genes in the segmental gains and losses in our autism cohort. This analysis highlighted several candidate genes already known to be mutated in cognitive and neuropsychiatric disorders, including RAI1, BRD1, and LARGE. In addition, the LARGE gene was part of a sub-network of seven genes functioning in glycobiology, present in seven copy-number changes specifically identified in autism patients with limited co-morbidity. Three of these seven copy-number changes were de novo in the patients. In autism patients with a complex phenotype and in healthy controls no such sub-network was identified. An independent systematic analysis of 13 published autism susceptibility loci supports the involvement of genes related to glycobiology, as we also identified the same or similar genes from those loci. Our findings suggest that genomic gains and losses of genes associated with glycobiology are important contributors to the development of ASD.

  3. Systematic enrichment analysis of gene expression profiling studies identifies consensus pathways implicated in colorectal cancer development

    Directory of Open Access Journals (Sweden)

    Jesús Lascorz

    2011-01-01

    Full Text Available Background: A large number of gene expression profiling (GEP) studies on colorectal carcinogenesis have been performed, but no reliable gene signature has been identified so far due to the lack of reproducibility in the reported genes. There is growing evidence that functionally related genes, rather than individual genes, contribute to the etiology of complex traits. We used, as a novel approach, pathway enrichment tools to define functionally related genes that are consistently up- or down-regulated in colorectal carcinogenesis. Materials and Methods: We started the analysis with 242 unique annotated genes that had been reported by any of three recent meta-analyses covering GEP studies on genes differentially expressed in carcinoma vs normal mucosa. Most of these genes (218, 91.9%) had been reported in at least three GEP studies. These 242 genes were submitted to bioinformatic analysis using a total of nine tools to detect enrichment of Gene Ontology (GO) categories or Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways. As a final consistency criterion, a pathway category had to be enriched by several tools to be taken into consideration. Results: Our pathway-based enrichment analysis identified the categories of ribosomal protein constituents, extracellular matrix receptor interaction, carbonic anhydrase isozymes, and a general category related to inflammation and cellular response as significantly and consistently overrepresented entities. Conclusions: We triaged the genes covered by the published GEP literature on colorectal carcinogenesis and subjected them to multiple enrichment tools in order to identify the consistently enriched gene categories. These turned out to have known functional relationships to cancer development and thus deserve further investigation.
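
    The enrichment tests behind such tools typically reduce to a hypergeometric tail probability; a sketch under that assumption (the pathway and background sizes below are invented; only the 242-gene list size comes from the text):

      # Sketch of a pathway over-representation test: the probability of
      # seeing k or more pathway genes in the gene list by chance follows
      # a hypergeometric distribution.
      from scipy.stats import hypergeom

      genome_size = 20000      # background genes (M), hypothetical
      pathway_size = 90        # genes annotated to the pathway (n), hypothetical
      list_size = 242          # differentially expressed genes (N)
      hits = 9                 # pathway genes found in the list (k), hypothetical

      # P(X >= hits) = survival function evaluated at hits - 1.
      p_value = hypergeom.sf(hits - 1, genome_size, pathway_size, list_size)
      print(f"enrichment p-value: {p_value:.3g}")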

  4. Large-scale gene-centric meta-analysis across 32 studies identifies multiple lipid loci.

    Science.gov (United States)

    Asselbergs, Folkert W; Guo, Yiran; van Iperen, Erik P A; Sivapalaratnam, Suthesh; Tragante, Vinicius; Lanktree, Matthew B; Lange, Leslie A; Almoguera, Berta; Appelman, Yolande E; Barnard, John; Baumert, Jens; Beitelshees, Amber L; Bhangale, Tushar R; Chen, Yii-Der Ida; Gaunt, Tom R; Gong, Yan; Hopewell, Jemma C; Johnson, Toby; Kleber, Marcus E; Langaee, Taimour Y; Li, Mingyao; Li, Yun R; Liu, Kiang; McDonough, Caitrin W; Meijs, Matthijs F L; Middelberg, Rita P S; Musunuru, Kiran; Nelson, Christopher P; O'Connell, Jeffery R; Padmanabhan, Sandosh; Pankow, James S; Pankratz, Nathan; Rafelt, Suzanne; Rajagopalan, Ramakrishnan; Romaine, Simon P R; Schork, Nicholas J; Shaffer, Jonathan; Shen, Haiqing; Smith, Erin N; Tischfield, Sam E; van der Most, Peter J; van Vliet-Ostaptchouk, Jana V; Verweij, Niek; Volcik, Kelly A; Zhang, Li; Bailey, Kent R; Bailey, Kristian M; Bauer, Florianne; Boer, Jolanda M A; Braund, Peter S; Burt, Amber; Burton, Paul R; Buxbaum, Sarah G; Chen, Wei; Cooper-Dehoff, Rhonda M; Cupples, L Adrienne; deJong, Jonas S; Delles, Christian; Duggan, David; Fornage, Myriam; Furlong, Clement E; Glazer, Nicole; Gums, John G; Hastie, Claire; Holmes, Michael V; Illig, Thomas; Kirkland, Susan A; Kivimaki, Mika; Klein, Ronald; Klein, Barbara E; Kooperberg, Charles; Kottke-Marchant, Kandice; Kumari, Meena; LaCroix, Andrea Z; Mallela, Laya; Murugesan, Gurunathan; Ordovas, Jose; Ouwehand, Willem H; Post, Wendy S; Saxena, Richa; Scharnagl, Hubert; Schreiner, Pamela J; Shah, Tina; Shields, Denis C; Shimbo, Daichi; Srinivasan, Sathanur R; Stolk, Ronald P; Swerdlow, Daniel I; Taylor, Herman A; Topol, Eric J; Toskala, Elina; van Pelt, Joost L; van Setten, Jessica; Yusuf, Salim; Whittaker, John C; Zwinderman, A H; Anand, Sonia S; Balmforth, Anthony J; Berenson, Gerald S; Bezzina, Connie R; Boehm, Bernhard O; Boerwinkle, Eric; Casas, Juan P; Caulfield, Mark J; Clarke, Robert; Connell, John M; Cruickshanks, Karen J; Davidson, Karina W; Day, Ian N M; de Bakker, Paul I W; Doevendans, Pieter A; Dominiczak, Anna F; Hall, Alistair S; Hartman, Catharina A; Hengstenberg, Christian; Hillege, Hans L; Hofker, Marten H; Humphries, Steve E; Jarvik, Gail P; Johnson, Julie A; Kaess, Bernhard M; Kathiresan, Sekar; Koenig, Wolfgang; Lawlor, Debbie A; März, Winfried; Melander, Olle; Mitchell, Braxton D; Montgomery, Grant W; Munroe, Patricia B; Murray, Sarah S; Newhouse, Stephen J; Onland-Moret, N Charlotte; Poulter, Neil; Psaty, Bruce; Redline, Susan; Rich, Stephen S; Rotter, Jerome I; Schunkert, Heribert; Sever, Peter; Shuldiner, Alan R; Silverstein, Roy L; Stanton, Alice; Thorand, Barbara; Trip, Mieke D; Tsai, Michael Y; van der Harst, Pim; van der Schoot, Ellen; van der Schouw, Yvonne T; Verschuren, W M Monique; Watkins, Hugh; Wilde, Arthur A M; Wolffenbuttel, Bruce H R; Whitfield, John B; Hovingh, G Kees; Ballantyne, Christie M; Wijmenga, Cisca; Reilly, Muredach P; Martin, Nicholas G; Wilson, James G; Rader, Daniel J; Samani, Nilesh J; Reiner, Alex P; Hegele, Robert A; Kastelein, John J P; Hingorani, Aroon D; Talmud, Philippa J; Hakonarson, Hakon; Elbers, Clara C; Keating, Brendan J; Drenos, Fotios

    2012-11-01

    Genome-wide association studies (GWASs) have identified many SNPs underlying variations in plasma-lipid levels. We explore whether additional loci associated with plasma-lipid phenotypes, such as high-density lipoprotein cholesterol (HDL-C), low-density lipoprotein cholesterol (LDL-C), total cholesterol (TC), and triglycerides (TGs), can be identified by a dense gene-centric approach. Our meta-analysis of 32 studies in 66,240 individuals of European ancestry was based on the custom ∼50,000 SNP genotyping array (the ITMAT-Broad-CARe array) covering ∼2,000 candidate genes. SNP-lipid associations were replicated either in a cohort comprising an additional 24,736 samples or within the Global Lipid Genetic Consortium. We identified four, six, ten, and four unreported SNPs in established lipid genes for HDL-C, LDL-C, TC, and TGs, respectively. We also identified several lipid-related SNPs in previously unreported genes: DGAT2, HCAR2, GPIHBP1, PPARG, and FTO for HDL-C; SOCS3, APOH, SPTY2D1, BRCA2, and VLDLR for LDL-C; SOCS3, UGT1A1, BRCA2, UBE3B, FCGR2A, CHUK, and INSIG2 for TC; and SERPINF2, C4B, GCK, GATA4, INSR, and LPAL2 for TGs. The proportion of explained phenotypic variance in the subset of studies providing individual-level data was 9.9% for HDL-C, 9.5% for LDL-C, 10.3% for TC, and 8.0% for TGs. This large meta-analysis of lipid phenotypes with the use of a dense gene-centric approach identified multiple SNPs not previously described in established lipid genes and several previously unknown loci. The explained phenotypic variance from this approach was comparable to that from a meta-analysis of GWAS data, suggesting that a focused genotyping approach can further increase the understanding of heritability of plasma lipids. PMID:23063622

  5. Proteomic analysis of mammalian sperm cells identifies new components of the centrosome

    OpenAIRE

    Firat-Karalar, Elif Nur; Sante, Joshua; Elliott, Sarah; Stearns, Tim

    2014-01-01


  6. Towards a typology of business process management professionals: identifying patterns of competences through latent semantic analysis

    DEFF Research Database (Denmark)

    Müller, Oliver; Schmiedel, Theresa; Gorbacheva, Elena; vom Brocke, Jan

    2016-01-01

    While researchers have analysed the organisational competences that are required for successful Business Process Management (BPM) initiatives, individual BPM competences have not yet been studied in detail. In this study, latent semantic analysis is used to examine a collection of 1507 BPM-related job advertisements in order to develop a typology of BPM professionals. This empirical analysis reveals distinct ideal types and profiles of BPM professionals on several levels of abstraction. A closer look at these ideal types and profiles confirms that BPM is a boundary-spanning field that requires interdisciplinary sets of competence that range from technical competences to business and systems competences. Based on the study's findings, it is posited that individual and organisational alignment with the identified ideal types and profiles is likely to result in high employability and organisational BPM success.

  7. Use of Antibiotic Resistance Analysis To Identify Nonpoint Sources of Fecal Pollution

    OpenAIRE

    Wiggins, B A; Andrews, R. W.; Conway, R. A.; Corr, C. L.; Dobratz, E. J.; Dougherty, D. P.; Eppard, J. R.; Knupp, S. R.; Limjoco, M. C.; Mettenburg, J. M.; Rinehardt, J. M.; Sonsino, J.; Torrijos, R. L.; Zimmerman, M.E.

    1999-01-01

    A study was conducted to determine the reliability and repeatability of antibiotic resistance analysis as a method of identifying the sources of fecal pollution in surface water and groundwater. Four large sets of isolates of fecal streptococci (from 2,635 to 5,990 isolates per set) were obtained from 236 samples of human sewage and septage, cattle and poultry feces, and pristine waters. The patterns of resistance of the isolates to each of four concentrations of up to nine antibiotics were a...
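
    The classification step of antibiotic resistance analysis can be illustrated with a discriminant analysis over binary resistance patterns, as sketched below. The data are synthetic and the two-source setup is a simplification of the study's multi-source design:

      # Sketch of classifying fecal-source species from antibiotic resistance
      # patterns with discriminant analysis (synthetic 0/1 resistance data).
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(7)
      # Rows: isolates; columns: resistant (1) / susceptible (0) at each of
      # nine antibiotic concentrations (hypothetical encoding).
      X_human = (rng.random((50, 9)) < 0.7).astype(int)
      X_cattle = (rng.random((50, 9)) < 0.3).astype(int)
      X = np.vstack([X_human, X_cattle])
      y = ["human"] * 50 + ["cattle"] * 50

      clf = LinearDiscriminantAnalysis().fit(X, y)
      print("resubstitution accuracy:", clf.score(X, y))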

  8. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituent material, heritage authentication and restoration, provenance, migration, social interaction, and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged, respectively, to a child (1 individual, medium level of preservation, 9 +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 µm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  9. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituent material, heritage authentication and restoration, provenance, migration, social interaction, and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged, respectively, to a child (1 individual, medium level of preservation, 9 +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  10. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The appropriateness of the random assignment of compounds to training and validation sets was assessed using a generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. The assignment of compounds was investigated using five variables: the observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance, using cross-validation with a v-fold of 10, were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to both the training and the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed in every cluster. The presence of training and test compounds in all clusters identified by generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
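
    A minimal sketch of the cluster-based check: if K-means clusters built from the activity and descriptor variables all contain both training and test compounds, the random split is judged representative (toy data; the study used one activity and four descriptors over carboquinone derivatives):

      # Sketch: cluster the pooled compounds, then count training and test
      # members per cluster.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      features = rng.normal(size=(30, 5))        # activity + 4 descriptors
      split = np.array(["train"] * 20 + ["test"] * 10)

      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
      for k in range(3):
          members = split[labels == k]
          print(k, dict(zip(*np.unique(members, return_counts=True))))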

  11. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente;

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10^8 Pa using particle diameters of 1.7 μm. This increases the efficiency, the resolution and the speed of the separation. Four aqueous selenium standards were separated within 1.2 min on a 1.00 id x 50 mm reversed phase column in an ion-pair chromatographic system using a flow rate of 200 μL min(-1). Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection limits were comparable to values obtained by HPLC. Detection limits were better than 0.4 μg Se L(-1). A urine sample was analysed on a 1.0 id x 100 mm column within 5 min using a flow rate of 100 μL min(-1). The improved separation efficiency, owing to the use of 1.7 μm column particles, allowed the

  12. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells that may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag+, Ba2+, and Cd2+ in the concentration range from 10 ng/g to 1 μg/g; for Cu2+ and Pb2+ from 10 ng/g to 5 μg/g; and for Hg2+ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag+, Ba2+, Cd2+, Cu2+, and Pb2+ share common binding sites, with binding efficiencies varying in the sequence Pb2+ > Cu2+ > Ag+ > Cd2+ > Ba2+. The binding of Hg2+ involved a different binding site, with an increase in binding efficiency in the presence of Ag+. (orig.)

  13. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed

  14. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  15. Comparative Analysis of Different LIDAR System Calibration Techniques

    Science.gov (United States)

    Miller, M.; Habib, A.

    2016-06-01

    With light detection and ranging (LiDAR) now being a crucial tool for engineering products and on-the-fly spatial analysis, it is necessary for the user community to have standardized calibration methods. The three methods in this study were developed and proven by the Digital Photogrammetry Research Group (DPRG) for airborne LiDAR systems and are as follows: Simplified, Quasi-Rigorous, and Rigorous. In lieu of using expensive control surfaces for calibration, these methods compare overlapping LiDAR strips to estimate the systematic errors. The systematic errors quantified by these methods include the lever arm biases, boresight biases, range bias and scan angle scale bias. These three methods comprehensively represent all of the possible flight configurations and data availability, and this paper tests the limits of the method with the most assumptions, the simplified calibration, by using data that violate the assumptions its math model is based on, comparing the results to the quasi-rigorous and rigorous techniques. The overarching goal is to provide a LiDAR system calibration that does not require raw measurements and can be carried out with minimal control and flight lines to reduce costs. This testing is unique because the terrain used for calibration does not contain gable roofs; all other LiDAR system calibration testing and development has been done with terrain containing features with high geometric integrity, such as gable roofs.

  16. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  17. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions, and 2. on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. The correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide new metrics for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
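
    A minimal sketch of the underlying correlation measure on ink-density series (values invented; the study's metric is computed across tone-reproduction measurements at several torque levels):

      # Sketch of the correlation check on printed ink-density series from a
      # blanket before and after a smash event.
      import numpy as np

      density_before = np.array([1.42, 1.40, 1.43, 1.41, 1.39, 1.42])
      density_after  = np.array([1.35, 1.33, 1.38, 1.36, 1.30, 1.37])

      r = np.corrcoef(density_before, density_after)[0, 1]
      print(f"consistency correlation: {r:.3f}")   # near 1.0 = good recovery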

  18. An analysis of spectral transformation techniques on graphs

    Science.gov (United States)

    Djurović, Igor; Sejdić, Ervin; Bulatović, Nikola; Simeunović, Marko

    2015-05-01

    Emerging methods for the spectral analysis of graphs are analyzed in this paper, as graphs are currently used to study interactions in many fields, from neuroscience to social networks. There are two main approaches to the spectral transformation of graphs. The first approach is based on the Laplacian matrix: the graph Fourier transform is defined as an expansion of a graph signal in terms of eigenfunctions of the graph Laplacian, and the calculated eigenvalues carry the notion of frequency of graph signals. The second approach is based on the weighted adjacency matrix of the graph, as it expands the graph signal into a basis of eigenvectors of the adjacency matrix instead of the graph Laplacian; the notion of frequency is then obtained from the eigenvalues of the adjacency matrix or its Jordan decomposition. In this paper, advantages and drawbacks of both approaches are examined. Potential challenges and improvements to graph spectral processing methods are considered, as well as the generalization of graph processing techniques in the spectral domain, its extension to the time-frequency domain, and other potential extensions of classical signal processing concepts to graph datasets. Lastly, an overview of compressive sensing concepts on graphs is given.
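
    The Laplacian-based transform the paper describes can be sketched directly: expand a graph signal in the eigenvectors of L = D - A, with the eigenvalues acting as frequencies (the 4-node graph and signal below are invented):

      # Sketch of the graph Fourier transform via Laplacian eigendecomposition.
      import numpy as np

      A = np.array([[0, 1, 1, 0],     # adjacency matrix of a 4-node graph
                    [1, 0, 1, 0],
                    [1, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
      L = np.diag(A.sum(axis=1)) - A            # combinatorial Laplacian

      eigenvalues, U = np.linalg.eigh(L)        # eigenvalues act as frequencies
      signal = np.array([1.0, 2.0, 1.5, -0.5])  # one value per vertex

      spectrum = U.T @ signal                   # forward GFT
      reconstructed = U @ spectrum              # inverse GFT
      print(np.allclose(reconstructed, signal), eigenvalues.round(3))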

  19. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo 'Eigenvalue of the World' problem. At that session, Argonne presented a paper in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result compared with conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results
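
    A loose sketch of the general idea, under the assumption that stratification here means drawing a fixed number of fission source sites from each array unit rather than from the pooled source (the strata, weights, and re-weighting scheme below are illustrative, not the paper's implementation):

      # Sketch of stratified source-sampling: sample each unit (stratum)
      # separately so loosely coupled units cannot be underrepresented.
      import numpy as np

      rng = np.random.default_rng(11)
      strata = {"unit_A": np.array([0.9, 1.1, 1.0]),     # site weights per unit
                "unit_B": np.array([0.2, 0.1]),
                "unit_C": np.array([0.05])}

      samples_per_stratum = 4
      source_bank = []
      for unit, weights in strata.items():
          p = weights / weights.sum()
          picks = rng.choice(len(weights), size=samples_per_stratum, p=p)
          # Re-weight so each stratum keeps its true share of the total source.
          site_weight = weights.sum() / samples_per_stratum
          source_bank += [(unit, i, site_weight) for i in picks]

      print(source_bank[:4])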

  20. Identifying Chemistry Prospective Teachers' Difficulties Encountered in Practice of The Subject Area Textbook Analysis Course

    Directory of Open Access Journals (Sweden)

    Zeynep Bak Kibar

    2010-12-01

    Full Text Available Prospective teachers should be aware of possible mistakes in textbooks and should know the procedures and criteria for textbook selection. This knowledge is imparted to prospective teachers in the Subject Area Textbook Analysis course. Identifying the difficulties they encounter and the skills they gain is important for implementing this course effectively. To investigate these problems, a case study was conducted with 38 student teachers from the Chemistry Teaching Program of the Department of Secondary Science and Mathematics Education at the Fatih Faculty of Education, Karadeniz Technical University. The results suggest that the prospective teachers gained knowledge of research, teaching practice, report writing, and textbook analysis. It was also determined that they had difficulties with group work, literature review, report writing, textbook analysis, and critical analysis.

  1. Gene expression meta-analysis identifies metastatic pathways and transcription factors in breast cancer

    DEFF Research Database (Denmark)

    Thomassen, Mads; Tan, Qihua; Kruse, Torben

    2008-01-01

    studies. Besides classification of outcome, these global expression patterns may reflect biological mechanisms involved in metastasis of breast cancer. Our purpose has been to investigate pathways and transcription factors involved in metastasis by use of gene expression data sets. METHODS: We have … tumors compared to non-metastasizing tumors. Meta-analysis has been used to determine the overrepresentation of pathways and transcription factor targets concordantly deregulated in metastasizing breast tumors across several data sets. RESULTS: The major findings are upregulation of cell cycle pathways and a … system, angiogenesis, DNA repair and several signal transduction pathways are associated with metastasis. Finally, several transcription factors, e.g. E2F, NFY, and YY1, are identified as being involved in metastasis. CONCLUSIONS: By pathway meta-analysis many biological mechanisms beyond major

  2. Whole Genome Analysis of Injectional Anthrax Identifies Two Disease Clusters Spanning More Than 13 Years

    Directory of Open Access Journals (Sweden)

    Paul Keim

    2015-11-01

    Lay Person Interpretation: Injectional anthrax has been plaguing heroin drug users across Europe for more than 10 years. In order to better understand this outbreak, we assessed genomic relationships of all available injectional anthrax strains from four countries spanning a >12 year period. Very few differences were identified using genome-based analysis, but these differentiated the isolates into two distinct clusters. This strongly supports a hypothesis of at least two separate anthrax spore contamination events perhaps during the drug production processes. Identification of two events would not have been possible from standard epidemiological analysis. These comprehensive data will be invaluable for classifying future injectional anthrax isolates and for future geographic attribution.

  3. Towards a typology of business process management professionals: identifying patterns of competences through latent semantic analysis

    Science.gov (United States)

    Müller, Oliver; Schmiedel, Theresa; Gorbacheva, Elena; vom Brocke, Jan

    2016-01-01

    While researchers have analysed the organisational competences that are required for successful Business Process Management (BPM) initiatives, individual BPM competences have not yet been studied in detail. In this study, latent semantic analysis is used to examine a collection of 1507 BPM-related job advertisements in order to develop a typology of BPM professionals. This empirical analysis reveals distinct ideal types and profiles of BPM professionals on several levels of abstraction. A closer look at these ideal types and profiles confirms that BPM is a boundary-spanning field that requires interdisciplinary sets of competence that range from technical competences to business and systems competences. Based on the study's findings, it is posited that individual and organisational alignment with the identified ideal types and profiles is likely to result in high employability and organisational BPM success.
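
    A minimal sketch of such a pipeline: TF-IDF vectorization, truncated SVD (the LSA step), then clustering into candidate profiles. The five-ad corpus is a toy stand-in for the 1507 advertisements, and the cluster count is arbitrary:

      # Sketch of latent semantic analysis over job advertisements.
      from sklearn.cluster import KMeans
      from sklearn.decomposition import TruncatedSVD
      from sklearn.feature_extraction.text import TfidfVectorizer

      ads = ["process modelling BPMN lean six sigma",
             "java developer workflow automation camunda",
             "change management stakeholder communication",
             "process mining data analysis python",
             "business analyst requirements process improvement"]

      X = TfidfVectorizer().fit_transform(ads)
      lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
      profiles = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(lsa)
      print(profiles)   # cluster id per advertisement = candidate ideal type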

  4. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N; Sprenger, Richard Remko

    2013-01-01

    combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for......The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a...

  5. COMBINED GEOPHYSICAL INVESTIGATION TECHNIQUES TO IDENTIFY BURIED WASTE IN AN UNCONTROLLED LANDFILL AT THE PADUCAH GASEOUS DIFFUSION PLANT, KENTUCKY

    International Nuclear Information System (INIS)

    The survey used a 200 megahertz (MHz) antenna to provide the maximum depth penetration and subsurface detail, yielding usable signals to a depth of about 6 to 10 feet in this environment and allowing discrimination of deeper objects; this was particularly useful in the southern area of the site, where shallow metallic debris (primarily roof flashing) complicated interpretation of the EM and magnetic data. Several geophysical anomalies defined on the contour plots indicated the presence of buried metal. During the first phase of the project, nine anomalies or anomalous areas were detected. The sizes, shapes, and magnitudes of the anomalies varied considerably, but given the anticipated size of the primary target of the investigation, only the most prominent anomalies were considered as potential caches of 30 to 60 buried drums. After completion of a second-phase investigation, only two of the anomalies were of sufficient magnitude, not identifiable with known existing metallic objects such as monitoring wells, and in positions corresponding to the location of alleged dumping activities; these were recommended for further, intrusive investigation. Other important findings, based on the variable-frequency EM method and its combination with total-field magnetic and GPR data, included confirmation of the position of the old NSDD, the ability to differentiate between ferrous and non-ferrous anomalies, and the detection of what may be plumes emanating from the landfill cell.

  6. COMBINED GEOPHYSICAL INVESTIGATION TECHNIQUES TO IDENTIFY BURIED WASTE IN AN UNCONTROLLED LANDFILL AT THE PADUCAH GASEOUS DIFFUSION PLANT, KENTUCKY

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Peter T.; Starmer, R. John

    2003-02-27

    The survey used a 200 megahertz (MHz) antenna to provide the maximum depth penetration and subsurface detail, yielding usable signals to a depth of about 6 to 10 feet in this environment and allowing discrimination of deeper objects; this was particularly useful in the southern area of the site, where shallow metallic debris (primarily roof flashing) complicated interpretation of the EM and magnetic data. Several geophysical anomalies defined on the contour plots indicated the presence of buried metal. During the first phase of the project, nine anomalies or anomalous areas were detected. The sizes, shapes, and magnitudes of the anomalies varied considerably, but given the anticipated size of the primary target of the investigation, only the most prominent anomalies were considered as potential caches of 30 to 60 buried drums. After completion of a second-phase investigation, only two of the anomalies were of sufficient magnitude, not identifiable with known existing metallic objects such as monitoring wells, and in positions corresponding to the location of alleged dumping activities; these were recommended for further, intrusive investigation. Other important findings, based on the variable-frequency EM method and its combination with total-field magnetic and GPR data, included confirmation of the position of the old NSDD, the ability to differentiate between ferrous and non-ferrous anomalies, and the detection of what may be plumes emanating from the landfill cell.

  7. Study of fatigue damage micromechanisms in a duplex stainless steel by complementary analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    El Bartali, Ahmed; Aubin, Veronique; Degallaix, Suzanne [Laboratoire de Mecanique de Lille, LML UMR CNRS Ecole Centrale de Lille, Villeneuve d' Ascq (France)

    2009-09-15

    The low-cycle fatigue (LCF) damage micromechanisms of a duplex stainless steel are studied at room temperature using complementary analysis techniques. Surface damage is observed in real time with an in-situ microscopic device during a low-cycle fatigue test. The slip systems activated in each grain of each phase are identified from SEM photographs and EBSD measurements. The surface relief that appeared by the end of the test is measured with an interferometric profilometer. Displacement and strain fields at the microstructural scale are calculated from surface images taken during cycling using a digital image correlation (DIC) technique. These observations were combined to analyse the damage mechanisms from slip marking appearance through strain localisation to crack initiation. (Abstract Copyright [2009], Wiley Periodicals, Inc.)

  8. Hot spot analysis applied to identify ecosystem services potential in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Depellegrin, Daniel; Misiune, Ieva

    2016-04-01

    Hot spot analysis is very useful for identifying areas with similar characteristics. This is important for sustainable use of the territory, since we can identify areas that need to be protected or restored. It is a great advantage in terms of land use planning and management, since we can allocate resources, reduce economic costs, and intervene more effectively in the landscape. Ecosystem services (ES) differ according to land use. Since the landscape is very heterogeneous, it is of major importance to understand their spatial pattern: where the areas that provide the most ES are located, and where provision is lower. The objective of this work is to use hot spot analysis to identify the areas with the most valuable ES in Lithuania. The CORINE land cover (CLC) classification of 2006, on a 100 m resolution grid from which a total of 31 land use types were extracted, was used as the main spatial information. ES ranking was carried out based on expert knowledge: experts were asked to evaluate the ES potential of each CLC class from 0 (no potential) to 5 (very high potential). Hot spots were evaluated using the Getis-Ord test, a cluster-identification method available in the ArcGIS toolbox, which identifies areas with significantly low and significantly high values at a p level of 0.05. In this work we used hot spot analysis to assess the distribution of provisioning, regulating, cultural, and total (sum of the previous three) ES. The Z value calculated from the Getis-Ord test was used in the statistical analysis to assess the clusters of provisioning, regulating, cultural, and total ES. A high Z value for an ES indicates a high number of cluster areas with high potential for that ES. The results showed that the Z score differed significantly among services (Kruskal-Wallis ANOVA = 834.607, p < 0.001). The Z score of provisioning services (0.096±2.239) was significantly higher than that of the total (0.093±2.045), cultural (0.080±1.979), and regulating (0.076±1.961) services. These
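
    For readers unfamiliar with the Getis-Ord statistic used above, the following is a minimal sketch of a Gi* hot spot analysis on a gridded ES ranking, assuming binary weights over a 3x3 neighbourhood; the grid of expert scores is simulated, and nothing here reproduces the ArcGIS implementation.

```python
# Getis-Ord Gi* sketch on a fake grid of 0-5 expert ES scores (illustrative only).
import numpy as np

def getis_ord_gi_star(values: np.ndarray) -> np.ndarray:
    """Z-score per grid cell; |z| > 1.96 marks a hot or cold spot at p < 0.05."""
    n = values.size
    mean, std = values.mean(), values.std()
    rows, cols = values.shape
    z = np.zeros_like(values, dtype=float)
    for i in range(rows):
        for j in range(cols):
            r0, r1 = max(0, i - 1), min(rows, i + 2)   # 3x3 neighbourhood,
            c0, c1 = max(0, j - 1), min(cols, j + 2)   # clipped at the edges
            block = values[r0:r1, c0:c1]               # includes the cell itself ("star")
            w = block.size                             # sum of binary weights
            num = block.sum() - mean * w
            den = std * np.sqrt((n * w - w ** 2) / (n - 1))
            z[i, j] = num / den
    return z

rng = np.random.default_rng(0)
es_rank = rng.integers(0, 6, size=(50, 50)).astype(float)   # fake 0-5 expert scores
z = getis_ord_gi_star(es_rank)
print("hot spots:", int((z > 1.96).sum()), "| cold spots:", int((z < -1.96).sum()))
```

    The 1.96 cut-off corresponds to the p = 0.05 significance level used in the study.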

  9. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    Science.gov (United States)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply and limit the speed at which the crew can abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan, and of how that demand changes as a result of changes to the system architecture, enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and the probability of loss of function of the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly complex systems by gradually expanding the system boundary.
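
    As a hedged illustration of the Monte Carlo technique compared in the paper, the sketch below simulates random component failures over a mission and estimates the probability of loss of function for a given number of spares. The failure rate, mission duration, and single-component model are invented placeholders, not CRS or EMAT parameters.

```python
# Monte Carlo spares sizing sketch (assumed exponential failures; toy numbers).
import random

FAILURE_RATE = 1.0 / 8000.0   # assumed failures per operating hour (illustrative)
MISSION_HOURS = 18_000.0      # assumed mission duration (illustrative)
TRIALS = 50_000

def p_loss_of_function(spares: int) -> float:
    """Fraction of simulated missions whose failures exceed the spares carried."""
    losses = 0
    for _ in range(TRIALS):
        t, failures = 0.0, 0
        while True:
            t += random.expovariate(FAILURE_RATE)   # exponential time to next failure
            if t > MISSION_HOURS:
                break
            failures += 1
        if failures > spares:
            losses += 1
    return losses / TRIALS

# sizing curve: spares count (a proxy for spares mass) vs. risk
for k in range(6):
    print(f"{k} spares -> P(loss of function) ~ {p_loss_of_function(k):.4f}")
```

    With exponential failure times the failure count is Poisson, so this estimate can be checked against the analytic tail probability, mirroring the paper's comparison of simulation-based and probabilistic methods.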

  10. Spatial and Temporal Dust Source Variability in Northern China Identified Using Advanced Remote Sensing Analysis

    Science.gov (United States)

    Taramelli, A.; Pasqui, M.; Barbour, J.; Kirschbaum, D.; Bottai, L.; Busillo, C.; Calastrini, F.; Guarnieri, F.; Small, C.

    2013-01-01

    The aim of this research is to provide a detailed characterization of spatial patterns and temporal trends in the regional and local dust source areas within the desert of the Alashan Prefecture (Inner Mongolia, China). The problem was approached through multi-scale remote sensing analysis of vegetation changes. The primary requirements for this regional analysis are high spatial and spectral resolution data, accurate spectral calibration, and good temporal resolution with a suitable temporal baseline. Landsat analysis and field validation, along with lower spatial resolution classifications from MODIS and AVHRR, are combined to provide a reliable characterization of the different potential dust-producing sources. The representation of intra-annual and inter-annual Normalized Difference Vegetation Index (NDVI) trends, used to assess land cover discrimination for mapping potential dust sources with MODIS and AVHRR at the larger scale, is enhanced by Landsat Spectral Mixture Analysis (SMA). The combined methodology determines the extent to which Landsat can distinguish important soil types, in order to better understand how soil reflectance behaves at seasonal and inter-annual timescales. As a final result, the soil surface properties mapped using SMA are representative of the responses of the different land and soil covers previously identified by the NDVI trends. The results could be used in dust emission models, even though they do not reflect aggregate formation, soil stability, or particle coatings, which prove critical for accurately representing dust sources over the different regional and local emitting areas.
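
    The sketch below illustrates the core computation behind linear spectral mixture analysis: each pixel spectrum is modelled as a non-negative combination of endmember spectra and solved by least squares. The endmember and pixel values are invented for illustration and are not Landsat measurements from the study area.

```python
# Linear SMA sketch: unmix one pixel into assumed endmember fractions.
import numpy as np
from scipy.optimize import nnls

# assumed endmember spectra over six bands (columns: substrate, vegetation, dark)
E = np.array([
    [0.25, 0.05, 0.02],
    [0.30, 0.08, 0.03],
    [0.38, 0.06, 0.03],
    [0.45, 0.40, 0.04],
    [0.50, 0.22, 0.04],
    [0.48, 0.12, 0.04],
])

pixel = np.array([0.12, 0.15, 0.17, 0.28, 0.26, 0.23])   # one observed spectrum

fractions, rnorm = nnls(E, pixel)       # non-negative least-squares unmixing
fractions /= fractions.sum()            # normalise to sum-to-one abundances
print("substrate / vegetation / dark:", np.round(fractions, 3))
print("residual norm:", round(float(rnorm), 4))
```

    A large residual norm flags pixels whose composition is not well described by the chosen endmembers.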

  11. Identifying Talent in Youth Sport: A Novel Methodology Using Higher-Dimensional Analysis.

    Science.gov (United States)

    Till, Kevin; Jones, Ben L; Cobley, Stephen; Morley, David; O'Hara, John; Chapman, Chris; Cooke, Carlton; Beggs, Clive B

    2016-01-01

    Prediction of adult performance from early age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players were collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p<0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, this suggests that SVD analysis may be a useful analysis tool for research and practice within talent identification. PMID:27224653
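
    A minimal sketch of the general approach described above, combining SVD with ROC analysis, is given below; the simulated player data, the use of the first singular vector, and the Youden-index cut-off are assumptions made for illustration, not the study's optimized model.

```python
# SVD + ROC sketch on simulated anthropometric/fitness data (illustrative only).
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
n_players, n_features = 200, 8                # players x anthropometric/fitness tests
X = rng.normal(size=(n_players, n_features))
pro = rng.integers(0, 2, size=n_players)      # 1 = future professional (simulated)
X[pro == 1] += 0.8                            # inject a separable group difference

Xz = (X - X.mean(axis=0)) / X.std(axis=0)     # z-score each characteristic
U, s, Vt = np.linalg.svd(Xz, full_matrices=False)
scores = Xz @ Vt[0]                           # project players onto first singular vector
if np.corrcoef(scores, pro)[0, 1] < 0:        # SVD sign is arbitrary; orient the scores
    scores = -scores

fpr, tpr, thr = roc_curve(pro, scores)
best = np.argmax(tpr - fpr)                   # Youden's J picks the ROC-optimal cut-off
print(f"cut-off {thr[best]:.2f}: sensitivity {tpr[best]:.2%}, specificity {1 - fpr[best]:.2%}")
```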

  12. Identifying Talent in Youth Sport: A Novel Methodology Using Higher-Dimensional Analysis.

    Directory of Open Access Journals (Sweden)

    Kevin Till

    Full Text Available Prediction of adult performance from early age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players were collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p<0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, this suggests that SVD analysis may be a useful analysis tool for research and practice within talent identification.

  13. Identifying the oil price-macroeconomy relationship: An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate-WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses.
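
    A rough sketch of an EMD-based correlation analysis in this spirit follows, using the third-party PyEMD package (an assumption; any EMD implementation would do). The GDP and oil price series are synthetic stand-ins, not the US data analysed in the paper.

```python
# EMD + component-wise correlation sketch on synthetic quarterly series.
import numpy as np
from PyEMD import EMD   # assumed third-party package (pip: EMD-signal)

rng = np.random.default_rng(0)
t = np.linspace(0, 40, 160)                       # ~40 years of quarterly samples
gdp = 0.02 * t + 0.3 * np.sin(2 * np.pi * t / 8) + 0.1 * rng.standard_normal(t.size)
oil = 20 + 10 * np.sin(2 * np.pi * t / 8 + np.pi) + 2.0 * rng.standard_normal(t.size)

gdp_imfs = EMD()(gdp)    # intrinsic mode functions, fastest oscillation first
oil_imfs = EMD()(oil)

# correlate every oil component with every GDP component; a strong negative
# correlation at matching medium-run scales echoes the paper's principal finding
for i, oimf in enumerate(oil_imfs):
    for j, gimf in enumerate(gdp_imfs):
        r = np.corrcoef(oimf, gimf)[0, 1]
        if abs(r) > 0.5:
            print(f"oil IMF {i} vs GDP IMF {j}: r = {r:+.2f}")
```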

  14. Identifying the oil price-macroeconomy relationship. An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate - WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses. (author)

  15. Computational EST database analysis identifies a novel member of the neuropoietic cytokine family.

    Science.gov (United States)

    Shi, Y; Wang, W; Yourey, P A; Gohari, S; Zukauskas, D; Zhang, J; Ruben, S; Alderson, R F

    1999-08-19

    A novel member of the neuropoietic cytokine family has been cloned and the protein expressed and characterized. In an effort to identify novel secreted proteins, an algorithm incorporating neural networks was applied to a large EST database. A full-length clone was identified that is 1710 bp in length and has a single open reading frame of 225 amino acids. This new cytokine is most homologous to cardiotrophin-1, with a similarity of 46% and an identity of 29%, and we have therefore named it cardiotrophin-like cytokine (CLC). Northern hybridization analysis identified a 1.4-kb messenger RNA that is highly expressed in spleen and peripheral leukocytes. Purified recombinant CLC induced the activation of NFkappaB and SRE reporter constructs in the TF-1, U937, and M1 cell lines. Furthermore, the signal transduction pathway for CLC was characterized in the neuroblastoma cell line SK-N-MC and found to involve tyrosine phosphorylation of gp130 and STAT-1. PMID:10448081

  16. GenSSI: a software toolbox for structural identifiability analysis of biological models

    Science.gov (United States)

    Chiş, Oana; Banga, Julio R.; Balsa-Canto, Eva

    2011-01-01

    Summary: Mathematical modeling has a key role in systems biology. Model building is often regarded as an iterative loop involving several tasks, among which the estimation of unknown parameters of the model from a certain set of experimental data is of central importance. This problem of parameter estimation has many possible pitfalls, and modelers should be very careful to avoid them. Many such difficulties arise from a fundamental (yet often overlooked) property: the so-called structural (or a priori) identifiability, which considers the uniqueness of the estimated parameters. Obviously, the structural identifiability of any tentative model should be checked at the beginning of the model building loop. However, checking this property for arbitrary non-linear dynamic models is not an easy task. Here we present a software toolbox, GenSSI (Generating Series for testing Structural Identifiability), which enables non-expert users to carry out such analysis. The toolbox runs under the popular MATLAB environment and is accompanied by detailed documentation and relevant examples. Availability: The GenSSI toolbox and the related documentation are available at http://www.iim.csic.es/%7Egenssi. Contact: ebalsa@iim.csic.es PMID:21784792
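
    GenSSI itself is a MATLAB toolbox, but the generating-series idea it builds on can be illustrated in a few lines: compute successive Lie derivatives of the output along the vector field and check the rank of their Jacobian with respect to the parameters. The toy model below (two rate constants acting on one observed state) is an assumption chosen to show a classic non-identifiable case; it is not the toolbox's full algorithm.

```python
# Generating-series identifiability sketch with sympy (toy model, not GenSSI).
import sympy as sp

x, x0, k1, k2 = sp.symbols('x x0 k1 k2', positive=True)
f = -(k1 + k2) * x     # dynamics: dx/dt with two rate constants on one state
h = x                  # observation: y = x

# successive Lie derivatives of h along f are the generating-series coefficients
coeffs = [h]
for _ in range(3):
    coeffs.append(sp.diff(coeffs[-1], x) * f)
series = [c.subs(x, x0) for c in coeffs]   # evaluate at the initial condition

# local structural identifiability requires the parameter Jacobian to have full rank
J = sp.Matrix(series).jacobian([k1, k2])
print("rank:", J.rank(), "of 2  ->  only the sum k1 + k2 is identifiable")
```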

  17. Using FAME Analysis to Compare, Differentiate, and Identify Multiple Nematode Species

    Science.gov (United States)

    Sekora, Nicholas S.; Agudelo, Paula; van Santen, Edzard; McInroy, John A.

    2009-01-01

    We have adapted the Sherlock® Microbial Identification system for identification of plant parasitic nematodes based on their fatty acid profiles. Fatty acid profiles of 12 separate plant parasitic nematode species have been determined using this system. Additionally, separate profiles have been developed for Rotylenchulus reniformis and Meloidogyne incognita based on their host plant, four species and three races within the Meloidogyne genus, and three life stages of Heterodera glycines. Statistically, 85% of these profiles can be delimited from one another; the specific comparisons between the cyst and vermiform stages of H. glycines, M. hapla and M. arenaria, and M. arenaria and M. javanica cannot be segregated using canonical analysis. By incorporating each of these fatty acid profiles into the Sherlock® Analysis Software, 20 library entries were created. While there was some similarity among profiles, all entries correctly identified the proper organism to genus, species, race, life stage, and host at greater than 86% accuracy. The remaining 14% were correctly identified to genus, although species and race may not be correct due to the underlying variables of host or life stage. These results are promising and indicate that this library could be used by diagnostic labs to improve response times. PMID:22736811
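
    The sketch below shows the kind of canonical (linear) discriminant analysis referred to above, applied to fatty acid profiles; the profile matrix and species labels are simulated placeholders, not Sherlock® library entries.

```python
# Canonical discriminant analysis sketch on fake fatty acid profiles.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
species = np.repeat(["R. reniformis", "M. incognita", "H. glycines"], 30)
profiles = rng.normal(size=(90, 12))               # 12 fatty-acid percentages (fake)
profiles[species == "M. incognita", :4] += 1.0     # inject species-specific signatures
profiles[species == "H. glycines", 4:8] += 1.0

lda = LinearDiscriminantAnalysis(n_components=2)   # two canonical variates
variates = lda.fit_transform(profiles, species)    # 2-D space for separation plots
print("resubstitution accuracy:", lda.score(profiles, species))
```

    Species whose profiles overlap in the canonical-variate space, as for the comparisons that could not be segregated above, would show low pairwise accuracy here.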

  18. Large-scale association analysis identifies new risk loci for coronary artery disease.

    Science.gov (United States)

    Deloukas, Panos; Kanoni, Stavroula; Willenborg, Christina; Farrall, Martin; Assimes, Themistocles L; Thompson, John R; Ingelsson, Erik; Saleheen, Danish; Erdmann, Jeanette; Goldstein, Benjamin A; Stirrups, Kathleen; König, Inke R; Cazier, Jean-Baptiste; Johansson, Asa; Hall, Alistair S; Lee, Jong-Young; Willer, Cristen J; Chambers, John C; Esko, Tõnu; Folkersen, Lasse; Goel, Anuj; Grundberg, Elin; Havulinna, Aki S; Ho, Weang K; Hopewell, Jemma C; Eriksson, Niclas; Kleber, Marcus E; Kristiansson, Kati; Lundmark, Per; Lyytikäinen, Leo-Pekka; Rafelt, Suzanne; Shungin, Dmitry; Strawbridge, Rona J; Thorleifsson, Gudmar; Tikkanen, Emmi; Van Zuydam, Natalie; Voight, Benjamin F; Waite, Lindsay L; Zhang, Weihua; Ziegler, Andreas; Absher, Devin; Altshuler, David; Balmforth, Anthony J; Barroso, Inês; Braund, Peter S; Burgdorf, Christof; Claudi-Boehm, Simone; Cox, David; Dimitriou, Maria; Do, Ron; Doney, Alex S F; El Mokhtari, NourEddine; Eriksson, Per; Fischer, Krista; Fontanillas, Pierre; Franco-Cereceda, Anders; Gigante, Bruna; Groop, Leif; Gustafsson, Stefan; Hager, Jörg; Hallmans, Göran; Han, Bok-Ghee; Hunt, Sarah E; Kang, Hyun M; Illig, Thomas; Kessler, Thorsten; Knowles, Joshua W; Kolovou, Genovefa; Kuusisto, Johanna; Langenberg, Claudia; Langford, Cordelia; Leander, Karin; Lokki, Marja-Liisa; Lundmark, Anders; McCarthy, Mark I; Meisinger, Christa; Melander, Olle; Mihailov, Evelin; Maouche, Seraya; Morris, Andrew D; Müller-Nurasyid, Martina; Nikus, Kjell; Peden, John F; Rayner, N William; Rasheed, Asif; Rosinger, Silke; Rubin, Diana; Rumpf, Moritz P; Schäfer, Arne; Sivananthan, Mohan; Song, Ci; Stewart, Alexandre F R; Tan, Sian-Tsung; Thorgeirsson, Gudmundur; van der Schoot, C Ellen; Wagner, Peter J; Wells, George A; Wild, Philipp S; Yang, Tsun-Po; Amouyel, Philippe; Arveiler, Dominique; Basart, Hanneke; Boehnke, Michael; Boerwinkle, Eric; Brambilla, Paolo; Cambien, Francois; Cupples, Adrienne L; de Faire, Ulf; Dehghan, Abbas; Diemert, Patrick; Epstein, Stephen E; Evans, Alun; Ferrario, Marco M; Ferrières, Jean; Gauguier, Dominique; Go, Alan S; Goodall, Alison H; Gudnason, Villi; Hazen, Stanley L; Holm, Hilma; Iribarren, Carlos; Jang, Yangsoo; Kähönen, Mika; Kee, Frank; Kim, Hyo-Soo; Klopp, Norman; Koenig, Wolfgang; Kratzer, Wolfgang; Kuulasmaa, Kari; Laakso, Markku; Laaksonen, Reijo; Lee, Ji-Young; Lind, Lars; Ouwehand, Willem H; Parish, Sarah; Park, Jeong E; Pedersen, Nancy L; Peters, Annette; Quertermous, Thomas; Rader, Daniel J; Salomaa, Veikko; Schadt, Eric; Shah, Svati H; Sinisalo, Juha; Stark, Klaus; Stefansson, Kari; Trégouët, David-Alexandre; Virtamo, Jarmo; Wallentin, Lars; Wareham, Nicholas; Zimmermann, Martina E; Nieminen, Markku S; Hengstenberg, Christian; Sandhu, Manjinder S; Pastinen, Tomi; Syvänen, Ann-Christine; Hovingh, G Kees; Dedoussis, George; Franks, Paul W; Lehtimäki, Terho; Metspalu, Andres; Zalloua, Pierre A; Siegbahn, Agneta; Schreiber, Stefan; Ripatti, Samuli; Blankenberg, Stefan S; Perola, Markus; Clarke, Robert; Boehm, Bernhard O; O'Donnell, Christopher; Reilly, Muredach P; März, Winfried; Collins, Rory; Kathiresan, Sekar; Hamsten, Anders; Kooner, Jaspal S; Thorsteinsdottir, Unnur; Danesh, John; Palmer, Colin N A; Roberts, Robert; Watkins, Hugh; Schunkert, Heribert; Samani, Nilesh J

    2013-01-01

    Coronary artery disease (CAD) is the commonest cause of death. Here, we report an association analysis in 63,746 CAD cases and 130,681 controls identifying 15 loci reaching genome-wide significance, taking the number of susceptibility loci for CAD to 46, and a further 104 independent variants (r(2) < 0.2) strongly associated with CAD at a 5% false discovery rate (FDR). Together, these variants explain approximately 10.6% of CAD heritability. Of the 46 genome-wide significant lead SNPs, 12 show a significant association with a lipid trait, and 5 show a significant association with blood pressure, but none is significantly associated with diabetes. Network analysis with 233 candidate genes (loci at 10% FDR) generated 5 interaction networks comprising 85% of these putative genes involved in CAD. The four most significant pathways mapping to these networks are linked to lipid metabolism and inflammation, underscoring the causal role of these activities in the genetic etiology of CAD. Our study provides insights into the genetic basis of CAD and identifies key biological pathways. PMID:23202125
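
    For the 5% false discovery rate threshold mentioned above, the standard tool is the Benjamini-Hochberg step-up procedure; a minimal sketch on simulated SNP p-values (not the CAD association data) follows.

```python
# Benjamini-Hochberg FDR sketch on simulated association p-values.
import numpy as np

def benjamini_hochberg(pvals: np.ndarray, q: float = 0.05) -> np.ndarray:
    """Boolean mask of hypotheses declared significant at FDR level q."""
    m = pvals.size
    order = np.argsort(pvals)
    ranked = pvals[order]
    below = ranked <= q * np.arange(1, m + 1) / m   # BH step-up criterion
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()              # largest rank satisfying it
        mask[order[:k + 1]] = True
    return mask

rng = np.random.default_rng(0)
pvals = np.concatenate([rng.uniform(0, 1e-4, 100),     # simulated true associations
                        rng.uniform(0, 1, 10_000)])    # simulated null SNPs
print("SNPs significant at 5% FDR:", int(benjamini_hochberg(pvals).sum()))
```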

  19. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    Science.gov (United States)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining the critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying the very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back and discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology

  20. Identifying past fire regimes throughout the Holocene in Ireland using new and established methods of charcoal analysis

    Science.gov (United States)

    Hawthorne, Donna; Mitchell, Fraser J. G.

    2016-04-01

    Globally, in recent years there has been an increase in the scale, intensity, and destructiveness of wildfires. This can be seen in Ireland, where significant changes in vegetation, land use, agriculture, and policy have promoted an increase in fires in the Irish landscape. This study looks at wildfire throughout the Holocene, drawing on lacustrine charcoal records from seven study sites spread across Ireland to reconstruct the past fire regime recorded at each site. The work combines new and established methods of fire history reconstruction to provide a recommended analytical procedure for statistical charcoal analysis. Digital charcoal counting was used, and fire regime reconstructions were carried out with the CharAnalysis programme. To verify this record, new techniques are employed: an ensemble-member strategy to remove the subjectivity associated with parameter selection, a signal-to-noise index to determine whether the charcoal record is appropriate for peak detection, and a charcoal peak screening procedure to validate the identified fire events based on bootstrapped samples. This analysis represents the first study of its kind in Ireland, examining the past record of fire on a multi-site, palaeoecological timescale, and will provide a baseline of data that can be built on in the future, when the frequency and intensity of fire are predicted to increase.
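
    A minimal sketch of threshold-based charcoal peak detection in the spirit of CharAnalysis follows: smooth a slowly varying charcoal background, take the high-frequency residual, and flag values above a high quantile as candidate fire events. The series, window length, and threshold are illustrative assumptions, not the programme's defaults.

```python
# Charcoal peak detection sketch on a simulated lacustrine record.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(0, 10_000, 20)                 # a 20-yr resampled record (fake)
charcoal = rng.gamma(2.0, 1.0, years.size)       # background charcoal influx
fires = rng.choice(years.size, 25, replace=False)
charcoal[fires] += rng.gamma(5.0, 3.0, 25)       # superimposed fire-event peaks

def moving_median(x: np.ndarray, win: int = 25) -> np.ndarray:
    """Slowly varying background estimated with a centred moving median."""
    pad = win // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([np.median(xp[i:i + win]) for i in range(x.size)])

background = moving_median(charcoal)
residual = charcoal - background                 # the high-frequency "peak" component
threshold = np.quantile(residual, 0.95)          # simple global threshold
events = years[residual > threshold]
print(f"{events.size} candidate fire events, e.g. at years {events[:5]}")
```

    The ensemble-member and bootstrap-screening steps described above would repeat this detection over many parameter choices and keep only peaks that survive across the ensemble.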