WorldWideScience

Sample records for analysis techniques identifies

  1. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others]

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  2. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    Science.gov (United States)

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

    Clustering on Social Learning Networks has not yet been widely explored, especially when the network is centred on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…

  3. Neutron activation analysis techniques for identifying elemental status in Alzheimer's disease

    International Nuclear Information System (INIS)

    Brain tissue (hippocampus and cerebral cortex) from Alzheimer's disease and control individuals sampled from Eastern Canada and the United Kingdom was analyzed for Ag, Al, As, B, Br, Ca, Cd, Co, Cr, Cs, Cu, Fe, Hg, I, K, La, Mg, Mn, Mo, Ni, Rb, S, Sb, Sc, Se, Si, Sn, Sr, Ti, V and Zn. Neutron activation analysis (thermal and prompt gamma-ray) methods were used. Very highly significant differences (S**: probability less than 0.005) for both study areas were shown between Alzheimer's disease (AD) and control (C) individuals: AD > C for Al, Br, Ca and S, and AD < C for Se, V and Zn. Aluminium content of brain tissue ranged from 3.605 to 21.738 μg/g d.w. (AD) and 0.379 to 4.768 μg/g d.w. (C). No statistical evidence of aluminium accumulation with age was noted. Possible zinc deficiency (especially for hippocampal tissue) was observed in Alzheimer's disease patients, with zinc ranges of 31.42 to 57.91 μg/g d.w. (AD) and 37.31 to 87.10 μg/g d.w. (C). (author)

  4. Application of Principal Component Analysis to NIR Spectra of Phyllosilicates: A Technique for Identifying Phyllosilicates on Mars

    Science.gov (United States)

    Rampe, E. B.; Lanza, N. L.

    2012-01-01

    Orbital near-infrared (NIR) reflectance spectra of the martian surface from the OMEGA and CRISM instruments have identified a variety of phyllosilicates in Noachian terrains. The types of phyllosilicates present on Mars have important implications for the aqueous environments in which they formed and, thus, for recognizing locales that may have been habitable. Current identifications of phyllosilicates from martian NIR data are based on the positions of spectral absorptions relative to laboratory data of well-characterized samples and on spectral ratios; however, some phyllosilicates can be difficult to distinguish from one another with these methods (e.g. illite vs. muscovite). Here we employ a multivariate statistical technique, principal component analysis (PCA), to differentiate between spectrally similar phyllosilicate minerals. PCA is commonly used in a variety of industries (pharmaceutical, agricultural, viticultural) to discriminate between samples. Previous work using PCA to analyze raw NIR reflectance data from mineral mixtures has shown that this is a viable technique for identifying mineral types, abundances, and particle sizes. Here, we evaluate PCA of second-derivative NIR reflectance data as a method for classifying phyllosilicates and test whether this method can be used to identify phyllosilicates on Mars.
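
    The classification workflow described here lends itself to a compact sketch. The fragment below, a minimal illustration and not the authors' pipeline, computes second-derivative spectra with a Savitzky-Golay filter and projects them onto principal components; the `spectra` matrix, its dimensions and the filter settings are hypothetical placeholders.

        # Sketch: PCA of second-derivative NIR reflectance spectra (Python).
        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        spectra = rng.random((30, 500))      # placeholder (samples x wavelengths)

        # Second derivative via Savitzky-Golay smoothing (window/order assumed)
        d2 = savgol_filter(spectra, window_length=15, polyorder=3, deriv=2, axis=1)

        pca = PCA(n_components=3)
        scores = pca.fit_transform(d2)             # sample scores in PC space
        print(pca.explained_variance_ratio_)       # variance captured per PC
        # Spectrally similar phyllosilicates would then be separated by
        # inspecting how samples cluster in the score plot.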

  5. MALDI-TOF and SELDI-TOF analysis: “tandem” techniques to identify potential biomarker in fibromyalgia

    Directory of Open Access Journals (Sweden)

    A. Lucacchini

    2011-11-01

    Fibromyalgia (FM) is characterized by the presence of chronic widespread pain throughout the musculoskeletal system and diffuse tenderness. Unfortunately, no laboratory tests have been appropriately validated for FM and correlated with its subsets and activity. The aim of this study was to apply a proteomic technique to the saliva of FM patients: Surface-Enhanced Laser Desorption/Ionization Time-of-Flight (SELDI-TOF). For this study, 57 FM patients and 35 healthy control (HC) subjects were enrolled. The proteomic analysis of saliva was carried out using SELDI-TOF, performed with different chip arrays having different binding characteristics. The statistical analysis was performed using cluster analysis, and differences between the two groups were assessed using Student's t-test. Spectral analysis highlighted the presence of several peaks differentially expressed in FM patients compared with controls. The preliminary results obtained by SELDI-TOF analysis were compared with those obtained in our previous study performed on whole saliva of FM patients using electrophoresis. The m/z of two peaks, increased in FM patients, seem to overlap well with the molecular weights of calgranulin A and C and Rho GDP-dissociation inhibitor 2, which we had found up-regulated in our previous study. These preliminary results show the possibility of identifying potential salivary biomarkers through salivary proteomic analysis with MALDI-TOF and SELDI-TOF in FM patients. The peaks observed allow us to focus on particular pathogenic aspects of FM: the oxidative stress that characterizes this condition, the involvement of proteins related to cytoskeletal arrangements, and central sensitization.
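
    The statistics reported here reduce to a per-peak two-sample comparison. A minimal sketch, assuming hypothetical intensity matrices `fm` and `hc` (patients x peaks) in place of the real SELDI-TOF spectra:

        # Sketch: Student's t-test on each SELDI-TOF peak, FM vs. controls.
        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(1)
        fm = rng.normal(1.2, 0.3, size=(57, 40))   # 57 FM patients, 40 peaks
        hc = rng.normal(1.0, 0.3, size=(35, 40))   # 35 healthy controls

        t, p = ttest_ind(fm, hc, axis=0)           # one test per peak column
        for i in np.where(p < 0.05)[0]:
            print(f"peak {i}: t = {t[i]:.2f}, p = {p[i]:.4f}")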

  6. Identifying Major Techniques of Persuasion.

    Science.gov (United States)

    Makosky, Vivian Parker

    1985-01-01

    The purpose of this class exercise is to increase undergraduate psychology students' awareness of common persuasion techniques used in advertising, including the appeal to or creation of needs, social and prestige suggestion, and the use of emotionally loaded words and images. Television commercials and magazine advertisements are used as…

  7. Application of gene network analysis techniques identifies AXIN1/PDIA2 and endoglin haplotypes associated with bicuspid aortic valve.

    Directory of Open Access Journals (Sweden)

    Eric C Wooten

    Bicuspid Aortic Valve (BAV) is a highly heritable congenital heart defect. The low frequency of BAV (1% of the general population) limits our ability to perform genome-wide association studies. We present the application of four a priori SNP selection techniques, reducing the multiple-testing penalty by restricting analysis to SNPs relevant to BAV in a genome-wide SNP dataset from a cohort of 68 BAV probands and 830 control subjects. Two knowledge-based approaches, CANDID and STRING, were used to systematically identify BAV genes, and their SNPs, from the published literature, microarray expression studies and a genome scan. We additionally tested Functionally Interpolating SNPs (fitSNPs) present on the array; the fourth approach consisted of SNPs selected by Random Forests, a machine learning approach. These approaches reduced the multiple-testing penalty by lowering the fraction of the genome probed to 0.19% of the total, while increasing the likelihood of studying SNPs within relevant BAV genes and pathways. Three loci were identified by CANDID, STRING, and fitSNPs. A haplotype within the AXIN1-PDIA2 locus (p-value of 2.926×10⁻⁶) and a haplotype within the Endoglin gene (p-value of 5.881×10⁻⁴) were found to be strongly associated with BAV. The Random Forests approach identified a SNP on chromosome 3 in association with BAV (p-value 5.061×10⁻⁶). The results presented here support an important role for genetic variants in BAV and provide support for additional studies in well-powered cohorts. Further, these studies demonstrate that leveraging existing expression and genomic data in the context of GWAS studies can identify biologically relevant genes and pathways associated with a congenital heart defect.
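
    Of the four selection approaches, the Random Forests step is the simplest to sketch. The fragment below ranks SNP columns by importance for the case/control label; the genotype matrix is a random stand-in sized from the abstract's cohort (68 + 830 subjects), and nothing here reproduces the authors' actual pipeline.

        # Sketch: Random Forests ranking of SNPs for a case/control label.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(2)
        genotypes = rng.integers(0, 3, size=(898, 1000))  # 0/1/2 allele counts
        is_bav = np.r_[np.ones(68), np.zeros(830)]        # probands, controls

        rf = RandomForestClassifier(n_estimators=500, random_state=0)
        rf.fit(genotypes, is_bav)
        top = np.argsort(rf.feature_importances_)[::-1][:20]
        print("top-ranked SNP columns:", top)             # association candidates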

  8. Identifying learning techniques among high achievers

    OpenAIRE

    Shanmukananda P; L. Padma

    2013-01-01

    Background: In every college, it is noticed that in spite of being exposed to the same teaching modalities and adopting seemingly similar strategies, some students perform much better than their peers. This can be evaluated in the form of better academic performance in the internal assessments they undertake. This project is an endeavor to identify the learning techniques among high achievers which they employ to outperform others. We can also suggest the same to the medium and low achievers ...

  9. Identifying learning techniques among high achievers

    Directory of Open Access Journals (Sweden)

    Shanmukananda P

    2013-04-01

    Background: In every college, it is noticed that in spite of being exposed to the same teaching modalities and adopting seemingly similar strategies, some students perform much better than their peers. This can be evaluated in the form of better academic performance in the internal assessments they undertake. This project is an endeavor to identify the learning techniques which high achievers employ to outperform others. We can then suggest the same to medium and low achievers so that they can improve their academic performance. This study was conducted to identify the different learning techniques adopted by high achievers and to suggest the same to medium and low achievers. Methods: After obtaining clearance from the institutional ethics committee, the high achievers were identified by selecting the upper third of the students, ranked by marks obtained in three consecutive internal assessments, in three consecutive batches. The identity of the students was not revealed. They were then administered an open-ended questionnaire which addressed relevant issues. The most common and feasible techniques will be suggested to the medium and low achievers. Results: The respondents' (n=101) replies were analyzed by calculating the percentages of responses and assessing, on that basis, which techniques were most frequently adopted by these high achievers. Conclusions: High achievers have a diligent study pattern; they not only study regularly, but also engage in group discussions and approach their teachers when in doubt. Additionally, they refer to other sources of information such as the internet, demonstrating a proactive attitude towards their studies. [Int J Basic Clin Pharmacol 2013; 2(2): 203-207]

  10. Nuclear techniques to identify allergenic metals in orthodontic brackets

    International Nuclear Information System (INIS)

    The present study determines the elementary alloy composition of ten commercial brands of brackets, with particular attention to Ni, Cr, and Co, which are confirmed allergenic elements. The nuclear techniques applied in the analyses were X-ray fluorescence (XRF), at the Centre National de la Recherche Scientifique (National Center of Scientific Research), France, and X-ray energy spectrometry (XRES) and instrumental neutron activation analysis (INAA), at CDTN/CNEN, Brazil. The XRES and XRF techniques identified Cr in the 10 samples analyzed and Ni in eight samples. The INAA technique identified the presence of Cr (14% to 19%) and Co (42 to 2400 ppm) in all samples. The semi-quantitative analysis performed by XRF also identified Co in two samples. The techniques were effective in the identification of metals in orthodontic brackets. The elements identified in this study can be considered one of the main reasons for the allergic processes among the patients studied. This finding suggests that patients should be tested for allergy and allergenic sensitivity to metals prior to the prescription of orthodontic devices. (author)

  11. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions and solve all resource management issues, and it represents the first step in drawing up the marketing strategy. Research Methodology: The present paper is a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such…

  12. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis, namely the Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation…

  13. New technique for identifying varieties' resistance to rice blast

    Institute of Scientific and Technical Information of China (English)

    ZHU Peiliang

    1994-01-01

    After 8 years of lab experiments and field tests, an advanced technique for identifying varieties' resistance to rice blast was developed by a research group at the Plant Protection Institute, Zhejiang Academy of Agricultural Sciences. With this technique, the inoculum is prepared on a maize-rice-straw-agar medium, which is suitable for sporulation of most rice blast pathogen isolates.

  14. Identifying appropriate tasks for the preregistration year: modified Delphi technique

    OpenAIRE

    Stewart, J.; O'Halloran, Cath; Harrigan, P; Spencer, J; Barton, J. R.; Singleton, S. J.

    1999-01-01

    Objectives: To identify the tasks that should constitute the work of preregistration house officers, to provide the basis for the development of a self-evaluation instrument. Design: Literature review and modified Delphi technique. Setting: Northern Deanery within the Northern and Yorkshire office of the NHS Executive. Subjects: 67 educational supervisors of preregistration house officers. Main outcome measures: Percentage of agreement by educational supervisors to tasks identified fr…

  15. Identifying fly puparia by clearing technique: application to forensic entomology.

    Science.gov (United States)

    Sukontason, Kabkaew L; Ngern-Klun, Radchadawan; Sripakdee, Duanghatai; Sukontason, Kom

    2007-10-01

    In forensic investigations, immature stages of the fly (egg, larva, or puparium) can be used as entomological evidence at death scenes, not only to estimate the postmortem interval (PMI), analyze toxic substances, and determine the manner of death, but also to indicate the movement of a corpse in homicide cases. Of these immature stages, puparia represent the longest developmental time, which makes them especially useful. However, in order for forensic entomologists to use puparia effectively, it is crucial that they are able to accurately identify the species of fly found in a corpse. Typically, these puparia are similar in general appearance, being coarctate and light brown to dark brown in color, which makes identification difficult. In this study, we report on the clearing technique used to pale the integument of fly puparia, thereby allowing observation of the anterior end (second to fourth segments) and the profile of the posterior spiracle, which are important clues for identification. We used puparia of the blowfly Chrysomya megacephala (F.) as the model species in this experiment. With daily placement in a 20% potassium hydroxide solution and mounting in a clearing medium (Permount®, New Jersey), the profile of the posterior spiracle could be clearly examined under a light microscope beginning on the fifth day after pupation, and the number of papillae in the anterior spiracle could be counted easily starting from the ninth day. Comparison of morphological features of C. megacephala puparia with those of other blowflies (Chrysomya nigripes [Aubertin], Chrysomya rufifacies [Macquart], Chrysomya villeneuvi [Patton], Lucilia cuprina [Wiedemann], and Hemipyrellia ligurriens [Wiedemann]) and a housefly (Musca domestica L.) revealed that the anterior ends and the profiles of the posterior spiracles had markedly distinguishing characteristics. Morphometric analysis of the length and width of puparia, along with the length of the gaps between the posterior spiracles…

  16. Image Techniques for Identifying Sea-Ice Parameters

    Directory of Open Access Journals (Sweden)

    Qin Zhang

    2014-10-01

    The estimation of ice forces is critical to Dynamic Positioning (DP) operations in Arctic waters, and ice conditions are important for the analysis of ice-structure interaction in an ice field. To monitor sea-ice conditions, cameras are used as field observation sensors on mobile sensor platforms in the Arctic. Various image processing techniques, such as Otsu thresholding, k-means clustering, the distance transform, the Gradient Vector Flow (GVF) snake, and mathematical morphology, are then applied to obtain ice concentration, ice types, and floe size distribution from sea-ice images, to ensure safe operations of structures in ice-covered regions. These techniques yield acceptable results, and their effectiveness is demonstrated in case studies.
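
    As an illustration of the first listed technique, the sketch below derives an ice-concentration estimate by Otsu thresholding with scikit-image; the image is a random placeholder, and treating bright pixels as ice is our simplifying assumption.

        # Sketch: ice concentration from a grey-level sea-ice image.
        import numpy as np
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(3)
        img = rng.random((480, 640))        # placeholder grey-level image

        t = threshold_otsu(img)             # automatic global threshold
        ice_mask = img > t                  # bright pixels taken as ice
        print(f"ice concentration: {ice_mask.mean():.1%}")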

  17. DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Food scientists use standards and calibrations to relate the concentration of a compound of interest to the instrumental response. The techniques used include classical, single point, and inverse calibrations, as well as standard addition and internal standards. Several fundamental criteria -- sel...

  18. Review on Identify Kin Relationship Technique in Image

    Directory of Open Access Journals (Sweden)

    Deepak M Ahire

    2015-06-01

    Kin relationships are traditionally defined as ties based on blood. Kinship includes lineal generational bonds (children, parents, grandparents, and great-grandparents), collateral bonds (siblings, cousins, nieces and nephews, and aunts and uncles), and ties with in-laws. An often-made distinction is that between primary kin (members of the families of origin and procreation) and secondary kin (other family members); the former are referred to as “immediate family,” and the latter are generally labelled “extended family.” Marriage, as a principle of kinship, differs from blood in that it can be terminated; given the potential for marital break-up, blood is recognized as the more important principle of kinship. Here we propose a technique to identify kin relationships (a kinship model) using face recognition, splitting the face into subsets such as the forehead, eye, nose, mouth, and cheek areas, described through Gabor features, on an available real-time database.

  19. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c…

  20. Technique for identifying, tracing, or tracking objects in image data

    Science.gov (United States)

    Anderson, Robert J.; Rothganger, Fredrick

    2012-08-28

    A technique for computer vision uses a polygon contour to trace an object. The technique includes rendering a polygon contour superimposed over a first frame of image data. The polygon contour is iteratively refined to more accurately trace the object within the first frame after each iteration. The refinement includes computing image energies along lengths of contour lines of the polygon contour and adjusting positions of the contour lines based at least in part on the image energies.
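
    A toy sketch of this refinement loop, greedily moving each polygon vertex toward higher image energy (here, any precomputed gradient-magnitude map); this simplification is our own illustration, not the patented algorithm.

        # Sketch: greedy vertex refinement over an image-energy map.
        import numpy as np

        def refine(contour, energy, iters=10):
            """contour: (N, 2) int array of (row, col) vertices."""
            h, w = energy.shape
            offsets = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            for _ in range(iters):
                for i, (r, c) in enumerate(contour):
                    cand = [(r + dr, c + dc) for dr, dc in offsets
                            if 0 <= r + dr < h and 0 <= c + dc < w]
                    # move the vertex to the neighbouring pixel of highest energy
                    contour[i] = max(cand, key=lambda rc: energy[rc])
            return contour

        rng = np.random.default_rng(4)
        energy = rng.random((100, 100))     # stand-in for |gradient| of a frame
        square = np.array([[20, 20], [20, 80], [80, 80], [80, 20]])
        print(refine(square, energy))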

  1. Efficiency of different techniques to identify changes in land use

    Science.gov (United States)

    Zornoza, Raúl; Mataix-Solera, Jorge; Guerrero, César

    2013-04-01

    The need for sensitive and efficient methodologies for soil quality evaluation is increasing. The ability to assess soil quality and identify key soil properties that serve as indicators of soil function is complicated by the multiplicity of physical, chemical and biological factors that control soil processes. In the mountain region of the Mediterranean Basin of Spain, almond trees have been cultivated in terraced orchards for centuries. These crops are immersed in the Mediterranean forest scenery, configuring a mosaic landscape where orchards are integrated into the forest masses. In recent decades, almond orchards have been abandoned, leading to an increase in vegetation cover, since abandoned fields are naturally colonized by the surrounding natural vegetation. Soil processes and properties are expected to be associated with vegetation successional dynamics. Thus, the establishment of suitable parameters to monitor soil quality related to land use changes is particularly important to guarantee the regeneration of the mature community. In this study, we selected three land uses: forest, almond orchards, and orchards abandoned 10 to 15 years prior to sampling. Sampling was carried out in four different locations in SE Spain. The main purpose was to evaluate whether changes in management have significantly influenced different sets of soil characteristics. For this purpose, we used a discriminant analysis (DA). The different sets of soil characteristics tested in this study were 1: physical, chemical and biochemical properties; 2: soil near-infrared (NIR) spectra; and 3: phospholipid fatty acids (PLFAs). After the DA performed with sets 1 and 2, the three land uses were clearly separated by the first two discriminant functions, and more than 85% of the samples were correctly classified (grouped). Using the sets 3 and 4 for DA resulted in a slightly better separation of land uses, with more than 85% of the…
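
    The DA step can be sketched with scikit-learn's linear discriminant analysis standing in for the authors' unspecified DA implementation; the soil-property matrix and labels below are random placeholders.

        # Sketch: discriminant analysis of land uses from soil properties.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(5)
        X = rng.random((120, 12))               # e.g. property set 1 (assumed size)
        land_use = rng.integers(0, 3, 120)      # forest / orchard / abandoned

        lda = LinearDiscriminantAnalysis(n_components=2)
        scores = lda.fit_transform(X, land_use)        # two discriminant functions
        correct = (lda.predict(X) == land_use).mean()
        print(f"correctly classified: {correct:.0%}")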

  2. A first countercheck trial to identify irradiated spices with luminescence techniques

    International Nuclear Information System (INIS)

    The Federal Health Office, in collaboration with 4 institutions responsible for food control analysis and 2 research facilities, conducted for the first time a countercheck trial to identify irradiated spices. This test was mainly intended to find out whether chemiluminescence (CL) and thermoluminescence (TL) techniques are appropriate for this purpose. Nine different spices were selected. Approximately 85% of the samples may be identified correctly if both methods are used. However, only 3% of all spices subjected to CL analysis were falsely identified as irradiated, and the proportion of false-positive results is even lower than 1% if only those spices are considered for which CL analysis is well suited. If an additional procedure, such as TL, is applied, it is highly probable that false identification of non-irradiated samples may be excluded. (orig./PW)

  3. Digital Fourier analysis advanced techniques

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to advanced digital Fourier analysis for advanced undergraduate and graduate students. Assuming knowledge of the Fast Fourier Transform, this book covers advanced topics including the Hilbert transform, cepstrum analysis, and the two-dimensional Fourier transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Advanced Techniques" includes practice problems and thorough Appendices. As a central feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. The applet source code in Visual Basic is provided online, enabling advanced students to tweak and change the programs for more sophisticated results. A complete, intuitive guide, "Digital Fourier Analysis - Advanced Techniques" is an essential reference for students in science and engineering.

  4. A saltwater flotation technique to identify unincubated eggs

    Science.gov (United States)

    Devney, C.A.; Kondrad, S.L.; Stebbins, K.R.; Brittingham, K.D.; Hoffman, D.J.; Heinz, G.H.

    2009-01-01

    Field studies on nesting birds sometimes involve questions related to nest initiation dates, length of the incubation period, or changes in parental incubation behavior during various stages of incubation. Some of this information can be best assessed when a nest is discovered before the eggs have undergone any incubation, and this has traditionally been determined by floating eggs in freshwater. However, because the freshwater method is not particularly accurate in identifying unincubated eggs, we developed a more reliable saltwater flotation method. The saltwater method involves diluting a saturated saltwater solution with freshwater until a salt concentration is reached where unincubated eggs sink to the bottom and incubated eggs float to the surface. For Laughing Gulls (Leucophaeus atricilla), floating eggs in freshwater failed to identify 39.0% (N = 251) of eggs that were subsequently found by candling to have undergone incubation prior to collection. By contrast, in a separate collection of gull eggs, no eggs that passed the saltwater test (N = 225) were found by a later candling to have been incubated prior to collection. For Double-crested Cormorants (Phalacrocorax auritus), floating eggs in freshwater failed to identify 15.6% (N = 250) of eggs that had undergone incubation prior to collection, whereas in a separate collection, none of the eggs that passed the saltwater test (N = 85) were found by a later candling to have been incubated prior to collection. Immersion of eggs in saltwater did not affect embryo survival. Although use of the saltwater method is likely limited to colonial species and requires calibrating a saltwater solution, it is a faster and more accurate method of identifying unincubated eggs than the traditional method of floating eggs in freshwater.

  5. Identifying irradiated flours by photo-stimulated luminescence technique

    Science.gov (United States)

    Ramli, Ros Anita Ahmad; Yasir, Muhamad Samudi; Othman, Zainon; Abdullah, Wan Saffiey Wan

    2014-02-01

    Photo-stimulated luminescence (PSL) technique was used in this study to detect gamma irradiation treatment of five types of flours (corn, rice, tapioca, wheat and glutinous rice) at four different doses (0, 0.2, 0.5 and 1 kGy). The signal level was compared with two threshold values (700 and 5000 counts/60 s). With the exception of glutinous rice, all irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All control samples produced negative results, with signals below the lower threshold (700 counts/60 s), suggesting that the samples had not been irradiated. Irradiated glutinous rice samples produced intermediate signals (700-5000 counts/60 s), which were subsequently confirmed using calibrated PSL. The PSL signals remained stable after 90 days of storage. The findings of this study will be useful to facilitate control of food irradiation application in Malaysia.

  6. Identifying irradiated flours by photo-stimulated luminescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Ramli, Ros Anita Ahmad; Yasir, Muhamad Samudi [Faculty of Science and Technology, National University of Malaysia, Bangi, 43000 Kajang, Selangor (Malaysia); Othman, Zainon; Abdullah, Wan Saffiey Wan [Malaysian Nuclear Agency, Bangi 43000 Kajang, Selangor (Malaysia)

    2014-02-12

    Photo-stimulated luminescence (PSL) technique was used in this study to detect gamma irradiation treatment of five types of flours (corn, rice, tapioca, wheat and glutinous rice) at four different doses (0, 0.2, 0.5 and 1 kGy). The signal level was compared with two threshold values (700 and 5000 counts/60 s). With the exception of glutinous rice, all irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All control samples produced negative results, with signals below the lower threshold (700 counts/60 s), suggesting that the samples had not been irradiated. Irradiated glutinous rice samples produced intermediate signals (700-5000 counts/60 s), which were subsequently confirmed using calibrated PSL. The PSL signals remained stable after 90 days of storage. The findings of this study will be useful to facilitate control of food irradiation application in Malaysia.
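
    The two-threshold screening rule reported in both records above reduces to a few lines; the sample names and counts below are invented for illustration.

        # Sketch: PSL screening with the 700 / 5000 counts-per-60-s thresholds.
        def classify_psl(counts_per_60s: int) -> str:
            if counts_per_60s < 700:
                return "negative (not irradiated)"
            if counts_per_60s > 5000:
                return "positive (irradiated)"
            return "intermediate: confirm with calibrated PSL"

        for sample, counts in [("wheat", 12000), ("control corn", 150),
                               ("glutinous rice", 2300)]:
            print(sample, "->", classify_psl(counts))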

  7. Factor analysis identifies subgroups of constipation

    Institute of Scientific and Technical Information of China (English)

    Philip G Dinning; Mike Jones; Linda Hunt; Sergio E Fuentealba; Jamshid Kalanter; Denis W King; David Z Lubowski; Nicholas J Talley; Ian J Cook

    2011-01-01

    AIM: To determine whether distinct symptom groupings exist in a constipated population and whether such groupings might correlate with quantifiable pathophysiological measures of colonic dysfunction. METHODS: One hundred and ninety-one patients presenting to a gastroenterology clinic with constipation and 32 constipated patients responding to a newspaper advertisement completed a 53-item, wide-ranging self-report questionnaire. One hundred of these patients had colonic transit measured scintigraphically. Factor analysis determined whether constipation-related symptoms grouped into distinct aspects of symptomatology. Cluster analysis was used to determine whether individual patients naturally group into distinct subtypes. RESULTS: Cluster analysis yielded a four-cluster solution, with the presence or absence of pain and laxative unresponsiveness providing the main descriptors. In all clusters there was a considerable proportion of patients with demonstrable delayed colonic transit, positive irritable bowel syndrome criteria and regular stool frequency. The majority of patients with these characteristics also reported regular laxative use. CONCLUSION: Factor analysis identified four constipation subgroups, based on severity and laxative unresponsiveness, in a constipated population. However, clear stratification into clinically identifiable groups remains imprecise.
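
    A minimal sketch of the two statistical steps, factor analysis over questionnaire items followed by clustering of patients; the response matrix is a random placeholder, and the component and cluster counts are taken loosely from the abstract rather than from the paper's methods.

        # Sketch: factor analysis of symptom items, then k-means on patients.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(6)
        responses = rng.random((223, 53))       # 191 + 32 patients x 53 items

        fa = FactorAnalysis(n_components=4, random_state=0)
        factor_scores = fa.fit_transform(responses)     # symptom dimensions

        km = KMeans(n_clusters=4, n_init=10, random_state=0)
        labels = km.fit_predict(factor_scores)          # patient subgroups
        print(np.bincount(labels))                      # subgroup sizes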

  8. Study of techniques of identifying the earthquake precursory anomalies in terms of mathematical modeling

    Institute of Scientific and Technical Information of China (English)

    YAN Zun-guo; QIAN Jia-dong; CHEN Jun-hua; LI Sheng-le

    2000-01-01

    This paper deals mainly with the key technique of identifying, without distortion, the anomalous signals that might be precursors associated with earthquakes, from real time series of observations that are usually a mixture of anomalous signals, normal background variations, interference and noise. The key technique of “unbiased estimation” is to construct an empirical time series and set up a criterion for identifying the anomalous variation on the basis of time series analysis. To test the method, a man-made time series including normal variations and random interference as well as a specific anomaly was constructed, and the test of picking up the anomaly was conducted as an intuitive and effective way of identifying an anomalous signal within a complicated time series. The test results confirm that the techniques under discussion are effective and applicable; the signals extracted from the analysis are clear and precise, and closely match the known simulated anomalous signals in the experiments.
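
    The abstract does not give the estimator itself, so the following is only a loose sketch of the general idea: fit an empirical baseline to the observation series and flag points whose residual exceeds k standard deviations.

        # Sketch: flag anomalous points against a moving-average baseline.
        import numpy as np

        def flag_anomalies(series, window=30, k=3.0):
            pad = np.pad(series, (window // 2, window - window // 2 - 1),
                         mode="edge")
            baseline = np.convolve(pad, np.ones(window) / window, mode="valid")
            resid = series - baseline
            return np.abs(resid) > k * resid.std()

        rng = np.random.default_rng(7)
        obs = np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.1, 500)
        obs[300] += 1.5                           # injected anomalous signal
        print(np.where(flag_anomalies(obs))[0])   # indices of flagged points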

  9. The application of data mining techniques in analysis the stock portfolio in order to identify common patterns in the behavior of shareholders (Case study of selected brokers in Mazandaran province)

    OpenAIRE

    GHASEM SOLTANLO, Sara Saadati; SADRABADI, Alireza Naser

    2015-01-01

    In this study, we analyzed shareholders' stock portfolios in order to identify common patterns in shareholder behavior. The required information about shareholders' portfolios was collected from selected brokers in Mazandaran province / Sari city. This information includes demographic data, such as gender, occupation and education, along with the basket of shares purchased during 2013. Data were collected for 150 shares that during this period have traded at l…

  10. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM currently is a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  11. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  12. Lidar point density analysis: implications for identifying water bodies

    Science.gov (United States)

    Worstell, Bruce B.; Poppenga, Sandra; Evans, Gayla A.; Prince, Sandra

    2014-01-01

    Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns over water bodies, leaving void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated that a 5-meter-radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions, to flatten water features and force channel flow in the downstream direction, also are presented.
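
    The reported rule, a 5-meter-radius window holding fewer than 23 returns, can be sketched on a gridded raster of per-cell return counts; the 1 m binning and the synthetic counts below are our assumptions.

        # Sketch: moving-window lidar void detection on a return-count grid.
        import numpy as np
        from scipy.ndimage import convolve

        rng = np.random.default_rng(8)
        counts = rng.poisson(3, size=(200, 200)).astype(float)  # returns per 1 m cell
        counts[80:120, 50:150] = 0              # a void region (water body)

        r = 5                                   # 5-cell (~5 m) radius disk
        yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
        disk = (yy**2 + xx**2 <= r**2).astype(float)

        window_sum = convolve(counts, disk, mode="constant")
        void = window_sum < 23                  # fewer than 23 returns
        print("void fraction:", void.mean())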

  13. Use of Photogrammetry and Biomechanical Gait analysis to Identify Individuals

    DEFF Research Database (Denmark)

    Larsen, Peter Kastmand; Simonsen, Erik Bruun; Lynnerup, Niels

    Photogrammetry and recognition of gait patterns are valuable tools to help identify perpetrators based on surveillance recordings. We have found that stature, but only few other measures, have a satisfying reproducibility for use in forensics. Several gait variables with high recognition rates were found. Especially the variables located in the frontal plane are interesting due to large inter-individual differences in time course patterns. The variables with high recognition rates seem preferable for use in forensic gait analysis and as input variables to waveform analysis techniques such as principal component analysis resulting in marginal scores, which are difficult to interpret individually. Finally, a new gait model is presented based on functional principal component analysis with potential for detecting individual gait patterns where time course patterns can be marginally interpreted.

  14. Comparative analysis of identifiable fragments of forages by the microhistological technique

    Directory of Open Access Journals (Sweden)

    Maristela de Oliveira Bauer

    2005-12-01

    The objective of this study was to verify differences among forage species in the percentage of identifiable fragments as affected by the digestion process and the season of the year. Fresh, recently expanded leaf laminas, corresponding to the last and penultimate positions on the tiller, of the species Melinis minutiflora Pal. de Beauv (molassesgrass), Hyparrhenia rufa (Nees) Stapf. (jaraguagrass), Brachiaria decumbens Stapf. (signalgrass) and Imperata brasiliensis Trin. (sapegrass), and foliar laminas of Medicago sativa L. (alfalfa) and Schinus terebenthifolius Raddi (aroeira), sampled in the rainy and dry seasons, were digested in vitro and prepared according to the microhistological technique. The species showed marked differences in the percentage of identifiable fragments, and digestion altered these percentages by around 10%; the sampling period did not influence the percentage of identifiable fragments for most species; the presence of pigments and the adhesion of the epidermis to the cells of the inner leaf tissues hindered fragment identification; and digestion improved the visualization of fragments of sapegrass, jaraguagrass and aroeira, but impaired that of signalgrass and, especially, alfalfa.

  15. Identifying the sources of produced water in the oil field by isotopic techniques

    International Nuclear Information System (INIS)

    The objective of this study was to identify the sources of the formation water in the Southwest Su Tu Den (STD SW) basement reservoir. To achieve this objective, isotopic techniques, along with geochemical analysis of chloride, bromide and strontium dissolved in the water, were applied. The isotopic techniques used in this study were the determination of the water stable isotope signatures (δ2H and δ18O) and of the 87Sr/86Sr ratio of strontium in rock cutting samples and of strontium dissolved in the formation water. The results showed that the stable isotope composition of water in the Lower Miocene was -3‰ and -23‰ for δ18O and δ2H, respectively, indicating the primeval nature of seawater in the reservoir. Meanwhile, the isotopic composition of water in the basement clustered in a range of alternated freshwater, with δ18O and δ2H being -(3-4)‰ and -(54-60)‰, respectively. The strontium isotope ratio for water in the Lower Miocene reservoir was lower than that for water in the basement, confirming the different natures of the water in the two reservoirs. The obtained results confirm the applicability of the techniques, and it is recommended that studies on the identification of the flow path of the formation water in the STD SW basement reservoir be continued. (author)

  16. Algorithms Design Techniques and Analysis

    CERN Document Server

    Alsuwaiyel, M H

    1999-01-01

    Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on one's own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires an understanding of the various algorithm design techniques and of how and when to use them to formulate solutions, along with the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm desi…

  17. Identifiable Data Files - Medicare Provider Analysis and ...

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Provider Analysis and Review (MEDPAR) File contains data from claims for services provided to beneficiaries admitted to Medicare certified inpatient...

  18. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at 5 levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) are available for use to highlight major inefficiencies and risks that are present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  19. Adhesive polypeptides of Staphylococcus aureus identified using a novel secretion library technique in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Holm Liisa

    2011-05-01

    Background: Bacterial adhesive proteins, called adhesins, are frequently the decisive factor in the initiation of a bacterial infection. Characterization of such molecules is crucial for the understanding of bacterial pathogenesis, the design of vaccines and the development of antibacterial drugs. Because adhesins are frequently difficult to express, their characterization has often been hampered. Alternative expression methods developed for the analysis of adhesins, e.g. surface display techniques, suffer from various drawbacks, and reports on high-level extracellular secretion of heterologous proteins in Gram-negative bacteria are scarce. These expression techniques are currently a field of active research. The purpose of the current study was to construct a convenient, new technique for identifying unknown bacterial adhesive polypeptides directly from the growth medium of the Escherichia coli host, and to identify novel proteinaceous adhesins of the model organism Staphylococcus aureus. Results: Randomly fragmented chromosomal DNA of S. aureus was cloned into a unique restriction site of our expression vector, which facilitates secretion of foreign FLAG-tagged polypeptides into the growth medium of E. coli ΔfliCΔfliD, to generate a library of 1663 clones expressing FLAG-tagged polypeptides. Sequence and bioinformatics analyses showed that in our example the library covered approximately 32% of the S. aureus proteome. Polypeptides from the growth medium of the library clones were screened for binding to a selection of S. aureus target molecules, and adhesive fragments of known staphylococcal adhesins (e.g. coagulase and fibronectin-binding protein A) as well as polypeptides of novel function (e.g. a universal stress protein and the phosphoribosylaminoimidazole carboxylase ATPase subunit) were detected. The results were further validated using purified His-tagged recombinant proteins of the corresponding fragments in enzyme-linked immunoassay and…

  20. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.

  1. Identifying MMORPG Bots: A Traffic Analysis Approach

    Science.gov (United States)

    Chen, Kuan-Ta; Jiang, Jhih-Wei; Huang, Polly; Chu, Hao-Hua; Lei, Chin-Laung; Chen, Wen-Chin

    2008-12-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
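
    The first of the three traffic features, regularity in command release times, can be sketched with the coefficient of variation of inter-command gaps; the synthetic human and bot streams below are illustrative assumptions, not the Ragnarok Online traces.

        # Sketch: bots show near-constant command gaps (low CV), humans do not.
        import numpy as np

        def interarrival_cv(timestamps):
            gaps = np.diff(np.sort(timestamps))
            return gaps.std() / gaps.mean()

        rng = np.random.default_rng(9)
        human = np.cumsum(rng.exponential(1.0, 200))    # irregular clicking
        bot = np.cumsum(rng.normal(1.0, 0.02, 200))     # scripted timer

        print(f"human CV: {interarrival_cv(human):.2f}")   # close to 1.0
        print(f"bot CV:   {interarrival_cv(bot):.2f}")     # close to 0.02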

  2. Testing the potential of geochemical techniques for identifying hydrological systems within landslides in partly weathered marls

    Science.gov (United States)

    Bogaard, T. A.; Buma, J. T.; Klawer, C. J. M.

    2004-03-01

    This paper's objective is to determine how useful geochemistry can be in landslide investigations; more specifically, what additional information can be gained by analysing the cation exchange capacity (CEC) and cation composition with respect to the hydrological system of a landslide area in clayey material. Two cores from the Boulc-Mondorès landslide (France) and one core from the Alvera landslide (Italy) were analysed. The NH4Ac and NaCl laboratory techniques were tested. The geochemical results are compared with the core descriptions and interpreted with respect to their usefulness. Both analysis techniques give identical results for CEC, which are plausible on the basis of the available clay content information. The determination of the exchangeable cations was more difficult, since part of the marls dissolved. With the ammonium-acetate method more of the marls are dissolved than with the sodium-chloride method. The NaCl method is preferred for the determination of the cation fractions at the complex, though it has the disadvantage that the sodium fraction cannot be determined. To overcome this problem, it is recommended to try other displacement fluids. In the Boulc-Mondorès example, the subsurface information that can be extracted from CEC analyses was presented. In the Boulc-Mondorès cores, deviant intervals of CEC could be identified. These are interpreted as weathered layers (and preferential flow paths) that may develop or have already developed into slip surfaces. The major problem of the CEC analyses was to explain the origin of the differences found in the core samples. Both the Alvera and Boulc-Mondorès examples show transitions in cation composition with depth. It was shown that the exchangeable cation fractions can be useful in locating boundaries between water types, especially the boundary between the superficial, rain-fed hydrological system and the lower, regional groundwater system. This information may be important for landslide…

  3. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  4. Use of discriminant analysis to identify propensity for purchasing properties

    Directory of Open Access Journals (Sweden)

    Ricardo Floriani

    2015-03-01

    Properties usually represent a milestone for people and families due to their high value compared with family income. The objective of this study is to propose a discrimination model, via a discriminant analysis, of people whose characteristics (according to the independent variables) classify them as potential buyers of properties, and to identify the intended use of such property, whether it will be assigned to housing, to leisure activities such as a cottage or beach house, and/or to investment. Thus, the following research question is proposed: what are the characteristics that best describe the profile of people who intend to acquire properties? The study is justified by its economic relevance to the real estate industry, as well as to the players in the real estate market who may develop products based on the profile of potential customers. As a statistical technique, discriminant analysis was applied to the data gathered by a questionnaire, which was sent via e-mail; 334 responses were gathered. Based on this study, it was observed that it is possible to identify the intention to acquire properties, as well as the purpose of acquiring them, whether for housing or investment.

  5. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series dating, and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Also, some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are referred to. (Author)

  6. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  7. Gradient measurement technique to identify phase transitions in nano-dispersed liquid crystalline compounds

    Science.gov (United States)

    Pardhasaradhi, P.; Madhav, B. T. P.; Venugopala Rao, M.; Manepalli, R. K. N. R.; Pisipati, V. G. K. M.

    2016-09-01

    Characterization and phase transitions in pure and 0.5% BaTiO3 nano-dispersed liquid crystalline (LC) N-(p-n-heptyloxybenzylidene)-p-n-nonyloxy aniline, 7O.O9, compounds are carried out using a polarizing microscope fitted with a hot stage and camera. We observed that when any of these images are distorted, different local structures suffer various degradations in gradient magnitude. We therefore examined the pixel-wise gradient magnitude similarity (GMS) between the reference and distorted images, combined with a novel pooling strategy - the standard deviation of the GMS map - to determine the overall phase transition variations. MATLAB software is used to implement this gradient measurement technique and to identify the phase transitions and transition temperatures of the pure and nano-dispersed LC compounds. The proposed image analysis method is in good agreement with standard methods such as polarized optical microscopy (POM) and differential scanning calorimetry (DSC). The 0.5% BaTiO3 nano-dispersed 7O.O9 compound induces a cholesteric phase, quenching the nematic phase that the pure compound exhibits.
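
    The gradient magnitude similarity (GMS) computation described above is straightforward to reproduce. Below is a minimal NumPy/SciPy sketch, not the authors' MATLAB implementation: it assumes 2D grayscale arrays for the reference and distorted micrographs, and the Prewitt kernels and the stability constant c are illustrative choices rather than values reported in the paper.

        import numpy as np
        from scipy.ndimage import convolve

        def gms_score(ref, dist, c=0.0026):
            """Pixel-wise gradient magnitude similarity (GMS) between a
            reference and a distorted image, pooled by standard deviation."""
            ref, dist = ref.astype(float), dist.astype(float)
            kx = np.array([[1, 0, -1]] * 3) / 3.0  # Prewitt kernel (horizontal)
            ky = kx.T                              # Prewitt kernel (vertical)

            def grad_mag(img):
                return np.hypot(convolve(img, kx), convolve(img, ky))

            m_ref, m_dist = grad_mag(ref), grad_mag(dist)
            # GMS map: 1 where gradients agree, below 1 where they differ
            gms = (2 * m_ref * m_dist + c) / (m_ref**2 + m_dist**2 + c)
            # Standard-deviation pooling: larger value = stronger distortion
            return gms.std()

    A jump in the pooled score across a temperature series of POM images would then flag a phase transition.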

  8. Identifying desertification risk areas using fuzzy membership and geospatial technique - A case study, Kota District, Rajasthan

    Science.gov (United States)

    Dasgupta, Arunima; Sastry, K. L. N.; Dhinwa, P. S.; Rathore, V. S.; Nathawat, M. S.

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research intends to identify areas under risk of desertification, along with their severity in terms of degradation of natural parameters. An integrated model with fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, including five specific natural parameters, namely slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a certain class was derived using the normal probability density function of that class. Thus, if a single class of a single parameter has mean μ and standard deviation σ, values falling beyond μ + 2σ and below μ − 2σ do not represent that class, but a transitional zone between two subsequent classes. These are the most important areas in terms of degradation, as they have the lowest probability of belonging to a certain class, hence the highest probability of being extended into the next class or narrowed into the previous one. These are also the values most easily altered under exogenous influences, and hence they are identified as risk areas. The overall desertification risk is derived by incorporating the different risk severities of each parameter using a fuzzy rule-based inference system in a GIS environment. Multicriteria-based geostatistics are applied to locate the areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures are accelerating land deterioration, coupled with natural erosive forces. The four major sources of desertification in Kota are gully and ravine erosion, inappropriate mining practices, growing urbanization and random deforestation.
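
    As a rough illustration of the membership step described above (a minimal sketch, not the authors' GIS implementation), the following Python fragment assigns each observation a normal-density membership for its class and flags values beyond μ ± 2σ as transitional, i.e. at risk; the NDVI numbers are invented for the example.

        import numpy as np
        from scipy.stats import norm

        def risk_mask(values, mu, sigma):
            """Membership of each value in a class N(mu, sigma), plus a flag
            for values beyond mu +/- 2*sigma, which mark transitional zones."""
            membership = norm.pdf(values, loc=mu, scale=sigma)
            transitional = np.abs(values - mu) > 2 * sigma
            return membership, transitional

        # Example: NDVI values for one class with mean 0.45 and sd 0.08
        ndvi = np.array([0.44, 0.52, 0.62, 0.28, 0.47])
        membership, risk = risk_mask(ndvi, mu=0.45, sigma=0.08)
        print(risk)  # [False False  True  True False]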

  9. An Overview of Techniques for Identifying, Acknowledging and Overcoming Alternate Conceptions in Physics Education.

    Science.gov (United States)

    Klammer, Joel

    This paper examines the nature of physics students' knowledge, the means to identify alternative conceptions, and possible methods to overcome misconceptions. This examination is a survey of the techniques and ideas of a large number of researchers who are seeking their own solutions to this problem. An examination of the nature of knowledge…

  10. Proof Analysis: A technique for Concept Formation

    OpenAIRE

    Bundy, Alan

    1985-01-01

    We report the discovery of an unexpected connection between the invention of the concept of uniform convergence and the occurs check in the unification algorithm. This discovery suggests the invention of further interesting concepts in analysis and a technique for automated concept formation. Part of this technique has been implemented. The discovery arose as part of an attempt to understand the role of proof analysis in mathematical reasoning, so as to incorporate it into a computer program. ...

  11. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  12. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…

  13. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their applicability to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and the extension of the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)

  14. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.

  15. Analysis and comparation of animation techniques

    OpenAIRE

    Joštová, Barbora

    2015-01-01

    This thesis is focused on the analysis and comparison of animation techniques. In the theoretical part of the thesis I define key terms, the historical development and the basic principles of animation techniques. In the practical part I describe the comparison between classic and digital types of animation. Based on this research I chose the most suitable animations, which are further used to verify my hypothesis. The proposed hypothesis is an ordering of the techniques based on how demanding each is in terms of...

  16. Equivalent Dynamic Stiffness Mapping technique for identifying nonlinear structural elements from frequency response functions

    Science.gov (United States)

    Wang, X.; Zheng, G. T.

    2016-02-01

    A simple and general Equivalent Dynamic Stiffness Mapping technique is proposed for identifying the parameters or the mathematical model of a nonlinear structural element from steady-state primary harmonic frequency response functions (FRFs). The Equivalent Dynamic Stiffness is defined as the complex ratio between the internal force and the displacement response of the unknown element. Obtained from test data of response frequencies and amplitudes, the real and imaginary parts of the Equivalent Dynamic Stiffness are plotted as discrete points in a three-dimensional space over the displacement amplitude and the frequency; these are called the real and the imaginary Equivalent Dynamic Stiffness maps, respectively. The points form a repeatable surface, as the Equivalent Dynamic Stiffness is only a function of the corresponding data, as derived in the paper. The mathematical model of the unknown element can then be obtained by surface-fitting these points with special functions selected by prior knowledge of the nonlinearity type, or with ordinary polynomials if the type of nonlinearity is not pre-known. An important merit of this technique is its capability of dealing with strong nonlinearities exhibiting complicated frequency response behaviors such as jumps and breaks in resonance curves. In addition, this technique can greatly simplify the test procedure: besides eliminating the need to pre-identify the underlying linear parameters, the method uses the measured data of excitation forces and responses without requiring strict control of the excitation force during the test. The proposed technique is demonstrated and validated with four classical single-degree-of-freedom (SDOF) numerical examples and one experimental example. An application of this technique to the identification of nonlinearity in multiple-degree-of-freedom (MDOF) systems is also illustrated.
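
    The core quantity is simple to form from test data. The sketch below is a minimal illustration, not the authors' code: the phasor values are invented, and fitting the real part over amplitude with an ordinary polynomial corresponds to the case where the nonlinearity type is not pre-known.

        import numpy as np

        # Measured primary-harmonic phasors at several test points:
        # excitation force F and displacement response X (complex).
        F = np.array([1.00 + 0.20j, 1.10 + 0.30j, 0.90 + 0.50j])
        X = np.array([0.010 + 0.001j, 0.012 + 0.002j, 0.008 + 0.004j])

        # Equivalent Dynamic Stiffness: complex ratio of internal force
        # to displacement response at each test point
        k_eq = F / X
        amp = np.abs(X)

        # Fit of the real part over displacement amplitude with an
        # ordinary polynomial (no prior knowledge of the nonlinearity)
        coeffs = np.polyfit(amp, k_eq.real, deg=2)
        print(coeffs)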

  17. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry and coulometry have made significant contributions to the analysis of minerals such as clays, sulfides, oxides, and oxysalts. The discussion is organized by both the type of EC technique used and the kind of mineral analyzed. Furthermore, minerals used as electrode modification materials for EC analysis are also summarized. Accordingly, research gaps and future development trends in these areas are discussed.

  18. Comparison of remote sensing image processing techniques to identify tornado damage areas from Landsat TM data

    Science.gov (United States)

    Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.P.

    2008-01-01

    Remote sensing techniques have been shown to be effective for large-scale damage surveys after a hazardous event, in both near real-time and post-event analyses. This paper compares the accuracy of common image processing techniques for detecting tornado damage tracks from Landsat TM data. We employed a direct change detection approach using two sets of images, acquired before and after the tornado event, to produce principal component composite images and a set of image difference bands. Techniques in the comparison include supervised classification, unsupervised classification, and an object-oriented classification approach with a nearest neighbor classifier. Accuracy assessment is based on the Kappa coefficient calculated from error matrices which cross-tabulate correctly identified cells on the TM image and commission and omission errors in the result. Overall, the object-oriented approach exhibits the highest degree of accuracy in tornado damage detection. PCA and image differencing methods show comparable outcomes: while selected PCs can improve detection accuracy by 5 to 10%, the object-oriented approach performs significantly better, with 15-20% higher accuracy than the other two techniques. © 2008 by MDPI.
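
    The accuracy measure used above is easy to reproduce: given an error (confusion) matrix cross-tabulating detected versus reference classes, Cohen's kappa compares observed agreement with chance agreement. A minimal NumPy sketch with invented counts:

        import numpy as np

        def cohens_kappa(conf):
            """Cohen's kappa from a square confusion matrix
            (rows = classified, columns = reference)."""
            conf = np.asarray(conf, dtype=float)
            n = conf.sum()
            p_observed = np.trace(conf) / n
            # Chance agreement from the row and column marginals
            p_chance = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / n**2
            return (p_observed - p_chance) / (1 - p_chance)

        # Illustrative 2x2 matrix: damage vs. no-damage cells
        print(cohens_kappa([[80, 10],
                            [15, 95]]))  # ~0.75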

  19. Gold analysis by the gamma absorption technique.

    Science.gov (United States)

    Kurtoglu, Arzu; Tugrul, A Beril

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement. PMID:12485656
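
    The measurement rests on the Beer-Lambert attenuation law: a beam of intensity I0 passing through thickness t of material with density ρ emerges with intensity I = I0·exp(−(μ/ρ)·ρ·t), so the mass attenuation coefficient μ/ρ follows from the measured intensity ratio. The sketch below shows how a calibration curve from known alloys could be built and inverted for an unknown sample; all numbers are invented for illustration, not taken from the paper.

        import numpy as np

        def mass_attenuation(I0, I, rho, t):
            """Mass attenuation coefficient (cm^2/g) from Beer-Lambert:
            I = I0 * exp(-(mu/rho) * rho * t)."""
            return np.log(I0 / I) / (rho * t)

        # Calibration: gold fraction of known alloys vs. measured mu/rho
        au_fraction = np.array([0.375, 0.585, 0.750, 0.916])  # 9k..22k
        mu_rho = np.array([2.10, 2.85, 3.46, 4.05])           # illustrative

        # Linear calibration curve, then inversion for an unknown sample
        slope, intercept = np.polyfit(au_fraction, mu_rho, 1)
        unknown = mass_attenuation(I0=10000.0, I=2200.0, rho=15.5, t=0.030)
        print((unknown - intercept) / slope)  # estimated gold fraction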

  20. Using text-mining techniques in electronic patient records to identify ADRs from medicine use

    DEFF Research Database (Denmark)

    Warrer, Pernille; Hansen, Ebba Holme; Jensen, Lars Juhl;

    2012-01-01

    This literature review included studies that use text-mining techniques in narrative documents stored in electronic patient records (EPRs) to investigate ADRs. We searched PubMed, Embase, Web of Science and International Pharmaceutical Abstracts without restrictions from origin until July 2011. We included empirically based studies on text mining of EPRs that focused on detecting ADRs, excluding those that investigated adverse events not related to medicine use. We extracted information on study populations, EPR data sources, and frequencies and types of the identified ADRs. Because of differences in study designs and populations, various types of ADRs were identified and thus we could not make comparisons across studies. The review underscores the feasibility and potential of text mining to investigate narrative documents in EPRs for ADRs. However, more empirical studies are needed to evaluate whether text mining of EPRs...

  1. Using Rasch Analysis to Identify Uncharacteristic Responses to Undergraduate Assessments

    Science.gov (United States)

    Edwards, Antony; Alcock, Lara

    2010-01-01

    Rasch Analysis is a statistical technique that is commonly used to analyse both test data and Likert survey data, to construct and evaluate question item banks, and to evaluate change in longitudinal studies. In this article, we introduce the dichotomous Rasch model, briefly discussing its assumptions. Then, using data collected in an…
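
    For reference, in the dichotomous Rasch model the probability that a person of ability θ answers an item of difficulty b correctly is P = exp(θ − b)/(1 + exp(θ − b)), and a response is uncharacteristic when the observed outcome deviates strongly from this expectation. A minimal sketch with invented numbers, not the article's data:

        import numpy as np

        def rasch_p(theta, b):
            """Dichotomous Rasch model: probability of a correct response
            for person ability theta and item difficulty b (in logits)."""
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        theta = 1.2                      # an able student
        b = np.array([-1.0, 0.5, 2.0])   # easy, medium, hard items
        expected = rasch_p(theta, b)
        observed = np.array([0, 1, 1])   # the easy item was missed

        # Standardized residuals flag uncharacteristic responses
        resid = (observed - expected) / np.sqrt(expected * (1 - expected))
        print(resid)  # large negative residual on the easy item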

  2. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for the convenience and safety of driving, so investigations of the behaviour of road materials under laboratory conditions and the monitoring of existing roads are widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of road texture characteristics and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  3. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features: multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based sources and radioisotope-based sources, but nuclear reactors, with high fluxes of neutrons from the fission of 235U, give the most intense irradiation and hence the highest sensitivities available for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted, and a brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.

  4. BIOELECTRICAL IMPEDANCE VECTOR ANALYSIS IDENTIFIES SARCOPENIA IN NURSING HOME RESIDENTS

    Science.gov (United States)

    Loss of muscle mass and water shifts between body compartments are contributing factors to frailty in the elderly. The body composition changes are especially pronounced in institutionalized elderly. We investigated the ability of single-frequency bioelectrical impedance analysis (BIA) to identify b…

  5. Integrating complementary medicine literacy education into Australian medical curricula: Student-identified techniques and strategies for implementation.

    Science.gov (United States)

    Templeman, Kate; Robinson, Anske; McKenna, Lisa

    2015-11-01

    Formal medical education about complementary medicine (CM) that comprises medicinal products/treatments is required due to possible CM interactions with conventional medicines; however, few guidelines exist on design and implementation of such education. This paper reports findings of a constructivist grounded theory method study that identified key strategies for integrating CM literacy education into medical curricula. Analysis of data from interviews with 30 medical students showed that students supported a longitudinal integrative and pluralistic approach to medicine. Awareness of common patient use, evidence, and information relevant to future clinical practice were identified as focus points needed for CM literacy education. Students advocated for interactive case-based, experiential and dialogical didactic techniques that are multiprofessional and student-centred. Suggested strategies provide key elements of CM literacy within research, field-based practice, and didactic teaching over the entirety of the curriculum. CM educational strategies should address CM knowledge deficits and ultimately respond to patients' needs. PMID:26573450

  7. CONSUMER BEHAVIOR ANALYSIS BY GRAPH MINING TECHNIQUE

    OpenAIRE

    KATSUTOSHI YADA; HIROSHI MOTODA; TAKASHI WASHIO; ASUKA MIYAWAKI

    2006-01-01

    In this paper, we discuss how a graph mining system is applied to sales transaction data so as to understand consumer behavior. First, existing research on consumer behavior analysis for sequential purchase patterns is reviewed. Then we propose to represent the complicated customer purchase behavior by a directed graph retaining temporal information in a purchase sequence, and we apply a graph mining technique to analyze the frequently occurring patterns. In this paper, we demonstrate through the case...

  8. Multispectral and Photoplethysmography Optical Imaging Techniques Identify Important Tissue Characteristics in an Animal Model of Tangential Burn Excision.

    Science.gov (United States)

    Thatcher, Jeffrey E; Li, Weizhi; Rodriguez-Vaqueiro, Yolanda; Squiers, John J; Mo, Weirong; Lu, Yang; Plant, Kevin D; Sellke, Eric; King, Darlene R; Fan, Wensheng; Martinez-Lorenzo, Jose A; DiMaio, J Michael

    2016-01-01

    Burn excision is a difficult technique, owing to the training required to identify the extent and depth of injury, and will benefit from a tool that can cue the surgeon as to where and how much to resect. We explored two rapid and noninvasive optical imaging techniques for their ability to distinguish burn tissue from the viable wound bed, using an animal model of tangential burn excision. Photoplethysmography (PPG) imaging and multispectral imaging (MSI) were used to image the initial, intermediate, and final stages of burn excision of a deep partial-thickness burn. PPG imaging maps blood flow in the skin's microcirculation, and MSI collects the tissue reflectance spectrum in visible and infrared wavelengths of light to classify tissue based on a reference library. A porcine deep partial-thickness burn model was generated, and serial tangential excision was accomplished with an electric dermatome set to 1.0 mm depth. Excised eschar was stained with hematoxylin and eosin to determine the extent of burn remaining at each excision depth. We confirmed that the PPG imaging device showed significantly less blood flow where burn tissue was present, and that the MSI method could delineate burn tissue from the viable wound bed. These results were confirmed independently by histological analysis. We found these devices can identify the proper depth of excision, and their images could cue a surgeon as to the readiness of the wound bed for grafting. These image outputs are expected to facilitate clinical judgment in the operating room.

  9. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  10. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for assessing task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  11. Identifying Potential Areas for Future Urban Development Using Gis-Based Multi Criteria Evaluation Technique

    Directory of Open Access Journals (Sweden)

    Mohammed Khalid Sabbar

    2016-01-01

    Malaysia, like other Asian countries, has experienced rapid urbanization due to economic development, industrialization, massive migration and natural population growth. This expansion, particularly where unplanned, has impacted negatively on farming activities and creates huge pressure on arable agricultural areas. Thus, the identification of potential sites for future urban development is an important issue in ensuring sustainable development. The aim of this paper is therefore to use a GIS-based multi-criteria evaluation technique to identify potential areas for urban development at Balik Pulau, Penang. The study quantified the spatial and temporal dynamics of land use/cover changes and identified potential areas for future development. The results indicated that large proportions of agricultural areas had been converted to built-up areas. Urban areas increased from 1793.2 ha in 1992 to 3235.4 ha in 2002 and reached 3987.8 ha in 2010, while agricultural land decreased from 6171.3 ha (53.8%) in 1992 to 3883 ha (35%) in 2010. The study then produced a map showing potential sites for future urban development. The findings also indicated that built-up areas would continue to encroach on flat, available agricultural land, which will diminish if no restrictions are imposed. The information obtained from this study is thus useful for planners and decision makers in protecting agricultural areas and guiding new development properly.

  12. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
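
    As a rough sketch of the contrast-extraction step (the normalized-contrast definition below is a common one in flash thermography, assumed here rather than quoted from the paper), the evolution over a suspected anomaly can be pulled from the IR video as follows; the synthetic frames are invented for illustration.

        import numpy as np

        def contrast_evolution(video, roi, ref):
            """Normalized contrast versus time from a flash thermography video.
            video: array (frames, rows, cols) of IR intensities
            roi:   (row_slice, col_slice) over the suspected anomaly
            ref:   (row_slice, col_slice) over sound material"""
            t_roi = video[:, roi[0], roi[1]].mean(axis=(1, 2))
            t_ref = video[:, ref[0], ref[1]].mean(axis=(1, 2))
            return (t_roi - t_ref) / t_ref  # one value per frame

        # Synthetic cooling sequence: the anomaly region cools more slowly
        frames = np.linspace(1.0, 0.2, 50)[:, None, None] * np.ones((50, 8, 8))
        frames[:, 2:4, 2:4] += np.linspace(0.0, 0.1, 50)[:, None, None]
        c = contrast_evolution(frames, (slice(2, 4), slice(2, 4)),
                               (slice(6, 8), slice(6, 8)))
        print(c.max())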

  13. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  14. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis

    OpenAIRE

    Hawkes, Corinna, ed.

    2009-01-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in...

  15. Latent cluster analysis of ALS phenotypes identifies prognostically differing groups.

    Directory of Open Access Journals (Sweden)

    Jeban Ganesalingam

    BACKGROUND: Amyotrophic lateral sclerosis (ALS) is a degenerative disease predominantly affecting motor neurons and manifesting as several different phenotypes. Whether these phenotypes correspond to different underlying disease processes is unknown. We used latent cluster analysis to identify groupings of clinical variables in an objective and unbiased way to improve phenotyping for clinical and research purposes. METHODS: Latent class cluster analysis was applied to a large database consisting of 1467 records of people with ALS, using discrete variables which can be readily determined at the first clinic appointment. The model was tested for clinical relevance by survival analysis of the phenotypic groupings using the Kaplan-Meier method. RESULTS: The best model generated five distinct phenotypic classes that strongly predicted survival (p<0.0001). Eight variables were used for the latent class analysis, but a good estimate of the classification could be obtained using just two variables: site of first symptoms (bulbar or limb) and time from symptom onset to diagnosis (p<0.00001). CONCLUSION: The five phenotypic classes identified using latent cluster analysis can predict prognosis. They could be used to stratify patients recruited into clinical trials and to generate more homogeneous disease groups for genetic, proteomic and risk factor research.
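
    The survival comparison applied to the phenotype classes rests on the product-limit (Kaplan-Meier) estimator, which is compact to implement; the sketch below uses invented follow-up times, not the study's records.

        import numpy as np

        def kaplan_meier(time, event):
            """Product-limit survival estimate.
            time:  follow-up times; event: 1 = death observed, 0 = censored."""
            order = np.argsort(time)
            time = np.asarray(time)[order]
            event = np.asarray(event)[order]
            at_risk, s, surv = len(time), 1.0, []
            for e in event:
                if e:                # survival drops only at observed events
                    s *= (at_risk - 1) / at_risk
                surv.append(s)
                at_risk -= 1         # each observation leaves the risk set
            return time, np.array(surv)

        # One phenotypic class's survival times (months), some censored
        t, s = kaplan_meier([9, 14, 14, 22, 30, 41], [1, 1, 0, 1, 0, 1])
        print(list(zip(t, s.round(3))))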

  16. Parameter Trajectory Analysis to Identify Treatment Effects of Pharmacological Interventions

    OpenAIRE

    Tiemann, Christian A.; Vanlier, Joep; Oosterveer, Maaike H.; Albert K Groen; Hilbers, Peter A. J.; Natal A W van Riel

    2013-01-01

    The field of medical systems biology aims to advance understanding of molecular mechanisms that drive disease progression and to translate this knowledge into therapies to effectively treat diseases. A challenging task is the investigation of long-term effects of a (pharmacological) treatment, to establish its applicability and to identify potential side effects. We present a new modeling approach, called Analysis of Dynamic Adaptations in Parameter Trajectories (ADAPT), to analyze the long-t...

  17. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed results comparable to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood; therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week who underwent routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time over the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean sulcus angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy…

  18. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed results comparable to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood; therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week who underwent routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time over the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean sulcus angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy…

  19. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows us to do full clustering and classification of the measured mass spectra; in this paper we use the tool for classification purposes. The presentation describes calibration experiments on 19 minerals on Ag and Au substrates using positive-mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
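
    The random-projection step itself is compact: project the high-dimensional spectra onto k random directions, which approximately preserves pairwise distances (the Johnson-Lindenstrauss property), then cluster in the reduced space. A minimal sketch, with random numbers standing in for TOF-SIMS spectra and scikit-learn's KMeans standing in for the paper's clustering step:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)

        # Stand-in for TOF-SIMS spectra: 200 spectra x 5000 mass bins
        spectra = rng.random((200, 5000))

        # Gaussian random projection to k dimensions; the 1/sqrt(k) scaling
        # approximately preserves pairwise Euclidean distances
        k = 64
        R = rng.normal(size=(5000, k)) / np.sqrt(k)
        reduced = spectra @ R

        # Cluster the projected spectra (e.g., one cluster per mineral)
        labels = KMeans(n_clusters=19, n_init=10, random_state=0).fit_predict(reduced)
        print(np.bincount(labels))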

  20. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  1. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; and optimally weighted cross-correlations are used for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on the cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
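
    For white noise, the optimal filtering mentioned for binary inspirals reduces to correlating the data stream against a unit-norm template and thresholding the resulting signal-to-noise ratio; real searches additionally whiten by the detector's noise spectrum. A toy sketch under that simplifying assumption:

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy template: a short chirp-like waveform, normalized to unit energy
        t = np.linspace(0, 1, 512)
        template = np.sin(2 * np.pi * (20 + 30 * t) * t) * np.hanning(512)
        template /= np.linalg.norm(template)

        # Data: unit-variance noise with the signal buried at sample 3000
        data = rng.normal(0.0, 1.0, 8192)
        data[3000:3512] += 6.0 * template

        # Matched filter: sliding correlation with the template
        snr = np.correlate(data, template, mode="valid")
        peak = int(np.argmax(np.abs(snr)))
        print(peak, abs(snr[peak]))  # recovers the injection near sample 3000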

  2. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analysis methodologies, but none as comprehensive as the current work.
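
    One of the simplest techniques compared in such studies is rank-based sensitivity: sample the input parameters, run the model, and rank parameters by the magnitude of the Spearman correlation between each input and the output. A generic sketch, with a toy function standing in for the dosimetry model:

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(2)

        def model(x):
            """Toy stand-in for the atmospheric tritium dosimetry model."""
            return 3.0 * x[:, 0] + x[:, 1] ** 2 + 0.1 * x[:, 2]

        # Monte Carlo sample of three input parameters
        X = rng.uniform(0.0, 1.0, size=(1000, 3))
        y = model(X)

        # Rank parameters by |Spearman correlation| with the output
        rho = [abs(spearmanr(X[:, j], y)[0]) for j in range(3)]
        print(np.argsort(rho)[::-1])  # most to least influential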

  3. Rice Transcriptome Analysis to Identify Possible Herbicide Quinclorac Detoxification Genes

    Directory of Open Access Journals (Sweden)

    Wenying Xu

    2015-09-01

    Quinclorac is a highly selective auxin-type herbicide widely used for the effective control of barnyard grass in paddy rice fields, improving the world's rice yield. A herbicide mode of action for quinclorac has been proposed, and hormone interactions affect quinclorac signaling. Because of its widespread use, quinclorac may be transported outside rice fields with the drainage waters, leading to soil and water pollution and environmental health problems. In this study, we used the 57K Affymetrix rice whole-genome array to identify quinclorac signaling response genes, in order to study the molecular mechanisms of action and detoxification of quinclorac in rice plants. Overall, 637 probe sets were identified with differential expression levels under either 6 or 24 h of quinclorac treatment. Auxin-related genes such as GH3 and OsIAAs responded to quinclorac treatment. Gene Ontology analysis showed that detoxification-related gene families were significantly enriched, including cytochrome P450, GST, UGT, and ABC and drug transporter genes. Moreover, real-time RT-PCR analysis showed that top candidate P450 families such as CYP81, CYP709C and CYP72A genes were universally induced by different herbicides, and some Arabidopsis genes of the same P450 families were up-regulated under quinclorac treatment. We conducted rice whole-genome GeneChip analysis and the first global identification of quinclorac response genes. This work may provide potential markers for the detoxification of quinclorac and biomonitors of environmental chemical pollution.

  4. Proteogenomic Analysis Identifies a Novel Human SHANK3 Isoform

    Directory of Open Access Journals (Sweden)

    Fahad Benthani

    2015-05-01

    Mutations of the SHANK3 gene have been associated with autism spectrum disorder. Individuals harboring different SHANK3 mutations display considerable heterogeneity in their cognitive impairment, likely due to the high SHANK3 transcriptional diversity. In this study, we report a novel interaction between the Mutated in colorectal cancer (MCC) protein and a newly identified SHANK3 protein isoform in human colon cancer cells and mouse brain tissue. Hence, our proteogenomic analysis identifies a new human long isoform of the key synaptic protein SHANK3 that was not predicted by the human reference genome. Taken together, our findings describe a potential new role for MCC in neurons and a new human SHANK3 long isoform and, importantly, highlight the use of proteomic data towards the re-annotation of GC-rich genomic regions.

  5. Parameter trajectory analysis to identify treatment effects of pharmacological interventions.

    Directory of Open Access Journals (Sweden)

    Christian A Tiemann

    The field of medical systems biology aims to advance understanding of molecular mechanisms that drive disease progression and to translate this knowledge into therapies to effectively treat diseases. A challenging task is the investigation of the long-term effects of a (pharmacological) treatment, to establish its applicability and to identify potential side effects. We present a new modeling approach, called Analysis of Dynamic Adaptations in Parameter Trajectories (ADAPT), to analyze the long-term effects of a pharmacological intervention. A concept of time-dependent evolution of model parameters is introduced to study the dynamics of molecular adaptations. The progression of these adaptations is predicted by identifying the necessary dynamic changes in the model parameters to describe the transition between experimental data obtained during different stages of the treatment. The trajectories provide insight into the affected underlying biological systems and identify the molecular events that should be studied in more detail to unravel the mechanistic basis of treatment outcome. Modulating effects caused by interactions with the proteome and transcriptome levels, which are often less well understood, can be captured by the time-dependent descriptions of the parameters. ADAPT was employed to identify metabolic adaptations induced upon pharmacological activation of the liver X receptor (LXR), a potential drug target to treat or prevent atherosclerosis. The trajectories were investigated to study the cascade of adaptations. This provided a counter-intuitive insight concerning the function of scavenger receptor class B1 (SR-B1), a receptor that facilitates the hepatic uptake of cholesterol. Although activation of LXR promotes cholesterol efflux and excretion, our computational analysis showed that the hepatic capacity to clear cholesterol was reduced upon prolonged treatment. This prediction was confirmed experimentally by immunoblotting measurements of SR-B1…

  6. Identifying clinical course patterns in SMS data using cluster analysis

    DEFF Research Database (Denmark)

    Kent, Peter; Kongsted, Alice

    2012-01-01

    ABSTRACT: BACKGROUND: Recently, there has been interest in using the short message service (SMS, or text messaging) to gather frequent information on the clinical course of individual patients. One possible role for identifying clinical course patterns is to assist in exploring clinically important subgroups. … The identified patterns were clinically interpretable and different from those of the whole group. Similar patterns were obtained when the number of SMS time points was reduced to monthly. The advantages and disadvantages of this method were contrasted with those of first transforming SMS data by spline analysis. CONCLUSIONS: This study…

  7. Efficient Isothermal Titration Calorimetry Technique Identifies Direct Interaction of Small Molecule Inhibitors with the Target Protein.

    Science.gov (United States)

    Gal, Maayan; Bloch, Itai; Shechter, Nelia; Romanenko, Olga; Shir, Ofer M

    2016-01-01

    Protein-protein interactions (PPIs) play a critical role in regulating many cellular processes. Finding novel PPI inhibitors that interfere with the specific binding of two proteins is considered a great challenge, mainly due to the complexity involved in characterizing multi-molecular systems and the limited understanding of the physical principles governing PPIs. Here we show that the combination of virtual screening techniques, which are capable of filtering a large library of potential small molecule inhibitors, and a unique secondary screening by isothermal titration calorimetry, a label-free method capable of observing direct interactions, is an efficient tool for finding such an inhibitor. In this study we applied this strategy in a search for a small molecule capable of interfering with the interaction of the tumor-suppressor p53 and the E3-ligase MDM2. We virtually screened a library of 15 million small molecules that were filtered to a final set of 80 virtual hits. Our in vitro experimental assay, designed to validate the activity of mixtures of compounds by isothermal titration calorimetry, was used to identify an active molecule against MDM2. At the end of the process the small molecule (4S,7R)-4-(4-chlorophenyl)-5-hydroxy-2,7-dimethyl-N-(6-methylpyridin-2-yl)-4,6,7,8-tetrahydroquinoline-3-carboxamide was found to bind MDM2 with a dissociation constant of ~2 µM. Following the identification of this single bioactive compound, spectroscopic measurements were used to further characterize the interaction of the small molecule with the target protein: 2D NMR spectroscopy was used to map the binding region of the small molecule, and fluorescence polarization measurement confirmed that it indeed competes with p53.
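
    The reported dissociation constant can be extracted from titration data by fitting a binding isotherm. The sketch below fits a simplified 1:1 saturation model by least squares, standing in for a full ITC heat model; the titration points are invented so that the fit returns a Kd near 2 µM.

        import numpy as np
        from scipy.optimize import curve_fit

        def one_site(L, kd, smax):
            """1:1 binding isotherm: signal versus ligand concentration."""
            return smax * L / (kd + L)

        # Illustrative titration: ligand (uM) vs. normalized signal
        L = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
        signal = np.array([0.11, 0.20, 0.33, 0.50, 0.67, 0.80, 0.89])

        (kd, smax), _ = curve_fit(one_site, L, signal, p0=(1.0, 1.0))
        print(f"Kd ~ {kd:.2f} uM")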

  8. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  9. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues, such as route selection and design, are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented, and a case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis.

  10. Bone feature analysis using image processing techniques.

    Science.gov (United States)

    Liu, Z Q; Austin, T; Thomas, C D; Clement, J G

    1996-01-01

    In order to establish the correlation between bone structure and age, and to obtain information about age-related bone changes, it is necessary to study microstructural features of human bone. Traditionally, in bone biology and forensic science, the analysis of bone cross-sections has been carried out manually. Such a process is known to be slow, inefficient and prone to human error, and consequently the results obtained so far have been unreliable. In this paper we present a new approach to quantitative analysis of cross-sections of human bones using digital image processing techniques. We demonstrate that such a system is able to extract various bone features consistently and is capable of providing more reliable data and statistics for bones. Consequently, we will be able to correlate features of bone microstructure with age, and possibly also with age-related bone diseases such as osteoporosis. The development of knowledge-based computer vision systems for automated bone image analysis can now be considered feasible.

  11. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  12. A Technique for Tracking the Reading Rate to Identify the E-Book Reading Behaviors and Comprehension Outcomes of Elementary School Students

    Science.gov (United States)

    Huang, Yueh-Min; Liang, Tsung-Ho

    2015-01-01

    Tracking individual reading behaviors is a difficult task, as is carrying out real-time recording and analysis throughout the reading process, but these aims are worth pursuing. In this study, the reading rate is adopted as an indicator to identify different reading behaviors and comprehension outcomes. A reading rate tracking technique is thus…

  13. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  14. Liver Ultrasound Image Analysis using Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Smriti Sahu, Maheedhar Dubey, Mohammad Imroze Khan

    2012-12-01

    Full Text Available Liver cancer is the sixth most common malignant tumour and the third most common cause of cancer-related deaths worldwide. Chronic liver damage affects up to 20% of our population. It has many causes - viral infections (Hepatitis B and C), toxins, genetic, metabolic and autoimmune diseases. The rate of liver cancer in Australia has increased four-fold in the past 20 years. For detection and qualitative diagnosis of liver diseases, Ultrasound (US) imaging is an easy-to-use and minimally invasive imaging modality. Medical images are often deteriorated by noise due to various sources of interference and other phenomena known as Speckle noise. Therefore it is necessary to apply digital image processing techniques for smoothing or suppression of speckle noise in ultrasound images. This paper undertakes the study of three types of image enhancement techniques: the Shock Filter, Contrast Limited Adaptive Histogram Equalization (CLAHE) and the Spatial filter. These smoothing techniques are compared using the performance metrics Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). It has been observed that the Spatial high pass filter gives better performance than the others for liver ultrasound image analysis.
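
The two comparison metrics named above are standard; a minimal NumPy sketch, assuming 8-bit grayscale images:

```python
# MSE and PSNR between a reference image and a filtered image,
# both 8-bit grayscale NumPy arrays of the same shape.
import numpy as np

def mse(reference: np.ndarray, processed: np.ndarray) -> float:
    diff = reference.astype(np.float64) - processed.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(reference: np.ndarray, processed: np.ndarray, peak: float = 255.0) -> float:
    m = mse(reference, processed)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)
```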

  15. Window technique for climate trend analysis

    Science.gov (United States)

    Szentimrey, Tamás; Faragó, Tibor; Szalai, Sándor

    1992-01-01

    Climatic characteristics are affected by various systematic and occasional impacts: besides the changes in the observing system (locations of the stations of the meteorological network, instruments, observing procedures), the possible local-scale and global natural and anthropogenic impacts on climatic conditions should be taken into account. Apart from the predictability problems, the phenomenological analysis of the climatic variability and the determination of past persistent climatic anomalies are significant problems, among other aspects, as evidence of the possible anomalous behavior of climate or for climate impact studies. In this paper, a special technique for the identification of such “shifts” in the observational series is presented. The existence of these significant shorter or longer term changes in the mean characteristics for properly selected adjoining periods of time is the necessary condition for the formation of any more or less unidirectional climatic trends. Actually, the window technique is based on a complete set of orthogonal functions. The sensitivity of the proposed model to its main parameters is also investigated. This method is applied to hemispheric and Hungarian data series of the mean annual surface temperature.
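
The window technique itself is built on a complete set of orthogonal functions; as a much simpler analogue of detecting shifts between adjoining periods, one can slide a two-sample t-test over the series (illustrative only, not the authors' method):

```python
# Flag indices where the mean of the preceding window differs significantly
# from the mean of the following window -- a rough analogue of a "shift".
import numpy as np
from scipy import stats

def detect_shifts(series: np.ndarray, window: int, alpha: float = 0.01):
    shifts = []
    for i in range(window, len(series) - window):
        before, after = series[i - window:i], series[i:i + window]
        t, p = stats.ttest_ind(before, after, equal_var=False)
        if p < alpha:
            shifts.append(i)
    return shifts
```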

  16. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    This paper presents the different steps of the radio-labelling of a molecule for two purposes: the radio-immuno-analysis and the auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters (125I, 57Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodation, tritium); purification of the labelled compound (dialysis, gel-filtering or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)
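
For reference, the activity and half-life quantities defined in this record obey the standard radioactive decay law (textbook physics, not specific to the paper):

```latex
A(t) = \lambda N(t) = A_0 e^{-\lambda t},
\qquad
t_{1/2} = \frac{\ln 2}{\lambda}
```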

  17. Cluster analysis of clinical data identifies fibromyalgia subgroups.

    Directory of Open Access Journals (Sweden)

    Elisa Docampo

    Full Text Available INTRODUCTION: Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. MATERIAL AND METHODS: 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. RESULTS: Variables clustered into three independent dimensions: "symptomatology", "comorbidities" and "clinical scales". Only the first two dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. CONCLUSIONS: We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment.
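
A hedged sketch of the final clustering step described above: k-means with three clusters on the two retained composite indexes. The index values below are random stand-ins for the real per-patient scores, and k-means is an assumption about the unspecified clustering procedure:

```python
# Cluster 1,446 patients into three subgroups using their two composite
# indexes ("symptomatology", "comorbidities"); scores here are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
indexes = rng.normal(size=(1446, 2))   # stand-in for the real composite scores
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(indexes)
# Each label (0, 1, 2) corresponds to one candidate FM subgroup.
```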

  18. Longitudinal Metagenomic Analysis of Hospital Air Identifies Clinically Relevant Microbes

    Science.gov (United States)

    King, Paula; Pham, Long K.; Waltz, Shannon; Sphar, Dan; Yamamoto, Robert T.; Conrad, Douglas; Taplitz, Randy; Torriani, Francesca

    2016-01-01

    We describe the sampling of sixty-three uncultured hospital air samples collected over a six-month period and analysis using shotgun metagenomic sequencing. Our primary goals were to determine the longitudinal metagenomic variability of this environment, identify and characterize genomes of potential pathogens and determine whether they are atypical to the hospital airborne metagenome. Air samples were collected from eight locations which included patient wards, the main lobby and outside. The resulting DNA libraries produced 972 million sequences representing 51 gigabases. Hierarchical clustering of samples by the most abundant 50 microbial orders generated three major nodes which primarily clustered by type of location. Because the indoor locations were longitudinally consistent, episodic relative increases in microbial genomic signatures related to the opportunistic pathogens Aspergillus, Penicillium and Stenotrophomonas were identified as outliers at specific locations. Further analysis of microbial reads specific for Stenotrophomonas maltophilia indicated homology to a sequenced multi-drug resistant clinical strain and we observed broad sequence coverage of resistance genes. We demonstrate that a shotgun metagenomic sequencing approach can be used to characterize the resistance determinants of pathogen genomes that are uncharacteristic for an otherwise consistent hospital air microbial metagenomic profile. PMID:27482891
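
The hierarchical clustering step can be illustrated as follows; the abundance matrix is a synthetic stand-in, and the Bray-Curtis/average-linkage choices are assumptions rather than the authors' stated settings:

```python
# Hierarchical clustering of 63 air samples by the relative abundances of
# their 50 most abundant microbial orders (random stand-in data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
abundance = rng.random((63, 50))                   # samples x top-50 orders
abundance /= abundance.sum(axis=1, keepdims=True)  # relative abundances
tree = linkage(abundance, method="average", metric="braycurtis")
nodes = fcluster(tree, t=3, criterion="maxclust")  # three major nodes
```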

  19. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. Changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system to diagnose an NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)
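
Reactor-noise methods of this kind generally begin from spectral estimates of detector signals; the sketch below shows a generic power-spectral-density step on a synthetic signal (the sampling rate and resonance frequency are invented for illustration and are not from this report):

```python
# Estimate the power spectral density of a noisy detector signal with
# Welch's method and locate the dominant resonance peak.
import numpy as np
from scipy.signal import welch

fs = 200.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)
signal = (np.sin(2 * np.pi * 8.0 * t)        # synthetic 8 Hz "structural" peak
          + 0.5 * np.random.default_rng(2).normal(size=t.size))
freqs, psd = welch(signal, fs=fs, nperseg=4096)
peak_hz = freqs[np.argmax(psd)]              # dominant frequency, ~8 Hz here
```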

  20. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an Ion Beam Analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  1. A Critical Analysis of Anesthesiology Podcasts: Identifying Determinants of Success

    Science.gov (United States)

    Singh, Devin; Matava, Clyde

    2016-01-01

    Background Audio and video podcasts have gained popularity in recent years. Increasingly, podcasts are being used in the field of medicine as a tool to disseminate information. This format has multiple advantages including highly accessible creation tools, low distribution costs, and portability for the user. However, despite its ongoing use in medical education, there are no data describing factors associated with the success or quality of podcasts. Objective The goal of the study was to assess the landscape of anesthesia podcasts in Canada and develop a methodology for evaluating the quality of the podcast. To achieve our objective, we identified the scope of podcasts in anesthesia specifically, constructed an algorithmic model for measuring success, and identified factors linked to both successful podcasts and a peer-review process. Methods Independent reviewers performed a systematic search of anesthesia-related podcasts on iTunes Canada. Data and metrics recorded for each podcast included the podcast’s authorship, number posted, podcast series duration, target audience, topics, and social media presence. Descriptive statistics summarized mined data, and univariate analysis was used to identify factors associated with podcast success and a peer-review process. Results Twenty-two podcasts related to anesthesia were included in the final analysis. Less than a third (6/22=27%) were still active. The median longevity of the podcasts’ series was just 13 months (interquartile range: 1-39 months). Anesthesiologists were the target audience for 77% of podcast series with clinical topics being most commonly addressed. We defined a novel algorithm for measuring success: the Podcast Success Index. Factors associated with a high Podcast Success Index included podcasts targeting fellows (Spearman R=0.434; P=.04), inclusion of professional topics (Spearman R=0.456-0.603; P=.01-.03), and the use of Twitter as a means of social media (Spearman R=0.453; P=.03). In addition, more

  2. Analysis of an Image Secret Sharing Scheme to Identify Cheaters

    Directory of Open Access Journals (Sweden)

    Jung-San Lee

    2010-09-01

    Full Text Available Secret image sharing mechanisms have been widely applied to the military, e-commerce, and communications fields. Zhao et al. recently introduced the concept of cheater detection into image sharing schemes. This functionality enables the image owner and authorized members to identify any cheater in reconstructing the secret image. Here, we provide an analysis of Zhao et al.'s method: an authorized participant is able to restore the secret image by him/herself. This contradicts the requirement of secret image sharing schemes. Although the authorized participant must utilize an exhaustive search to achieve this, simulation results show that it can be done within a reasonable time period.

  3. Identifying avian sources of faecal contamination using sterol analysis.

    Science.gov (United States)

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols which are present in faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation by targeted PCR specific markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
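
The two avian-indicative ratios quoted above translate directly into code; the thresholds are the paper's, while the dictionary keys and the assumption that human/herbivore profiles were ruled out first are illustrative:

```python
# Decision rule for avian faecal pollution from the two sterol ratios.
def is_avian_source(sterols: dict) -> bool:
    """sterols maps sterol names to measured concentrations (same units)."""
    ec_stanol = sterols["24-ethylcholestanol"]
    ratio1 = ec_stanol / (ec_stanol
                          + sterols["24-ethylcoprostanol"]
                          + sterols["24-ethylepicoprostanol"])
    stanol = sterols["cholestanol"]
    ratio2 = stanol / (stanol + sterols["coprostanol"] + sterols["epicoprostanol"])
    # Applies only after herbivore-mammal and human profiles are ruled out.
    return ratio1 >= 0.4 and ratio2 >= 0.5
```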

  4. Social network analysis in identifying influential webloggers: A preliminary study

    Science.gov (United States)

    Hasmuni, Noraini; Sulaiman, Nor Intan Saniah; Zaibidi, Nerda Zura

    2014-12-01

    In recent years, second-generation internet-based services such as weblogs have become an effective communication tool for publishing information on the Web. Weblogs have unique characteristics that deserve users' attention. Some webloggers have seen weblogs as an appropriate medium to initiate and expand business. These webloggers, also known as direct profit-oriented webloggers (DPOWs), communicate and share knowledge with each other through social interaction. However, survivability is the main issue among DPOWs, and frequent communication with influential webloggers is one way to survive as a DPOW. This paper aims to understand the network structure and identify influential webloggers within the network. Proper understanding of the network structure can assist us in knowing how information is exchanged among members and enhance survivability among DPOWs. 30 DPOWs were involved in this study. Degree centrality and betweenness centrality measurements in Social Network Analysis (SNA) were used to examine the strength of relations and identify influential webloggers within the network. Thus, webloggers with the highest values of these measurements are considered the most influential webloggers in the network.
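
A minimal sketch of the two SNA measurements named here, using networkx on a random stand-in for the 30-weblogger network:

```python
# Degree and betweenness centrality on a hypothetical directed graph of
# 30 webloggers linking to one another.
import networkx as nx

g = nx.gnp_random_graph(30, 0.15, seed=3, directed=True)
degree = nx.degree_centrality(g)
betweenness = nx.betweenness_centrality(g)
# The most influential weblogger under each measure:
top_degree = max(degree, key=degree.get)
top_betweenness = max(betweenness, key=betweenness.get)
```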

  5. Application of Spectral Change Detection Techniques to Identify Forest Harvesting Using Landsat TM Data

    OpenAIRE

    Chambers, Samuel David

    2002-01-01

    The main objective of this study was to determine the spectral change technique best suited to detect complete forest harvests (clearcuts) in the Southern United States. In pursuit of this objective, eight existing change detection techniques were quantitatively evaluated and a hybrid method was also developed. Secondary objectives were to determine the impact of atmospheric corrections applied before the change detection, and the effect of post-processing methods to eliminate small groups ...

  6. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the impact of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order-picking processes. The third part of the dissertation proposes a method to classify cities

  7. Identifying a preservation zone using multi–criteria decision analysis

    Directory of Open Access Journals (Sweden)

    Farashi, A.

    2016-03-01

    Full Text Available Zoning of a protected area is an approach to partition a landscape into various land use units. The management of these landscape units can reduce conflicts caused by human activities. Tandoreh National Park is one of the most biologically diverse protected areas in Iran. Although the area is generally designed to protect biodiversity, there are many conflicts between biodiversity conservation and human activities. For instance, the area is highly controversial and has been considered an impediment to local economic development, such as tourism, grazing, road construction, and cultivation. In order to reduce human conflicts with biodiversity conservation in Tandoreh National Park, safe zones need to be established and human activities need to be moved out of these zones. In this study we used a systematic methodology to integrate a participatory process with Geographic Information Systems (GIS) using a multi-criteria decision analysis (MCDA) technique to guide a zoning scheme for the Tandoreh National Park, Iran. Our results show that the northern and eastern parts of the Tandoreh National Park, which were close to rural areas and farmlands, returned less desirability for selection as a preservation area. Rocky mountains were the most important and most degraded areas, and abandoned plains were the least important criterion for preservation in the area. Furthermore, the results reveal that the land properties were considered to be important for protection based on the obtained
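
A generic weighted-overlay sketch of GIS-based MCDA zoning; the criteria, weights, and the top-20% cut-off below are illustrative stand-ins, not the study's values:

```python
# Combine normalized criterion rasters with expert weights, then keep the
# most suitable cells as the candidate preservation zone.
import numpy as np

rng = np.random.default_rng(4)
criteria = {                        # normalized suitability rasters in [0, 1]
    "slope": rng.random((100, 100)),
    "distance_to_villages": rng.random((100, 100)),
    "vegetation": rng.random((100, 100)),
}
weights = {"slope": 0.2, "distance_to_villages": 0.5, "vegetation": 0.3}
suitability = sum(weights[k] * criteria[k] for k in criteria)
preservation_zone = suitability >= np.quantile(suitability, 0.8)  # top 20% of cells
```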

  8. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  9. Can Passive Mobile Application Traffic be Identified using Machine Learning Techniques

    OpenAIRE

    Holland, Peter

    2015-01-01

    Mobile phone applications (apps) can generate background traffic when the end-user is not actively using the app. If this background traffic could be accurately identified, network operators could de-prioritise this traffic and free up network bandwidth for priority network traffic. The background app traffic should have IP packet features that could be utilised by a machine learning algorithm to identify app-generated (passive) traffic as opposed to user-generated (active) traffic. Previous ...
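
A sketch of the idea under stated assumptions: a random forest trained on synthetic per-flow features standing in for real packet statistics such as size and inter-arrival time (the thesis's actual features and algorithm are not specified here):

```python
# Binary classification of app traffic as passive (background) vs. active
# (user-driven) from per-flow IP packet features; data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.random((1000, 4))          # e.g. mean packet size, inter-arrival time, ...
y = rng.integers(0, 2, 1000)       # 1 = passive/background, 0 = active
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```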

  10. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Full Text Available Understanding the signals generated by the brain is one of the main tasks in brain signal processing. Among all neurological disorders, epilepsy is one of the most prevalent, and an automated artificial-intelligence detection technique is essential because of the erratic and unpredictable nature of epileptic seizures. We propose an Improved Fuzzy Firefly algorithm, which enhances the classification of brain signals efficiently with a minimum number of iterations. An important clustering technique based on fuzzy logic is Fuzzy C-means. Features obtained from multichannel EEG signals are combined in both the feature domain and the spatial domain by means of fuzzy algorithms, and the firefly algorithm is applied to optimize the Fuzzy C-means membership function for a more precise segmentation process. Convergence criteria are set so that the clustering remains efficient. Overall, the proposed technique yields more accurate results, giving it an edge over other techniques. The results of the proposed algorithm are compared with other algorithms such as the Fuzzy C-means algorithm and the PSO algorithm.
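
A minimal NumPy sketch of plain Fuzzy C-means, the clustering technique the abstract builds on; the firefly optimization of the membership function is not reproduced here:

```python
# Plain Fuzzy C-means: alternate between updating cluster centers from
# weighted memberships and memberships from distances to the centers.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)              # fuzzy memberships
    for _ in range(iters):
        um = u ** m
        centers = um.T @ X / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / (d ** (2 / (m - 1)))         # u_ik proportional to d^(-2/(m-1))
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.abs(new_u - u).max() < tol:
            break
        u = new_u
    return centers, u
```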

  11. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases security concerns. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. Then we identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  12. Directional reflectance analysis for identifying counterfeit drugs: Preliminary study.

    Science.gov (United States)

    Wilczyński, Sławomir; Koprowski, Robert; Błońska-Fajfrowska, Barbara

    2016-05-30

    The WHO estimates that up to 10% of drugs on the market may be counterfeit. In order to prevent intensification of the phenomenon of drug counterfeiting, methods for distinguishing genuine medicines from fake ones need to be developed. The aim of this study was to develop a simple, reproducible and inexpensive method for distinguishing between original and counterfeit medicines based on the measurement of directional reflectance. The directional reflectance of 6 original Viagra(®) tablets (Pfizer) and 24 counterfeit tablets (4 different batches, imitating Viagra(®)) was examined in six spectral bands: from 0.9 to 1.1 μm, from 1.9 to 2.6 μm, from 3.0 to 4.0 μm, from 3.0 to 5.0 μm, from 4.0 to 5.0 μm, and from 8.0 to 12.0 μm, and for two angles of incidence, 20° and 60°. A directional hemispherical reflectometer was used to measure directional reflectance. Significant statistical differences between the directional reflectance of the original Viagra(®) and counterfeit tablets were registered. Any difference in the value of directional reflectance for any spectral band or angle of incidence identifies the drug as a fake. The proposed method of directional reflectance analysis makes it possible to differentiate between real Viagra(®) and fake tablets. Directional reflectance analysis is a fast (measurement time under 5 s), cheap and reproducible method which does not require expensive equipment or specialized laboratory staff. It also appears to be an effective method, although its effectiveness will be fully assessed after the research is extended. PMID:26977587

  13. Techniques for identifying cross-disciplinary and 'hard-to-detect' evidence for systematic review.

    Science.gov (United States)

    O'Mara-Eves, Alison; Brunton, Ginny; McDaid, David; Kavanagh, Josephine; Oliver, Sandy; Thomas, James

    2014-03-01

    Driven by necessity in our own complex review, we developed alternative systematic ways of identifying relevant evidence where the key concepts are generally not focal to the primary studies' aims and are found across multiple disciplines-that is, hard-to-detect evidence. Specifically, we sought to identify evidence on community engagement in public health interventions that aim to reduce health inequalities. Our initial search strategy used text mining to identify synonyms for the concept 'community engagement'. We conducted a systematic search for reviews on public health interventions, supplemented by searches of trials databases. We then used information in the reviews' evidence tables to gather more information about the included studies than was evident in the primary studies' own titles or abstracts. We identified 319 primary studies cited in reviews after full-text screening. In this paper, we retrospectively reflect on the challenges and benefits of the approach taken. We estimate that more than a quarter of the studies that were identified would have been missed by typical searching and screening methods. This identification strategy was highly effective and could be useful for reviews of broad research questions, or where the key concepts are unlikely to be the main focus of primary research. PMID:26054025

  14. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and the results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need to develop an approach which can reflect the relationship between the design parameters and a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
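
A one-at-a-time sensitivity sketch matching the idea above; the impact model and parameter set are hypothetical stand-ins for a real LCA calculation:

```python
# Perturb each design parameter by +10% and record the relative change in
# a (hypothetical) environmental-impact model; the largest changes mark
# the key environmental performance factors.
import numpy as np

def impact_model(params: dict) -> float:
    # Stand-in for an LCA impact calculation (e.g. kg CO2-eq).
    return (2.0 * params["panel_area"]
            + 0.5 * params["copper_mass"]
            + 0.1 * params["energy_use"])

baseline = {"panel_area": 1.0, "copper_mass": 3.0, "energy_use": 10.0}
base_impact = impact_model(baseline)
sensitivity = {}
for name in baseline:
    bumped = dict(baseline, **{name: baseline[name] * 1.10})
    sensitivity[name] = (impact_model(bumped) - base_impact) / base_impact
```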

  15. Global secretome analysis identifies novel mediators of bone metastasis

    Institute of Scientific and Technical Information of China (English)

    Mario Andres Blanco; Gary LeRoy; Zia Khan; Maša Alečković; Barry M Zee; Benjamin A Garcia; Yibin Kang

    2012-01-01

    Bone is one of the most common sites of distant metastasis of solid tumors. Secreted proteins are known to influence pathological interactions between metastatic cancer cells and the bone stroma. To comprehensively profile secreted proteins associated with bone metastasis, we used quantitative and non-quantitative mass spectrometry to globally analyze the secretomes of nine cell lines of varying bone metastatic ability from multiple species and cancer types. By comparing the secretomes of parental cells and their bone metastatic derivatives, we identified the secreted proteins that were uniquely associated with bone metastasis in these cell lines. We then incorporated bioinformatic analyses of large clinical metastasis datasets to obtain a list of candidate novel bone metastasis proteins of several functional classes that were strongly associated with both clinical and experimental bone metastasis. Functional validation of selected proteins indicated that in vivo bone metastasis can be promoted by high expression of (1) the salivary cystatins CST1, CST2, and CST4; (2) the plasminogen activators PLAT and PLAU; or (3) the collagen functionality proteins PLOD2 and COL6A1. Overall, our study has uncovered several new secreted mediators of bone metastasis and therefore demonstrated that secretome analysis is a powerful method for identification of novel biomarkers and candidate therapeutic targets.

  16. Applications of nuclear analytical techniques for identifying the origin and composition of found radioactive materials

    International Nuclear Information System (INIS)

    Radioactive materials and sources have been used worldwide for the last 100 years - for medical diagnosis and therapy, industrial imaging and process monitoring, consumer applications, materials and biological research, and for generating nuclear energy - among other peaceful purposes. Many of the radioactive materials have been produced, and the associated nuclear science and technology developed, at major research sites such as the Chalk River Laboratories in Ontario, Canada. Sometimes undocumented radioactive materials associated with production, development or use are found, usually in the context of a legacy setting, and their composition and origin needs to be determined in order for these materials to be safely handled and securely dispositioned. The novel applications of nuclear analytical techniques, including mass spectroscopy, gamma and x-ray spectroscopy and neutron beam irradiation techniques, is presented in the context of some recent investigations. (author)

  17. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature of cyberspace and the abuse of anonymity there, it is difficult to trace criminal identities in cybercrime investigations. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG seeded GA based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author-level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real-world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in 2% and 8% improvements over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable in terms of the number of authors than author-group-level based methods.
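
The writeprint representation is easy to sketch: variable-length character n-grams with a linear SVM baseline (IGAE itself, the paper's GA-based ensemble, is not reproduced; the documents and labels below are toys):

```python
# Character n-gram (lengths 1-3) features with a linear SVM baseline for
# authorship attribution.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

docs = ["I loved this product, shipping was fast!",
        "Terrible quality. Would not buy again.",
        "Fast shipping, decent quality overall."]
authors = ["a1", "a2", "a1"]                       # toy labels
vec = CountVectorizer(analyzer="char", ngram_range=(1, 3))
X = vec.fit_transform(docs)
clf = LinearSVC().fit(X, authors)
pred = clf.predict(vec.transform(["Loved it, shipping was quick!"]))
```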

  18. Identifying and ranking the human resources management criteria influencing on organizational performance using MADM Fuzzy techniques

    OpenAIRE

    Saeed Safari; Mohammad Vazin Karimian; Ali Khosravi

    2014-01-01

    Human resources management plays an essential role in the success of organizations. This paper presents an empirical investigation to determine the main criteria and sub-criteria of human resource management based on a survey of the existing literature and theoretical principles. The study has been applied in a municipality organization in Iran. The study uses the analytical hierarchy process as well as the fuzzy technique for order preference by similarity to ideal solution (TOPSIS) for prioritizing decisio...

  19. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  20. Methods and Techniques of Sampling, Culturing and Identifying of Subsurface Bacteria

    International Nuclear Information System (INIS)

    This report describes the sampling, culturing and identifying of KURT underground bacteria, which exist as iron-, manganese-, and sulfate-reducing bacteria. The methods of culturing and media preparation differed by bacterial species, which affects bacterial growth rates. It will be possible for the cultured bacteria to be used for various applied experiments and research in the future.

  1. Real-time analysis application for identifying bursty local areas related to emergency topics.

    Science.gov (United States)

    Sakai, Tatsuhiro; Tamura, Keiichi

    2015-01-01

    Since social media started getting more attention from users on the Internet, it has become one of the most important information sources in the world. With the increasing popularity of social media, data posted on social media sites are rapidly becoming collective intelligence, a term used to refer to the new media that is displacing traditional media. In this paper, we focus on geotagged tweets on the Twitter site. These geotagged tweets are referred to as georeferenced documents because they include not only a short text message, but also the document's posting time and location. Many researchers have been tackling the development of new data mining techniques for georeferenced documents to identify and analyze emergency topics, such as natural disasters, weather, diseases, and other incidents. In particular, the utilization of geotagged tweets to identify and analyze natural disasters has received much attention from administrative agencies recently because some case studies have achieved compelling results. In this paper, we propose a novel real-time analysis application for identifying bursty local areas related to emergency topics. The aim of our new application is to provide a new platform that can identify and analyze the localities of emergency topics. The proposed application is composed of three core computational intelligence techniques: the Naive Bayes classifier technique, the spatiotemporal clustering technique, and the burst detection technique. Moreover, we have implemented two types of application interface: a Web application interface and an Android application interface. To evaluate the proposed application, we implemented a real-time weather observation system embedding the proposed application, and used actual geotagged tweets crawled from the Twitter site. The weather observation system successfully detected bursty local areas related to observed emergency weather topics. PMID:25918679
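
A simplified stand-in for the burst-detection component only: flag a local area as bursty when its tweet rate in a window exceeds the historical mean by k standard deviations. The paper's actual detector and its Naive Bayes and spatiotemporal clustering stages are omitted:

```python
# Threshold-based burst detection on per-window tweet counts for one area.
import numpy as np

def bursty_windows(counts: np.ndarray, k: float = 1.5):
    """Return indices of windows whose count exceeds mean + k*std."""
    mean, std = counts.mean(), counts.std() + 1e-9
    return np.flatnonzero(counts > mean + k * std)

counts = np.array([4, 5, 3, 6, 4, 5, 40, 38, 5, 4])  # synthetic burst at t=6,7
print(bursty_windows(counts))                         # -> [6 7]
```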

  2. Identifying desertification risk areas using fuzzy membership and geospatial technique – A case study, Kota District, Rajasthan

    Indian Academy of Sciences (India)

    Arunima Dasgupta; K L N Sastry; P S Dhinwa; V S Rathore; M S Nathawat

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research intends to identify the areas under risk of desertification along with their severity in terms of degradation in natural parameters. An integrated model with fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, including five specific natural parameters, namely slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a certain class was derived using the normal probability density function of that class. Thus, if a single class of a single parameter has mean μ and standard deviation σ, the values falling beyond μ + 2σ and μ − 2σ do not represent that class, but a transitional zone between two subsequent classes. These are the most important areas in terms of degradation, as they have the lowest probability of being in a certain class, hence the highest probability of being extended into the next class or narrowed down into the previous one. Consequently, these are the values which can be most easily altered under exogenic influences, and hence they are identified as risk areas. The overall desertification risk is derived by incorporating the different risk severities of each parameter using a fuzzy rule-based inference system in a GIS environment. Multi-criteria-based geostatistics are applied to locate the areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures are accelerating land deterioration, coupled with natural erosive forces. The four major sources of desertification in Kota are gully and ravine erosion, inappropriate mining practices, growing urbanization and random deforestation.
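
The membership rule described above can be sketched directly: class membership is taken from the class's normal density, and values beyond μ ± 2σ are flagged as transitional, at-risk zones (the parameter values below are invented for illustration):

```python
# Gaussian class membership and the mu +/- 2*sigma transitional-zone flag.
import numpy as np
from scipy.stats import norm

def class_membership(x: np.ndarray, mu: float, sigma: float) -> np.ndarray:
    # Density normalized so the class mean has membership 1.0.
    return norm.pdf(x, mu, sigma) / norm.pdf(mu, mu, sigma)

ndvi = np.array([0.21, 0.35, 0.48, 0.61, 0.74])
membership = class_membership(ndvi, mu=0.45, sigma=0.10)
at_risk = np.abs(ndvi - 0.45) > 2 * 0.10   # outside mu +/- 2*sigma
```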

  3. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications - now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  4. Image processing techniques for identifying Mycobacterium tuberculosis in Ziehl-Neelsen stains.

    Science.gov (United States)

    Sadaphal, P; Rao, J; Comstock, G W; Beg, M F

    2008-05-01

    Worldwide, laboratory technicians tediously read sputum smears for tuberculosis (TB) diagnosis. We demonstrate proof of principle of an innovative computational algorithm that successfully recognizes Ziehl-Neelsen (ZN) stained acid-fast bacilli (AFB) in digital images. Automated, multi-stage, color-based Bayesian segmentation identified possible 'TB objects', removed artifacts by shape comparison and color-labeled objects as 'definite', 'possible' or 'non-TB', bypassing photomicrographic calibration. Superimposed AFB clusters, extreme stain variation and low depth of field were challenges. Our novel method facilitates electronic diagnosis of TB, permitting wider application in developing countries where fluorescent microscopy is currently inaccessible and unaffordable. We plan refinement and validation in the future.
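
As a rough stand-in for the color-based segmentation step (the paper uses a Bayesian color classifier, not this fixed rule), red/magenta bacilli can be isolated with an HSV threshold in OpenCV; the filename, thresholds and area cut-off are hypothetical:

```python
# HSV thresholding for the red/magenta ZN stain, then contour filtering
# to collect candidate bacilli.
import cv2
import numpy as np

img = cv2.imread("zn_field.png")                     # hypothetical slide image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
lo1, hi1 = np.array([0, 80, 60]), np.array([10, 255, 255])    # red wraps around
lo2, hi2 = np.array([160, 80, 60]), np.array([179, 255, 255]) # hue 0 in HSV
mask = cv2.inRange(hsv, lo1, hi1) | cv2.inRange(hsv, lo2, hi2)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
candidates = [c for c in contours if cv2.contourArea(c) > 20]  # possible bacilli
```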

  5. Identifying and ranking the human resources management criteria influencing on organizational performance using MADM Fuzzy techniques

    Directory of Open Access Journals (Sweden)

    Saeed Safari

    2014-07-01

    Full Text Available Human resources management plays an essential role in the success of organizations. This paper presents an empirical investigation to determine the main criteria and sub-criteria of human resource management based on a survey of the existing literature and theoretical principles. The study has been applied in a municipality organization in Iran. The study uses the analytical hierarchy process as well as the fuzzy technique for order preference by similarity to ideal solution (TOPSIS) for prioritizing decision tree criteria. The results indicate that the job design and human resource planning criteria are ranked as the highest. In addition, employee recruitment and selection, employee health and hygiene, training and development, and compensation system criteria are other important criteria.

  6. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), and watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and then the watershed technique is used. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about the possibility of region segmentation for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.

  7. Combining digital watermarking and fingerprinting techniques to identify copyrights for color images.

    Science.gov (United States)

    Hsieh, Shang-Lin; Chen, Chun-Che; Shen, Wen-Shan

    2014-01-01

    This paper presents a copyright identification scheme for color images that takes advantage of the complementary nature of watermarking and fingerprinting. It utilizes an authentication logo and the extracted features of the host image to generate a fingerprint, which is then stored in a database and also embedded in the host image to produce a watermarked image. When a dispute over the copyright of a suspect image occurs, the image is first processed by watermarking. If the watermark can be retrieved from the suspect image, the copyright can then be confirmed; otherwise, the watermark then serves as the fingerprint and is processed by fingerprinting. If a match in the fingerprint database is found, then the suspect image will be considered a duplicated one. Because the proposed scheme utilizes both watermarking and fingerprinting, it is more robust than those that only adopt watermarking, and it can also obtain the preliminary result more quickly than those that only utilize fingerprinting. The experimental results show that when the watermarked image suffers slight attacks, watermarking alone is enough to identify the copyright. The results also show that when the watermarked image suffers heavy attacks that render watermarking incompetent, fingerprinting can successfully identify the copyright, hence demonstrating the effectiveness of the proposed scheme.

  8. Combining Digital Watermarking and Fingerprinting Techniques to Identify Copyrights for Color Images

    Directory of Open Access Journals (Sweden)

    Shang-Lin Hsieh

    2014-01-01

    Full Text Available This paper presents a copyright identification scheme for color images that takes advantage of the complementary nature of watermarking and fingerprinting. It utilizes an authentication logo and the extracted features of the host image to generate a fingerprint, which is then stored in a database and also embedded in the host image to produce a watermarked image. When a dispute over the copyright of a suspect image occurs, the image is first processed by watermarking. If the watermark can be retrieved from the suspect image, the copyright can then be confirmed; otherwise, the watermark then serves as the fingerprint and is processed by fingerprinting. If a match in the fingerprint database is found, then the suspect image will be considered a duplicated one. Because the proposed scheme utilizes both watermarking and fingerprinting, it is more robust than those that only adopt watermarking, and it can also obtain the preliminary result more quickly than those that only utilize fingerprinting. The experimental results show that when the watermarked image suffers slight attacks, watermarking alone is enough to identify the copyright. The results also show that when the watermarked image suffers heavy attacks that render watermarking incompetent, fingerprinting can successfully identify the copyright, hence demonstrating the effectiveness of the proposed scheme.

  9. Integrating subpathway analysis to identify candidate agents for hepatocellular carcinoma.

    Science.gov (United States)

    Wang, Jiye; Li, Mi; Wang, Yun; Liu, Xiaoping

    2016-01-01

    Hepatocellular carcinoma (HCC) is the second most common cause of cancer-associated death worldwide, characterized by a high invasiveness and resistance to normal anticancer treatments. The need to develop new therapeutic agents for HCC is urgent. Here, we developed a bioinformatics method to identify potential novel drugs for HCC by integrating HCC-related and drug-affected subpathways. By using the RNA-seq data from the TCGA (The Cancer Genome Atlas) database, we first identified 1,763 differentially expressed genes between HCC and normal samples. Next, we identified 104 significant HCC-related subpathways. We also identified the subpathways associated with small molecular drugs in the CMap database. Finally, by integrating HCC-related and drug-affected subpathways, we identified 40 novel small molecular drugs capable of targeting these HCC-involved subpathways. In addition to previously reported agents (ie, calmidazolium), our method also identified potentially novel agents for targeting HCC. We experimentally verified that one of these novel agents, prenylamine, induced HCC cell apoptosis using 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide, an acridine orange/ethidium bromide stain, and electron microscopy. In addition, we found that prenylamine not only affected several classic apoptosis-related proteins, including Bax, Bcl-2, and cytochrome c, but also increased caspase-3 activity. These candidate small molecular drugs identified by us may provide insights into novel therapeutic approaches for HCC. PMID:27022281

  10. A computational technique to identify the optimal stiffness matrix for a discrete nuclear fuel assembly model

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu, E-mail: nkpark@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Joo, E-mail: kyoungjoo@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Hong, E-mail: kyounghong@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Suh, Jung-Min, E-mail: jmsuh@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)

    2013-02-15

    Highlights: ► An identification method of the optimal stiffness matrix for a fuel assembly structure is discussed. ► The least squares optimization method is introduced, and a closed form solution of the problem is derived. ► The method can be expanded to a system with a limited number of modes. ► Identification error due to the perturbed mode shape matrix is analyzed. ► Verification examples show that the proposed procedure leads to a reliable solution. -- Abstract: A reactor core structural model which is used to evaluate the structural integrity of the core contains nuclear fuel assembly models. Since the reactor core consists of many nuclear fuel assemblies, the use of a refined fuel assembly model leads to a considerable amount of computing time for performing nonlinear analyses such as the prediction of seismic-induced vibration behaviors. The computational time could be reduced by replacing the detailed fuel assembly model with a simplified model that has fewer degrees of freedom, but the dynamic characteristics of the detailed model must be maintained in the simplified model. Such a model, based on an optimal design method, is proposed in this paper. That is, when a mass matrix and a mode shape matrix are given, the optimal stiffness matrix of a discrete fuel assembly model can be estimated by applying the least squares minimization method. The verification of the method is completed by comparing test results and simulation results. This paper shows that the simplified model's dynamic behaviors are quite similar to experimental results and that the suggested method is suitable for identifying a reliable mathematical model for fuel assemblies.
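
    The least-squares estimate described in this abstract can be sketched directly: given a mass matrix M, a mode-shape matrix Phi and natural frequencies, the stiffness matrix K must satisfy K·Phi = M·Phi·Lambda, and a symmetric least-squares solution follows from the pseudo-inverse. A minimal numpy sketch under these assumptions (not the authors' exact closed-form derivation):

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def estimate_stiffness(M, Phi, omegas):
        """Least-squares stiffness estimate from K @ Phi = M @ Phi @ Lam,
        where Lam holds the squared natural frequencies (rad/s)^2."""
        Lam = np.diag(np.asarray(omegas) ** 2)
        B = M @ Phi @ Lam                 # K must map Phi onto B
        K = B @ np.linalg.pinv(Phi)       # minimizes ||K @ Phi - B||_F
        return 0.5 * (K + K.T)            # enforce symmetry

    # Toy 3-DOF check with made-up mass and stiffness matrices
    M = np.diag([2.0, 2.0, 1.0])
    K_true = np.array([[ 4.0, -2.0,  0.0],
                       [-2.0,  4.0, -2.0],
                       [ 0.0, -2.0,  2.0]])
    w2, Phi = eigh(K_true, M)             # generalized eigenproblem K x = w2 M x
    print(np.allclose(estimate_stiffness(M, Phi, np.sqrt(w2)), K_true))
    ```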

  11. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  12. Hybrid chemical and nondestructive analysis technique

    International Nuclear Information System (INIS)

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  13. Hybrid chemical and nondestructive-analysis technique

    Energy Technology Data Exchange (ETDEWEB)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities.

  14. Identifying Factors and Techniques to Decrease the Positive Margin Rate in Partial Mastectomies: Have We Missed the Mark?

    Science.gov (United States)

    Edwards, Sara B; Leitman, I Michael; Wengrofsky, Aaron J; Giddins, Marley J; Harris, Emily; Mills, Christopher B; Fukuhara, Shinichi; Cassaro, Sebastiano

    2016-05-01

    Breast conservation therapy (BCT) has a reported incidence of positive margins ranging widely in the literature from 20% to 70%. Efforts have been made to refine standards for partial mastectomy and to predict which patients are at highest risk for incomplete excision. Most have focused on histology and demographics. We sought to further define modifiable risk factors for positive margins and residual disease. A retrospective study was conducted of 567 consecutive partial mastectomies by 21 breast and general surgeons from 2009 to 2012. Four hundred fourteen cases of neoplasm were reviewed for localization, intraoperative assessment, excision technique, rates, and results of re-excision/mastectomy. Histologic margins were positive in 23% of patients, 25% had margins 0.1-0.9 mm, and 7% had tumor within 1-1.9 mm. Residual tumor was identified in 61 cases: 38% (disease at margin), 21% (0.1-0.9 mm), and 14% (1-1.9 mm). Ductal carcinoma in situ (DCIS) was present in 85% of residual disease on re-excision and correlated with higher rates of re-excision. Localization of neoplasms requiring more than one needle was associated with 2-3 times the likelihood of positive margins compared with cases in which a single needle was required. The removal of additional margins at initial surgery correlated with improved rates of complete excision when DCIS was present. Patients must have careful analysis of specimen margins at the time of surgery and may benefit from additional tissue excision or routine shaving of the cavity of resection. Surgeons should conduct careful patient selection for BCT in the context of multifocal and multicentric disease. Patients for whom tumor localization requires bracketing may be at higher risk for positive margins and residual disease and should be counseled accordingly. PMID:26854189

  15. Market Analysis Identifies Community and School Education Goals.

    Science.gov (United States)

    Lindle, Jane C.

    1989-01-01

    Principals must realize the positive effects that marketing can have on improving schools and building support for them. Market analysis forces clarification of the competing needs and interests present in the community. The four marketing phases are needs assessment, analysis, goal setting, and public relations and advertising. (MLH)

  16. Book Review: Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects

    OpenAIRE

    Thomas Nash

    2013-01-01

    Shavers, B. (2013). Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Waltham, MA: Elsevier, 290 pages, ISBN-978-1-59749-985-9, US$51.56. Includes bibliographical references and index. Reviewed by Detective Corporal Thomas Nash, Burlington Vermont Police Department, Internet Crime against Children Task Force. Adjunct Instructor, Champlain College, Burlington VT. In this must read for any aspiring novice...

  17. Techniques for identifying the applicability of new information management technologies in the clinical setting: an example focusing on handheld computers.

    OpenAIRE

    Sittig, D. F.; Jimison, H. B.; Hazlehurst, B. L.; Churchill, B. E.; Lyman, J. A.; Mailhot, M. F.; Quick, E. A.; Simpson, D A

    2000-01-01

    This article describes techniques and strategies used to judge the potential applicability of new information management technologies in the clinical setting and to develop specific design recommendations for new features and services. We focus on a project carried out to identify the potential uses of handheld computers (i.e., the Palm Pilot or a small WinCE-based device) in the ambulatory practice setting. We found that the potential for a robust handheld computing device to positively affe...

  18. Paired cost comparison, a benchmarking technique for identifying areas of cost improvement in environmental restoration projects and waste management activities

    International Nuclear Information System (INIS)

    This paper provides an overview of benchmarking and how the Department of Energy's Office of Environmental Restoration and Waste Management used benchmarking techniques, specifically the Paired Cost Comparison, to identify cost disparities and their causes. The paper includes a discussion of the project categories selected for comparison and the criteria used to select the projects. Results are presented and factors that contribute to cost differences are discussed. Also, conclusions and the application of the Paired Cost Comparison are presented

  19. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available in an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often necessary for making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. A variety of approaches to the analysis of natural language texts exist, but most of them do not provide results that can be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  20. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis.

    Science.gov (United States)

    Hawkes, Corinna

    2009-07-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in terms of food availability, prices, and marketing. Along with established forms of supply chain analysis, the method is informed by a historical overview of how food supply chains have changed over time. The method posits that the actors and actions in the chain are affected by organizational, financial, technological, and policy incentives and disincentives, which can in turn be levered for change. It presents a preliminary example of the supply of Coca-Cola beverages into school vending machines and identifies further potential applications. These include fruit and vegetable supply chains, local food chains, supply chains for health-promoting versions of food products, and identifying financial incentives in supply chains for healthier eating. PMID:23144674

  1. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
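
    A minimal sketch of the Monte Carlo parameter variation idea, with a stand-in unfold function and assumed one-sigma channel errors (the real Dante unfold algorithm and calibration data are not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def unfold(voltages):
        # Stand-in for the real Dante unfold algorithm (not public here):
        # pretend the flux is a fixed linear combination of channel voltages.
        return voltages @ np.array([1.0, 2.0, 4.0, 8.0])

    v_measured = np.array([1.02, 0.85, 0.47, 0.22])  # hypothetical channel voltages
    sigma = 0.05 * v_measured                        # assumed one-sigma channel errors

    # One thousand test voltage sets, as in the method described above
    fluxes = np.array([unfold(rng.normal(v_measured, sigma)) for _ in range(1000)])
    print(f"flux = {fluxes.mean():.2f} +/- {fluxes.std(ddof=1):.2f} (arb. units)")
    ```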

  2. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  3. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  4. Rice transcriptome analysis to identify possible herbicide quinclorac detoxification genes

    OpenAIRE

    Xu, Wenying; Di, Chao; Zhou, Shaoxia; Liu, Jia; LI Li; Liu, Fengxia; Yang, Xinling; Ling, Yun; Su, Zhen

    2015-01-01

    Quinclorac is a highly selective auxin-type herbicide and is widely used in the effective control of barnyard grass in paddy rice fields, improving the world's rice yield. The herbicide mode of action of quinclorac has been proposed, and hormone interactions affecting quinclorac signaling has been identified. Because of widespread use, quinclorac may be transported outside rice fields with the drainage waters, leading to soil and water pollution and other environmental health problems. In thi...

  5. Identifiability analysis of the CSTR river water quality model.

    Science.gov (United States)

    Chen, J; Deng, Y

    2006-01-01

    Conceptual river water quality models are widely known to lack identifiability. The causes include model structure errors, observational errors and infrequent sampling. Although significant efforts have been directed towards better identification of river water quality models, it is often not clear whether a given model is structurally identifiable. Information is also limited regarding the contribution of the different sources of unidentifiability. Taking the widely applied CSTR river water quality model as an example, this paper presents a theoretical proof that the CSTR model is indeed structurally identifiable. Its uncertainty thus stems mainly from observational errors and infrequent sampling. Given the current monitoring accuracy and sampling frequency, the unidentifiability arising from sampling frequency is found to be more significant than that from observational errors. It is also noted that there is a crucial sampling interval between 0.1 and 1 day, beyond which the simulated river system could be represented by different illusions and the model application could be far less reliable.
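
    The influence of sampling frequency on identifiability can be illustrated with a first-order CSTR-type model: fitting the decay rate from synthetic observations at two sampling intervals shows how the parameter uncertainty grows as samples become sparser. All parameter values below are hypothetical, not the paper's case study:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def cstr(t, k, c_in=10.0, q_over_v=0.3):
        # Constant-inflow CSTR with first-order decay:
        # dC/dt = q_over_v * (c_in - C) - k * C, with C(0) = 0
        a = q_over_v + k
        return q_over_v * c_in / a * (1.0 - np.exp(-a * t))

    rng = np.random.default_rng(1)
    k_true = 0.5
    for dt in (0.1, 1.0):                       # sampling interval in days
        t = np.arange(0.0, 10.0 + dt, dt)
        obs = cstr(t, k_true) + rng.normal(0, 0.05, t.size)
        (k_hat,), cov = curve_fit(lambda tt, k: cstr(tt, k), t, obs, p0=[0.1])
        print(f"dt={dt}: k = {k_hat:.3f} +/- {np.sqrt(cov[0, 0]):.3f}")
    ```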

  6. Association analysis identifies ZNF750 regulatory variants in psoriasis

    Directory of Open Access Journals (Sweden)

    Birnbaum Ramon Y

    2011-12-01

    Full Text Available Abstract Background Mutations in the ZNF750 promoter and coding regions have been previously associated with Mendelian forms of psoriasis and psoriasiform dermatitis. ZNF750 encodes a putative zinc finger transcription factor that is highly expressed in keratinocytes and represents a candidate psoriasis gene. Methods We examined whether ZNF750 variants were associated with psoriasis in a large case-control population. We sequenced the promoter and exon regions of ZNF750 in 716 Caucasian psoriasis cases and 397 Caucasian controls. Results We identified a total of 47 variants, including 38 rare variants of which 35 were novel. Association testing identified two ZNF750 haplotypes associated with psoriasis. Rare ZNF750 promoter and 5' UTR variants displayed a 35-55% reduction of ZNF750 promoter activity, consistent with the promoter activity reduction seen in a Mendelian psoriasis family with a ZNF750 promoter variant. However, the rare promoter and 5' UTR variants identified in this study did not strictly segregate with the psoriasis phenotype within families. Conclusions Two haplotypes of ZNF750 and rare 5' regulatory variants of ZNF750 were found to be associated with psoriasis. These rare 5' regulatory variants, though not causal, might serve as a genetic modifier of psoriasis.

  7. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. Based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms, a modified algorithm is presented. To apply the algorithm, a network representation transformation is made first.
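
    A minimal sketch of the core idea, assuming association strengths in (0, 1]: the strongest association path maximizes the product of link strengths, which is equivalent to a shortest path under weights -log(strength). The paper's two-tree Dijkstra and PFS refinements are not reproduced here:

    ```python
    import heapq, math

    def strongest_path(graph, src, dst):
        """graph: {node: {neighbor: strength in (0, 1]}}.
        Maximizes the product of link strengths via Dijkstra on -log weights."""
        dist, prev = {src: 0.0}, {}
        pq = [(0.0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == dst:
                break
            if d > dist.get(u, math.inf):
                continue                      # stale queue entry
            for v, s in graph[u].items():
                nd = d - math.log(s)
                if nd < dist.get(v, math.inf):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))
        path, node = [dst], dst
        while node != src:                    # walk predecessors back to source
            node = prev[node]
            path.append(node)
        return path[::-1], math.exp(-dist[dst])

    # Hypothetical network of entities with association strengths
    g = {"A": {"B": 0.9, "C": 0.4},
         "B": {"A": 0.9, "D": 0.8},
         "C": {"A": 0.4, "D": 0.9},
         "D": {"B": 0.8, "C": 0.9}}
    print(strongest_path(g, "A", "D"))   # -> ['A', 'B', 'D'], strength ~0.72
    ```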

  8. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  9. Psychoanalytic technique and 'analysis terminable and interminable'.

    Science.gov (United States)

    Sandler, J

    1988-01-01

    Some of the implications for psychoanalytic technique of the papers given at the plenary sessions of the Montreal Congress are considered. Emphasis is placed on the role of affects in development and in current psychic functioning. Motivation for unconscious wishes arises from many sources, and affects should not only be thought of as drive derivatives. There is a substantial gap between the (largely) implicit clinico-technical theories in the analytic work presented, which do in fact show great sensitivity to the patients' affects, and the formal 'official' general psychoanalytic theory used. This discrepancy in our theories should be faced. Freud's tripartite structural theory of the mind (the 'second topography') seems now to have limitations for clinical purposes. PMID:3063676

  10. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain operational modal identification global scheme and a frequency-domain scheme from output-only data by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
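
    The property exploited above, namely that output correlations of a randomly excited structure decay like free vibrations, can be illustrated for a single mode with toy parameters (a 2 Hz, 2% damped oscillator; not the airplane-model data):

    ```python
    import numpy as np

    # Simulate a randomly excited single-DOF system (assumed toy values)
    fs, n, fn, zeta = 100.0, 200_000, 2.0, 0.02
    wn, dt = 2 * np.pi * fn, 1.0 / 100.0
    rng = np.random.default_rng(2)
    disp, vel = 0.0, 0.0
    y = np.empty(n)
    for i in range(n):                    # semi-implicit Euler, white-noise forcing
        acc = rng.normal() - 2 * zeta * wn * vel - wn**2 * disp
        vel += acc * dt
        disp += vel * dt
        y[i] = disp

    # The auto-correlation of the random response decays like a free vibration
    spec = np.abs(np.fft.rfft(y)) ** 2
    r = np.fft.irfft(spec)[:400]          # circular auto-correlation, first 4 s

    # Successive zero crossings of the damped cosine are half a period apart
    zc = np.where(np.diff(np.sign(r)) != 0)[0]
    print(f"identified frequency ~ {1.0 / (2 * np.mean(np.diff(zc)) * dt):.2f} Hz")
    ```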

  11. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    Science.gov (United States)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established and more speculative techniques; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  12. Comparison of Commonly Used Accident Analysis Techniques for Manufacturing Industries

    Directory of Open Access Journals (Sweden)

    IRAJ MOHAMMADFAM

    2015-10-01

    Full Text Available The adverse consequences of major accident events have led to the development of accident analysis techniques to investigate accidents thoroughly. However, each technique has its own advantages and shortcomings, which makes it very difficult to find a single technique capable of analyzing all types of accidents. Therefore, comparing accident analysis techniques helps establish their capabilities in different circumstances so that the most suitable one can be chosen. In this research, the CBA and AABF techniques were compared with Tripod β in order to determine the superior technique for the analysis of major accidents in manufacturing industries. In the first step, the comparison criteria were developed using the Delphi method. Afterwards, the relative importance of each criterion was qualitatively determined, and the qualitative values were then converted to quantitative values by applying fuzzy triangular numbers. Finally, TOPSIS was used to prioritize the techniques in terms of the preset criteria. The results of the study showed that Tripod β is superior to CBA and AABF. It is highly recommended to compare all available accident analysis techniques against proper criteria in order to select the best one, as an improper choice of accident analysis technique may lead to misguided results.
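
    The TOPSIS ranking step can be sketched compactly. The weights, scores and benefit/cost designations below are hypothetical placeholders, not the study's Delphi- and fuzzy-derived values:

    ```python
    import numpy as np

    def topsis(scores, weights, benefit):
        """scores: alternatives x criteria; weights sum to 1;
        benefit[j] is True if higher is better on criterion j."""
        norm = scores / np.linalg.norm(scores, axis=0)     # vector normalization
        v = norm * weights
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - worst, axis=1)
        return d_neg / (d_pos + d_neg)                     # closeness to ideal

    # Hypothetical scores for the three techniques on four criteria
    scores = np.array([[7.0, 6.0, 8.0, 5.0],    # Tripod beta
                       [5.0, 7.0, 6.0, 6.0],    # CBA
                       [4.0, 5.0, 5.0, 7.0]])   # AABF
    weights = np.array([0.4, 0.3, 0.2, 0.1])
    benefit = np.array([True, True, True, False])
    print(topsis(scores, weights, benefit))     # higher closeness = better rank
    ```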

  13. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments
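
    A minimal sketch of the multi-objective scoring that such structured elicitations support: a weighted multi-attribute utility over elicited objectives. All names, weights and scores below are hypothetical illustrations, not actual Hanford data:

    ```python
    # Hypothetical elicited weights over objectives and expert-judged
    # utility scores (0-1) for candidate S&T needs; the weighted utility
    # then ranks the needs. Illustrative sketch only.
    weights = {"risk_reduction": 0.5, "cost_saving": 0.3, "schedule": 0.2}

    needs = {
        "tank waste characterization": {"risk_reduction": 0.9, "cost_saving": 0.6, "schedule": 0.4},
        "groundwater modeling":        {"risk_reduction": 0.6, "cost_saving": 0.4, "schedule": 0.8},
        "in-situ vitrification":       {"risk_reduction": 0.7, "cost_saving": 0.8, "schedule": 0.3},
    }

    def utility(scores):
        return sum(weights[k] * scores[k] for k in weights)

    for need, s in sorted(needs.items(), key=lambda kv: -utility(kv[1])):
        print(f"{utility(s):.2f}  {need}")
    ```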

  14. Use of fuzzy techniques for analysis of dynamic loads in power systems

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Presents the use of fuzzy techniques for the analysis of the dynamic load characteristics of power systems to identify the voltage stability (collapse) of a weak bus, and concludes from the consistent results obtained that this is a useful tool for the analysis of the load characteristics of sophisticated power systems and their components.

  15. Accelerator based techniques for aerosol analysis

    International Nuclear Information System (INIS)

    At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy) an external beam facility is fully dedicated to PIXE-PIGE measurements of elemental composition of atmospheric aerosols. Examples regarding recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be evidenced how PIXE can provide unique information in aerosol studies or can play a complementary role to traditional chemical analysis. Finally a short presentation of 14C analysis of the atmospheric aerosol by Accelerator Mass Spectrometry (AMS) for the evaluation of the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity) will be given. (author)

  16. DNA ANALYSIS OF RICIN USING RAPD TECHNIQUE

    OpenAIRE

    Martin Vivodík; Želmíra Balážová; Zdenka Gálová

    2014-01-01

    Castor (Ricinus communis L.) is an important plant for production of industrial oil. The systematic evaluation of the molecular diversity encompassed in castor inbreds or parental lines offers an efficient means of exploiting the heterosis in castor as well as for management of biodiversity. The aim of this work was to detect genetic variability among the set of 30 castor genotypes using 5 RAPD markers. Amplification of genomic DNA of 30 genotypes, using RAPD analysis, yielded 35 fragments, w...

  17. ANALYSIS AND COMPARATIVE STUDY OF SEARCHING TECHNIQUES

    OpenAIRE

    Yuvraj Singh Chandrawat*

    2015-01-01

    We live in the age of technology, and it is quite obvious that it is advancing endlessly day by day. In this technical era researchers are focusing on the development of existing technologies. Software engineering is the dominant branch of Computer Science that deals with the development and analysis of software. The objective of this study is to analyze and compare the existing searching algorithms (linear search and binary search). In this paper, we will discuss both these...
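
    For reference, minimal implementations of the two algorithms the study compares:

    ```python
    from typing import Sequence

    def linear_search(a: Sequence[int], target: int) -> int:
        """O(n): scan every element; works on unsorted data."""
        for i, x in enumerate(a):
            if x == target:
                return i
        return -1

    def binary_search(a: Sequence[int], target: int) -> int:
        """O(log n): repeatedly halve the range; requires sorted data."""
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == target:
                return mid
            if a[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    data = [2, 3, 5, 7, 11, 13, 17]
    assert linear_search(data, 11) == binary_search(data, 11) == 4
    ```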

  18. Identifying Colluvial Slopes by Airborne LiDAR Analysis

    Science.gov (United States)

    Kasai, M.; Marutani, T.; Yoshida, H.

    2015-12-01

    Colluvial slopes are one of the major sources of landslides. Identifying the locations of these slopes helps reduce the risk of disasters, by avoiding the construction of infrastructure and properties nearby or, if they are already there, by applying appropriate countermeasures before a slope suddenly moves. In this study, airborne LiDAR data were analyzed to find geomorphic characteristics that could be used to extract the locations of such slopes. The study site was set in the suburbs of Sapporo City, Hokkaido, Japan. The area is underlain by andesite and tuff and is prone to landslides. Slope angle and surface roughness were calculated from a 5 m resolution DEM. These filters were chosen because colluvial materials deposit at around the angle of repose, and the accumulation of loose materials was considered to form a peculiar surface texture differentiable from other slope types. A field survey conducted alongside suggested that colluvial slopes could be identified by the filters with a probability of 80 percent. Repeat LiDAR monitoring of the site by an unmanned helicopter indicated that the slopes detected as colluvium appeared to be moving at a slow rate. In comparison with a similar study from the crushed zone in Japan, the range of slope angles indicative of colluvium agreed with the Sapporo site, while the texture was rougher due to the larger debris composing the slopes.

  19. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication

  20. Asaia bogorensis peritonitis identified by 16S ribosomal RNA sequence analysis in a patient receiving peritoneal dialysis.

    Science.gov (United States)

    Snyder, Richard W; Ruhe, Jorg; Kobrin, Sidney; Wasserstein, Alan; Doline, Christa; Nachamkin, Irving; Lipschutz, Joshua H

    2004-08-01

    Here the authors report a case of refractory peritonitis leading to multiple hospitalizations and the loss of peritoneal dialysis access in a patient on automated peritoneal dialysis, caused by Asaia bogorensis, a bacterium not previously described as a human pathogen. This organism was identified by sequence analysis of the 16S ribosomal RNA gene. Unusual microbial agents may cause peritonitis, and molecular microbiological techniques are important tools for identifying these agents.

  1. Temperature-based Instanton Analysis: Identifying Vulnerability in Transmission Networks

    Energy Technology Data Exchange (ETDEWEB)

    Kersulis, Jonas [Univ. of Michigan, Ann Arbor, MI (United States); Hiskens, Ian [Univ. of Michigan, Ann Arbor, MI (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bienstock, Daniel [Columbia Univ., New York, NY (United States)

    2015-04-08

    A time-coupled instanton method for characterizing transmission network vulnerability to wind generation fluctuation is presented. To extend prior instanton work to multiple-time-step analysis, line constraints are specified in terms of temperature rather than current. An optimization formulation is developed to express the minimum wind forecast deviation such that at least one line is driven to its thermal limit. Results are shown for an IEEE RTS-96 system with several wind-farms.

  2. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  3. Gene expression analysis identifies global gene dosage sensitivity in cancer

    DEFF Research Database (Denmark)

    Fehrmann, Rudolf S. N.; Karjalainen, Juha M.; Krajewska, Malgorzata;

    2015-01-01

    Many cancer-associated somatic copy number alterations (SCNAs) are known. Currently, one of the challenges is to identify the molecular downstream effects of these variants. Although several SCNAs are known to change gene expression levels, it is not clear whether each individual SCNA affects gene expression. We reanalyzed 77,840 expression profiles and observed a limited set of 'transcriptional components' that describe well-known biology, explain the vast majority of variation in gene expression and enable us to predict the biological function of genes. On correcting expression profiles for these components, we observed that the residual expression levels (in 'functional genomic mRNA' profiling) correlated strongly with copy number. DNA copy number correlated positively with expression levels for 99% of all abundantly expressed human genes, indicating global gene dosage sensitivity. By applying...

  4. Predicting missing links and identifying spurious links via likelihood analysis

    Science.gov (United States)

    Pan, Liming; Zhou, Tao; Lü, Linyuan; Hu, Chin-Kun

    2016-03-01

    Real network data is often incomplete and noisy, where link prediction algorithms and spurious link identification algorithms can be applied. Thus far, a general method to transform network organizing mechanisms into link prediction algorithms has been lacking. Here we use an algorithmic framework where a network's probability is calculated according to a predefined structural Hamiltonian that takes into account the network organizing principles, and a non-observed link is scored by the conditional probability of adding the link to the observed network. Extensive numerical simulations show that the proposed algorithm has remarkably higher accuracy than the state-of-the-art methods in uncovering missing links and identifying spurious links in many complex biological and social networks. Such a method also finds applications in exploring the underlying network evolutionary mechanisms.
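
    The Hamiltonian-based likelihood scoring itself is not reproduced here; as a minimal illustration of scoring non-observed links, a common-neighbour count (a standard baseline that likelihood methods are compared against) can be sketched as:

    ```python
    from itertools import combinations

    def common_neighbor_scores(adj):
        """adj: {node: set(neighbors)}. Score every non-observed pair by
        the number of shared neighbors (higher = more likely missing link)."""
        scores = {}
        for u, v in combinations(sorted(adj), 2):
            if v not in adj[u]:                       # only non-observed links
                scores[(u, v)] = len(adj[u] & adj[v])
        return sorted(scores.items(), key=lambda kv: -kv[1])

    # Tiny toy graph: every pair is linked except (a, d)
    g = {"a": {"b", "c"}, "b": {"a", "c", "d"},
         "c": {"a", "b", "d"}, "d": {"b", "c"}}
    print(common_neighbor_scores(g))   # ('a', 'd') shares b and c -> top score
    ```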

  5. Book Review: Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects

    Directory of Open Access Journals (Sweden)

    Thomas Nash

    2013-06-01

    Full Text Available Shavers, B. (2013). Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Waltham, MA: Elsevier, 290 pages, ISBN-978-1-59749-985-9, US$51.56. Includes bibliographical references and index. Reviewed by Detective Corporal Thomas Nash (tnash@bpdvt.org), Burlington Vermont Police Department, Internet Crime against Children Task Force. Adjunct Instructor, Champlain College, Burlington VT. In this must read for any aspiring novice cybercrime investigator as well as the seasoned professional computer guru alike, Brett Shavers takes the reader into the ever changing and dynamic world of cybercrime investigation. Shavers, an experienced criminal investigator, lays out the details and intricacies of a computer-related crime investigation in a clear and concise manner in his new, easy-to-read publication, Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Shavers takes the reader from start to finish through each step of the investigative process in well-organized and easy-to-follow sections, with real case file examples, to reach the ultimate goal of any investigation: identifying the suspect and proving their guilt in the crime. Do not be fooled by the title. This excellent, easily accessible reference is beneficial to both criminal as well as civil investigations and should be in every investigator's library regardless of their respective criminal or civil investigative responsibilities. (see PDF for full review)

  6. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FI) is briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damage in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides, are given. (orig.)

  7. Messina: a novel analysis tool to identify biologically relevant molecules in disease.

    Directory of Open Access Journals (Sweden)

    Mark Pinese

    Full Text Available BACKGROUND: Morphologically similar cancers display heterogeneous patterns of molecular aberrations and follow substantially different clinical courses. This diversity has become the basis for the definition of molecular phenotypes, with significant implications for therapy. Microarray or proteomic expression profiling is conventionally employed to identify disease-associated genes, however, traditional approaches for the analysis of profiling experiments may miss molecular aberrations which define biologically relevant subtypes. METHODOLOGY/PRINCIPAL FINDINGS: Here we present Messina, a method that can identify those genes that only sometimes show aberrant expression in cancer. We demonstrate with simulated data that Messina is highly sensitive and specific when used to identify genes which are aberrantly expressed in only a proportion of cancers, and compare Messina to contemporary analysis techniques. We illustrate Messina by using it to detect the aberrant expression of a gene that may play an important role in pancreatic cancer. CONCLUSIONS/SIGNIFICANCE: Messina allows the detection of genes with profiles typical of markers of molecular subtype, and complements existing methods to assist the identification of such markers. Messina is applicable to any global expression profiling data, and to allow its easy application has been packaged into a freely-available stand-alone software package.

  8. Nuclear fuel lattice performance analysis by data mining techniques

    International Nuclear Information System (INIS)

    Highlights: • This paper shows a data mining application to analyse nuclear fuel lattice designs. • Data mining methods were used to predict whether fuel lattices could operate adequately in the BWR reactor core. • Data mining methods learned from fuel lattice datasets simulated with SIMULATE-3. • Results show high recognition percentages of adequate or inadequate fuel lattice performance. - Abstract: In this paper a data mining analysis for BWR nuclear fuel lattice performance is shown. In a typical three-dimensional simulation, the reactor operation simulator gives the core performance for a fuel lattice configuration measured by thermal limits, shutdown margin and produced energy. Based on these results we can determine the number of fulfilled parameters of a fuel lattice configuration. It is interesting to establish a relationship between the fuel lattice properties and the number of fulfilled core parameters in steady-state reactor operation, so data mining techniques were used for this purpose. Results indicate that these techniques are able to predict with sufficient accuracy (greater than 75%) whether a given fuel lattice configuration will have either “good” or “bad” performance according to the reactor core simulation. In this way, they could be coupled with an optimization process to discard fuel lattice configurations with poor performance and thereby accelerate the optimization process. Data mining techniques apply filter methods to discard those variables with lower influence on the number of fulfilled core parameters. From this, it was also possible to identify a set of variables to be used in new optimization codes with different objective functions than those normally used.

  9. High-level power analysis and optimization techniques

    Science.gov (United States)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  10. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by an organization's application systems is vital for decision making. For this reason, the quality of data provided by a Data Warehouse (DW) is very important if an organization is to produce the best solutions to move forward. A DW is a complex system that has to deliver highly-aggregated, high-quality data from heterogeneous sources to decision makers, and it involves substantial integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the last decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques compare target values against the current values obtained from the systems. A prototype using PHP was developed to support the Base Analysis Technique, and a sample schema from an Oracle database was used to study the differences between applying the framework and not applying it. The prototype was demonstrated to selected organizations to identify whether it helps reduce DQ problems, and questionnaires were given to respondents. Results: The results show users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: The framework should be implemented in real situations to obtain more accurate results.
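
    The Base Analysis Technique described above compares target (expected) values against current values drawn from the system. A minimal sketch of such a comparison with hypothetical column rules; the original prototype was written in PHP, but Python is used here for illustration:

    ```python
    import re

    # Hypothetical target rules for a customer table: expected format per column
    rules = {
        "customer_id": lambda v: v.isdigit(),
        "email":       lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
        "join_year":   lambda v: v.isdigit() and 1990 <= int(v) <= 2011,
    }

    rows = [
        {"customer_id": "1001", "email": "a@b.com",  "join_year": "2005"},
        {"customer_id": "x17",  "email": "no-email", "join_year": "2035"},
    ]

    # Base analysis: count current values that deviate from the target values
    for col, ok in rules.items():
        bad = sum(1 for r in rows if not ok(r[col]))
        print(f"{col}: {bad}/{len(rows)} values violate the target rule")
    ```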

  11. Network stratification analysis for identifying function-specific network layers.

    Science.gov (United States)

    Zhang, Chuanchao; Wang, Jiguang; Zhang, Chao; Liu, Juan; Xu, Dong; Chen, Luonan

    2016-04-22

    A major challenge of systems biology is to capture the rewiring of biological functions (e.g. signaling pathways) in a molecular network. To address this problem, we proposed a novel computational framework, namely network stratification analysis (NetSA), to stratify the whole biological network into various function-specific network layers corresponding to particular functions (e.g. KEGG pathways), which transform the network analysis from the gene level to the functional level by integrating expression data, the gene/protein network and gene ontology information altogether. The application of NetSA in yeast and its comparison with a traditional network-partition both suggest that NetSA can more effectively reveal functional implications of network rewiring and extract significant phenotype-related biological processes. Furthermore, for time-series or stage-wise data, the function-specific network layer obtained by NetSA is also shown to be able to characterize the disease progression in a dynamic manner. In particular, when applying NetSA to hepatocellular carcinoma and type 1 diabetes, we can derive functional spectra regarding the progression of the disease, and capture active biological functions (i.e. active pathways) in different disease stages. The additional comparison between NetSA and SPIA illustrates again that NetSA could discover more complete biological functions during disease progression. Overall, NetSA provides a general framework to stratify a network into various layers of function-specific sub-networks, which can not only analyze a biological network on the functional level but also investigate gene rewiring patterns in biological processes. PMID:26879865

  12. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.;

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
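
    A minimal sketch of the CA and PCA steps on a control-strategy evaluation matrix, using random placeholder data in place of the BSM2 evaluation criteria:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # Placeholder evaluation matrix: 30 control strategies x 8 criteria
    X = rng.normal(size=(30, 8))

    Xs = StandardScaler().fit_transform(X)          # criteria on a common scale
    pcs = PCA(n_components=2).fit_transform(Xs)     # compress to 2 components
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)
    print(labels)                                   # groups of similar strategies
    ```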

  13. The use of nominal group technique in identifying community health priorities in Moshi rural district, northern Tanzania

    DEFF Research Database (Denmark)

    Makundi, E A; Manongi, R; Mushi, A K;

    2005-01-01

    This article highlights issues pertaining to the identification of community health priorities in a resource-poor setting. Community involvement is discussed by drawing on the experience of involving lay people in identifying priorities in health care through the use of the Nominal Group Technique. ... The patients/caregivers, women's group representatives, youth leaders, religious leaders and community leaders/elders constituted the principal subjects. Emphasis was on providing qualitative data, which are of vital consideration in multi-disciplinary oriented studies, and not on quantitative information. ... Items in the list imply that priorities should not only be focused on diseases, but should also include health services and socio-cultural issues. Indeed, methods which are easily understood and applied, and thus able to give results close to those provided by burden-of-disease approaches, should be adopted...

  14. Analysis and calibration techniques for superconducting resonators

    Science.gov (United States)

    Cataldo, Giuseppe; Wollack, Edward J.; Barrentine, Emily M.; Brown, Ari D.; Moseley, S. Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.
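
    The ABCD-matrix representation mentioned above is compact enough to sketch: cascading segment matrices for a line with characteristic impedance Z0 and propagation constant gamma, then converting to S21, reproduces the modeled transmission. These are generic microwave relations with assumed parameter values, not the authors' full resonator model:

    ```python
    import numpy as np

    def line_abcd(z0, gamma, length):
        """ABCD matrix of a uniform transmission-line segment."""
        gl = gamma * length
        return np.array([[np.cosh(gl),       z0 * np.sinh(gl)],
                         [np.sinh(gl) / z0,  np.cosh(gl)]])

    def s21(abcd, zref=50.0):
        """Convert a cascaded ABCD matrix to the S21 transmission parameter."""
        a, b = abcd[0]
        c, d = abcd[1]
        return 2.0 / (a + b / zref + c * zref + d)

    freq = 5e9                            # 5 GHz probe frequency (illustrative)
    beta = 2 * np.pi * freq / 1.2e8       # assumed phase velocity of 1.2e8 m/s
    gamma = 1e-3 * beta + 1j * beta       # small assumed loss
    total = line_abcd(50.0, gamma, 0.01) @ line_abcd(48.0, gamma, 0.005)
    print(abs(s21(total)))
    ```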

  15. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The effici...

  16. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system and to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination for the analysis on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; however, their disadvantages are completeness and complexity...

  17. Meconium microbiome analysis identifies bacteria correlated with premature birth.

    Directory of Open Access Journals (Sweden)

    Alexandria N Ardissone

    Full Text Available Preterm birth is the second leading cause of death in children under the age of five years worldwide, but the etiology of many cases remains enigmatic. The dogma that the fetus resides in a sterile environment is being challenged by recent findings, and the question has arisen whether microbes that colonize the fetus may be related to preterm birth. It has been posited that meconium reflects the in-utero microbial environment. In this study, correlations between fetal intestinal bacteria from meconium and gestational age were examined in order to suggest underlying mechanisms that may contribute to preterm birth. Meconium from 52 infants ranging in gestational age from 23 to 41 weeks was collected, the DNA extracted, and 16S rRNA analysis performed. Resulting taxa of microbes were correlated to clinical variables and also compared to previous studies of amniotic fluid and other human microbiome niches. Increased detection of bacterial 16S rRNA in meconium of infants of <33 weeks gestational age was observed. Approximately 61.1% of reads sequenced were classified to genera that have been reported in amniotic fluid. Gestational age had the largest influence on microbial community structure (R = 0.161; p = 0.029), while mode of delivery (C-section versus vaginal delivery) had an effect as well (R = 0.100; p = 0.044). Enterobacter, Enterococcus, Lactobacillus, Photorhabdus, and Tannerella were negatively correlated with gestational age and have been reported to incite inflammatory responses, suggesting a causative role in premature birth. This provides the first evidence to support the hypothesis that the fetal intestinal microbiome derived from swallowed amniotic fluid may be involved in the inflammatory response that leads to premature birth.

  18. Study of the aging processes in polyurethane adhesives using thermal treatment and differential calorimetric, dielectric, and mechanical techniques ; 1, identifying the aging processes ; 2, quantifying the aging effect

    CERN Document Server

    Althouse, L P

    1979-01-01

  19. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  20. Analysis of Parametric & Non Parametric Classifiers for Classification Technique using WEKA

    Directory of Open Access Journals (Sweden)

    Yugal kumar

    2012-07-01

    Full Text Available In the field of Machine Learning and Data Mining, much work has been done to construct new classification techniques/classifiers, and much research is ongoing to construct further new classifiers with the help of nature-inspired techniques such as Genetic Algorithms, Ant Colony Optimization, Bee Colony Optimization, Neural Networks, Particle Swarm Optimization, etc. Many researchers have provided comparative studies/analyses of classification techniques. This paper, however, deals with another form of analysis of classification techniques, i.e., analysis of parametric and non-parametric classifiers. The paper identifies parametric and non-parametric classifiers used in the classification process and provides a tree representation of these classifiers. For the analysis, four classifiers are used, of which two are parametric and the rest non-parametric in nature.
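
    The distinction the paper draws can be illustrated with a small experiment, sketched below in Python with scikit-learn as a stand-in for WEKA (which the study actually used): two parametric classifiers, which assume a fixed functional form with finitely many parameters, are compared against two non-parametric ones, whose effective complexity grows with the data. The dataset and classifier choices are assumptions for illustration, not those of the paper.

        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.tree import DecisionTreeClassifier

        X, y = load_breast_cancer(return_X_y=True)
        classifiers = {
            "Naive Bayes (parametric)": GaussianNB(),
            "Logistic regression (parametric)": LogisticRegression(max_iter=5000),
            "k-NN (non-parametric)": KNeighborsClassifier(),
            "Decision tree (non-parametric)": DecisionTreeClassifier(random_state=0),
        }
        for name, clf in classifiers.items():
            scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation
            print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")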

  1. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical technique...

  2. Confocal Raman data analysis enables identifying apoptosis of MCF-7 cells caused by anticancer drug paclitaxel

    Science.gov (United States)

    Salehi, Hamideh; Middendorp, Elodie; Panayotov, Ivan; Dutilleul, Pierre-Yves Collard; Vegh, Attila-Gergely; Ramakrishnan, Sathish; Gergely, Csilla; Cuisinier, Frederic

    2013-05-01

    Confocal Raman microscopy is a noninvasive, label-free imaging technique used to study apoptosis of live MCF-7 cells. The images are based on Raman spectra of cell components, and apoptosis is monitored through diffusion of cytochrome c in the cytoplasm. K-means clustering is used to identify mitochondria in cells, and correlation analysis provides the cytochrome c distribution inside the cells. Our results demonstrate that incubation of cells for 3 h with 10 μM paclitaxel does not induce apoptosis in MCF-7 cells. On the contrary, incubation for 30 min at a higher concentration (100 μM) of paclitaxel induces gradual release of cytochrome c into the cytoplasm, indicating cell apoptosis via a caspase-independent pathway.
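
    A minimal sketch of the two computational steps named above, on synthetic data: K-means clustering groups pixel spectra of a Raman map into putative compartments, and a per-pixel correlation against a reference spectrum (standing in for cytochrome c) maps its distribution. Everything here, from the spectra to the three-cluster choice, is an assumption for illustration.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        n_pixels, n_wavenumbers = 1000, 300
        base = rng.random((3, n_wavenumbers))        # three component spectra
        labels_true = rng.integers(0, 3, n_pixels)
        spectra = base[labels_true] + 0.05 * rng.standard_normal((n_pixels, n_wavenumbers))

        # Cluster pixel spectra into putative compartments (e.g. mitochondria)
        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(spectra)

        # Correlate every pixel spectrum with a reference ("cytochrome c") spectrum
        reference = base[0]
        corr = np.array([np.corrcoef(s, reference)[0, 1] for s in spectra])
        print("cluster sizes:", np.bincount(km.labels_), "max corr: %.2f" % corr.max())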

  3. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing the dis...
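
    The core computation of a fixed-effect meta-analysis is compact enough to show directly: each study's estimate of the common parameter is weighted by its inverse variance, which also yields the standard error of the pooled estimate and the (estimate, standard error) pairs a funnel plot displays. The numbers below are invented for illustration.

        import numpy as np

        estimates = np.array([0.42, 0.35, 0.58, 0.10, 0.47])  # per-study estimates
        se = np.array([0.10, 0.15, 0.20, 0.05, 0.12])         # their standard errors

        w = 1.0 / se ** 2                         # inverse-variance weights
        pooled = np.sum(w * estimates) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        print(f"pooled = {pooled:.3f}, 95% CI +/- {1.96 * pooled_se:.3f}")
        # A funnel plot would scatter `estimates` against `se` to eyeball
        # publication bias: asymmetry suggests missing small studies.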

  4. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by helping it adapt its strengths to opportunities, minimize risks, and eliminate weaknesses.

  5. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by helping it adapt its strengths to opportunities, minimize risks, and eliminate weaknesses.

  6. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems level.
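
    The within-class versus across-class similarity hypothesis can be made concrete with a toy computation, sketched below. Note that this is a simplified nearest-centroid illustration of the idea, not the published PoDA statistic, and the simulated 0/1/2 genotype matrices are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n_snps = 20
        cases = rng.binomial(2, 0.6, (50, n_snps))     # risk alleles enriched
        controls = rng.binomial(2, 0.4, (50, n_snps))  # pathway SNP genotypes

        def within_vs_across(group, other):
            # Leave-one-out distance to own-class centroid vs other-class centroid
            d_within = np.mean([np.linalg.norm(s - np.delete(group, i, 0).mean(0))
                                for i, s in enumerate(group)])
            d_across = np.mean([np.linalg.norm(s - other.mean(0)) for s in group])
            return d_within, d_across

        dw, da = within_vs_across(cases, controls)
        print(f"within-class {dw:.2f} vs across-class {da:.2f}; flagged: {dw < da}")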

  7. Clinical education and training: Using the nominal group technique in research with radiographers to identify factors affecting quality and capacity

    International Nuclear Information System (INIS)

    There are a number of group-based research techniques available to determine the views or perceptions of individuals in relation to specific topics. This paper reports on one method, the nominal group technique (NGT), which was used to collect the views of important stakeholders on the factors affecting the quality of, and capacity to provide, clinical education and training in diagnostic imaging and radiotherapy and oncology departments in the UK. Inclusion criteria were devised to recruit learners, educators, practitioners and service managers to the nominal groups. Eight regional groups comprising a total of 92 individuals were enrolled; the numbers in each group varied between 9 and 13. A total of 131 items (factors) were generated across the groups (mean = 16.4). Each group was then asked to select the top three factors from its original list. Consensus on the important factors among the groups found that all eight groups agreed on one item: staff attitude, motivation and commitment to learners. The 131 items were organised into themes using content analysis. Five main categories and a number of subcategories emerged. The study concluded that the NGT provided data congruent with the issues faced by practitioners and learners in their daily work; this is of vital importance if the findings are to be regarded as credible. Further advantages and limitations of the method are discussed; however, it is argued that the NGT is a useful technique for gathering relevant opinion, selecting priorities and reaching consensus on a wide range of issues

  8. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté, and its execution depends on a highly refined technique throughout the performer's rotation. Performing this element requires not only good physical condition but also correct technique on the part of the dancer. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The method of stereoscopic imaging and theoretical analysis were used for the analysis.

  9. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  10. Analysis of Maize Crop Leaf using Multivariate Image Analysis for Identifying Soil Deficiency

    Directory of Open Access Journals (Sweden)

    S. Sridevy

    2014-11-01

    Full Text Available Image processing analysis for soil deficiency identification has become an active area of research. Changes in the color of the leaves are used to analyze and identify deficiencies of soil nutrients such as Nitrogen (N), Phosphorus (P) and Potassium (K) by digital color image analysis. This research study focuses on image analysis of the maize crop leaf using multivariate image analysis. In the proposed approach, the input RGB image is first converted to HSV, because RGB is ideal for color generation but HSV is better suited to color perception. Then green pixels are masked and removed using a specific threshold value after applying histogram equalization; this masking is done through a customized filtering approach that exclusively filters out the green color of the leaf. After the filtering step, only the deficient part of the leaf is considered, and a histogram is generated for that part. Then, a Multivariate Image Analysis approach using Independent Component Analysis (ICA) is carried out to extract a reference eigenspace from a matrix built by unfolding color data from the deficient part. Test images are also unfolded and projected onto the reference eigenspace, and the result is a score matrix which is used to compute nutrient deficiency based on the T2 statistic. In addition, a multi-resolution scheme based on scaling down is used to speed up the process. Finally, based on the training samples, the soil deficiency is identified from the color of the maize crop leaf.
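
    The early pre-processing stages described above can be sketched compactly, below using OpenCV in Python as an assumed toolchain: convert RGB to HSV, mask the healthy green pixels with a hue threshold, and keep the remaining discolored region whose histogram would feed the ICA/T2 stage. The synthetic leaf image and the hue band are illustrative assumptions.

        import numpy as np
        import cv2

        # Synthetic 100x100 "leaf": mostly green, with a yellowish deficient patch
        img = np.zeros((100, 100, 3), np.uint8)
        img[:] = (40, 160, 40)                 # RGB green
        img[30:60, 30:60] = (200, 190, 60)     # RGB yellowish patch

        hsv = cv2.cvtColor(img, cv2.COLOR_RGB2HSV)   # OpenCV hue range is [0, 179]
        green = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))  # assumed green band
        not_green = cv2.bitwise_not(green)
        deficient = cv2.bitwise_and(img, img, mask=not_green)

        # Hue histogram of the non-green (deficient) pixels; this is the kind of
        # unfolded color data the ICA / T2 stage would consume.
        hist = cv2.calcHist([hsv], [0], not_green, [18], [0, 180])
        print("non-green pixels:", int(np.count_nonzero(not_green)), hist.ravel()[:6])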

  11. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 2. Robustness of Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.; Kleijnen, J.P.C.

    1999-03-24

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
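
    The graded sequence of tests is easy to reproduce on a single input/output scatterplot, as sketched below with SciPy: Pearson correlation for linear relationships, Spearman rank correlation for monotonic ones, Kruskal-Wallis across input bins for trends in central tendency, and a chi-square test on a binned grid for deviations from randomness. The synthetic data are an assumption for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        x = rng.uniform(0, 1, 500)                           # sampled input factor
        y = np.sin(3 * x) + 0.3 * rng.standard_normal(500)   # model output

        r, p_r = stats.pearsonr(x, y)          # (i) linear relationship
        rho, p_rho = stats.spearmanr(x, y)     # (ii) monotonic relationship

        # (iii) trend in central tendency: Kruskal-Wallis across input quintiles
        bins = np.digitize(x, np.quantile(x, [0.2, 0.4, 0.6, 0.8]))
        kw, p_kw = stats.kruskal(*[y[bins == b] for b in range(5)])

        # (v) deviation from randomness: chi-square on a 5x5 binned grid
        table, _, _ = np.histogram2d(x, y, bins=5)
        chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

        print(f"Pearson {r:.2f}, Spearman {rho:.2f}, "
              f"KW p={p_kw:.1e}, chi2 p={p_chi2:.1e}")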

  12. Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

    Directory of Open Access Journals (Sweden)

    Smita Nirkhi

    2013-06-01

    Full Text Available Authorship identification techniques are used to identify the most likely author of an online message from a group of potential suspects and to find evidence to support the conclusion. Cybercriminals misuse online communication, for instance to send blackmail or spam email, and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. It is a highly interdisciplinary area, drawing on machine learning, information retrieval, and natural language processing. In this paper, a study of recent techniques and automated approaches to attributing authorship of online messages is presented. The focus of this review is to summarize the existing authorship identification techniques used in the literature to identify the authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.
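
    A common stylometric baseline from this literature can be sketched in a few lines: character n-gram features, which capture style rather than topic, feeding a linear classifier. The tiny training corpus and the specific feature/classifier choices below are assumptions for illustration, not a method endorsed by the survey.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline

        messages = [
            "hey, send the cash or else!!", "u better pay up... last warning",
            "Dear Sir, kindly remit the outstanding amount.",
            "Kindly find attached the invoice.",
        ]
        authors = ["suspect_A", "suspect_A", "suspect_B", "suspect_B"]

        model = make_pipeline(
            TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),  # style, not topic
            LinearSVC(),
        )
        model.fit(messages, authors)
        print(model.predict(["kindly pay the amount, Sir"]))  # likely suspect_B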

  13. Microcalcifications versus artifacts: initial evaluation of a new ultrasound image processing technique to identify breast microcalcifications in a screening population.

    Science.gov (United States)

    Machado, Priscilla; Eisenbrey, John R; Cavanaugh, Barbara; Forsberg, Flemming

    2014-09-01

    A new commercial image processing technique (MicroPure, Toshiba America Medical Systems, Tustin, CA, USA) that identifies breast microcalcifications was evaluated at the time of patients' annual screening mammograms. Twenty women scheduled for annual screening mammography were enrolled in the study. Patients underwent bilateral outer-upper-quadrant real-time dual gray scale ultrasound and MicroPure imaging using an Aplio XG scanner (Toshiba). MicroPure combines non-linear imaging and speckle suppression to mark suspected calcifications as white spots in a blue overlay image. Four independent and blinded readers analyzed digital clips to determine the presence or absence of microcalcifications and artifacts. The presence of microcalcifications determined by readers was not significantly different from that of mammography (p = 0.57). However, the accuracy was low overall (52%) and also in younger women (<50 years, 54%). In conclusion, although microcalcifications can be identified using MicroPure imaging, this method is not currently appropriate for a screening population and should be used in more focused applications. PMID:25023105

  14. Methylation Linear Discriminant Analysis (MLDA) for identifying differentially methylated CpG islands

    Directory of Open Access Journals (Sweden)

    Vass J Keith

    2008-08-01

    Full Text Available Background: Hypermethylation of promoter CpG islands is strongly correlated to transcriptional gene silencing and epigenetic maintenance of the silenced state. As well as its role in tumor development, CpG island methylation contributes to the acquisition of resistance to chemotherapy. Differential Methylation Hybridisation (DMH) is one technique used for genome-wide DNA methylation analysis. The study of such microarray data sets should ideally account for the specific biological features of DNA methylation and the non-symmetrical distribution of the ratios of unmethylated and methylated sequences hybridised on the array. We have therefore developed a novel algorithm tailored to this type of data, Methylation Linear Discriminant Analysis (MLDA). Results: MLDA was programmed in R (version 2.7.0) and the package is available at CRAN. This approach utilizes linear regression models of non-normalised hybridisation data to define methylation status. Log-transformed signal intensities of unmethylated controls on the microarray are used as a reference. The signal intensities of DNA samples digested with methylation-sensitive restriction enzymes and mock-digested are then transformed to the likelihood of a locus being methylated using this reference. We tested the ability of MLDA to identify loci differentially methylated as analysed by DMH between cisplatin-sensitive and -resistant ovarian cancer cell lines. MLDA identified 115 differentially methylated loci, and 23 out of 26 of these loci have been independently validated by Methylation Specific PCR and/or bisulphite pyrosequencing. Conclusion: MLDA has advantages for analyzing methylation data from CpG island microarrays, since there is a clear rationale for the definition of methylation status, it uses DMH data without between-group normalisation and is less influenced by cross-hybridisation of loci. The MLDA algorithm successfully identified differentially methylated loci between two classes of...

  15. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on, and a brief review of, other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  16. Microarray Analysis Techniques Singular Value Decomposition and Principal Component Analysis

    CERN Document Server

    Wall, M E; Rocha, L M; Wall, Michael E.; Rechtsteiner, Andreas; Rocha, Luis M.

    2002-01-01

    This chapter describes gene expression analysis by Singular Value Decomposition (SVD), emphasizing initial characterization of the data. We describe SVD methods for visualization of gene expression data, representation of the data using a smaller number of variables, and detection of patterns in noisy gene expression data. In addition, we describe the precise relation between SVD analysis and Principal Component Analysis (PCA) when PCA is calculated using the covariance matrix, enabling our descriptions to apply equally well to either method. Our aim is to provide definitions, interpretations, examples, and references that will serve as resources for understanding and extending the application of SVD and PCA to gene expression analysis.
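
    The stated relation between SVD and covariance-based PCA can be verified numerically, as sketched below: after centering the data matrix, the squared singular values divided by (n - 1) equal the eigenvalues of the sample covariance matrix. The random genes-by-samples matrix is a placeholder assumption.

        import numpy as np

        rng = np.random.default_rng(5)
        X = rng.standard_normal((200, 10))   # 200 genes x 10 assays (placeholder)
        Xc = X - X.mean(axis=0)              # center each assay (column)

        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        cov = Xc.T @ Xc / (Xc.shape[0] - 1)  # sample covariance of the assays
        eigvals = np.linalg.eigvalsh(cov)    # ascending PCA eigenvalues

        # PCA eigenvalues equal squared singular values scaled by 1/(n - 1)
        print(np.allclose(np.sort(s ** 2 / (Xc.shape[0] - 1)), eigvals))  # True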

  17. Dynamic analysis of large structures by modal synthesis techniques.

    Science.gov (United States)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures which involve division into substructures or components are examined. These techniques make use of component displacement modes to synthesize global systems of generalized coordinates and, for that reason, they have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful: the modal synthesis method with fixed attachment modes, and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.

  18. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  19. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
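
    The dating equation itself is simple once the dose and dose-rate measurements are in hand: age equals the accumulated (equivalent) dose divided by the annual dose rate. The sketch below shows the arithmetic with first-order error propagation; all numbers are invented for illustration.

        # Hypothetical measurements, not data from the cited sites
        de, de_err = 25.0, 1.2      # equivalent dose, Gy
        dr, dr_err = 2.5, 0.15      # annual dose rate, Gy/ka (from elemental analyses)

        age = de / dr                                       # age in ka
        age_err = age * ((de_err / de) ** 2 + (dr_err / dr) ** 2) ** 0.5
        print(f"age = {age:.1f} +/- {age_err:.1f} ka")      # -> 10.0 +/- 0.8 ka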

  20. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting product and the mold at the same time, considering thermal contraction of the casting and thermal expansion of the mold. An analysis considering contact between the casting and the mold allows precise prediction of the stress distribution and of defects such as hot tearing. However, it is difficult to generate an FEM mesh for the interface of the casting and the mold, and the mesh for the mold domain consumes considerable computational time and memory owing to the large number of elements. Consequently, we propose the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to represent the contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting part, so the proposed technique greatly decreases the number of elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.

  1. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also studied are methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  2. Urine metabolomic analysis identifies potential biomarkers and pathogenic pathways in kidney cancer.

    Science.gov (United States)

    Kim, Kyoungmi; Taylor, Sandra L; Ganti, Sheila; Guo, Lining; Osier, Michael V; Weiss, Robert H

    2011-05-01

    Kidney cancer is the seventh most common cancer in the Western world, its incidence is increasing, and it is frequently metastatic at presentation, at which stage patient survival statistics are grim. In addition, there are no useful biofluid markers for this disease, such that diagnosis is dependent on imaging techniques that are not generally used for screening. In the present study, we use metabolomics techniques to identify metabolites in kidney cancer patients' urine, which appear at different levels (when normalized to account for urine volume and concentration) from the same metabolites in nonkidney cancer patients. We found that quinolinate, 4-hydroxybenzoate, and gentisate are differentially expressed at a false discovery rate of 0.26, and these metabolites are involved in common pathways of specific amino acid and energetic metabolism, consistent with high tumor protein breakdown and utilization, and the Warburg effect. When added to four different (three kidney cancer-derived and one "normal") cell lines, several of the significantly altered metabolites, quinolinate, α-ketoglutarate, and gentisate, showed increased or unchanged cell proliferation that was cell line-dependent. Further evaluation of the global metabolomics analysis, as well as confirmation of the specific potential biomarkers using a larger sample size, will lead to new avenues of kidney cancer diagnosis and therapy. PMID:21348635
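
    Declaring metabolites differentially expressed "at a false discovery rate" typically means a Benjamini-Hochberg-style selection over per-metabolite p-values; a sketch of that step is given below. The simulated data, the Welch t-tests, and the reuse of the study's 0.26 threshold are all assumptions for illustration, not the study's actual pipeline.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        m = 200                                   # metabolites
        cancer = rng.standard_normal((30, m))     # normalized urine levels
        control = rng.standard_normal((30, m))
        cancer[:, :5] += 1.0                      # five truly shifted metabolites

        p = stats.ttest_ind(cancer, control, equal_var=False).pvalue

        # Benjamini-Hochberg: largest k with p_(k) <= (k/m) * q sets the cutoff
        q = 0.26                                  # FDR level quoted in the study
        order = np.argsort(p)
        passed = np.nonzero(p[order] <= q * np.arange(1, m + 1) / m)[0]
        hits = order[:passed.max() + 1] if passed.size else np.array([], int)
        print("flagged metabolites:", np.sort(hits))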

  3. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 1: Review and Comparison of Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-03-24

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.

  4. Micro analysis of dissolved gases by the gas chromatography technique

    International Nuclear Information System (INIS)

    A technique which allows the quantitative analysis of small concentrations of dissolved gases, such as CO2 and H2, in the order of 10-6 - 10-3 M is discussed. For the extraction, separation and quantification, a Toepler pump was used, coupled in tandem to a gas chromatograph. This method can also be applied to the analysis of other gases such as CO, CH4, CH3-CH3, etc. The technique may be applied in fields such as radiation chemistry, oceanography and environmental studies. (author)

  5. Sample preparation techniques in trace element analysis of water

    Science.gov (United States)

    Nagj, Marina; Jakšić, M.; Orlić, I.; Valković, V.

    1985-06-01

    Sample preparation techniques for the analysis of water for trace elements using X-ray emission spectroscopy are described. Fresh water samples for the analysis of transition metals were prepared by complexation with ammonium-pyrrolidine-dithiocarbamate (APDC) and filtering through a membrane filter. Analyses of water samples for halogens were done on samples prepared by precipitation with AgNO3 and subsequent filtration. Two techniques for seawater preparation for uranium determination are described, viz. precipitation with APDC in the presence of iron (II) as a carrier, and complexation with APDC followed by adsorption on activated carbon. In all cases trace element levels at 10-3 μg/g were measured.

  6. Identifying and Prioritizing Effective Factors on Classifying A Private Bank Customers by Delphi Technique and Analytical Hierarchy Process (AHP)

    Directory of Open Access Journals (Sweden)

    S. Khayatmoghadam

    2013-05-01

    Full Text Available The development of the banking industry and the presence of different financial institutions have increased competition for attracting customers and their capital: there are about 28 banks and many credit and financial institutions, of which 6 banks are public and 22 are private. Among them, the public banks are in a more favourable position than the private banks, owing to their governmental relations and support, their geographical expansion, and their longer history. Lacking these advantages, private banks try to attract customers by drawing on scientific approaches. In this study, therefore, we review banking customers from a different viewpoint. Using experts and the Delphi technique, we first obtained ideal indicators from the banking viewpoint for the two categories of customers, resources and uses: indicators such as account turnover, average account balance, absence of returned cheques, etc., and, in the uses category, the amount of facilities received, the amount of guarantees provided, etc. Then, using the Analytic Hierarchy Process (AHP) and expert opinions in the Expert Choice 11 software, these criteria were prioritized and the weight of each index determined. It should be noted that the statistical population of bank experts associated with this study comprised both branch and head-office staff. The results obtained can be used as input for customer grouping in line with the implementation of CRM techniques.
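
    The AHP prioritization step reduces to an eigenvector computation: given a pairwise comparison matrix of expert judgments on Saaty's 1-9 scale, the normalized principal eigenvector gives the criterion weights, and a consistency ratio checks the judgments. The three-criterion matrix below is an invented example, not the study's data.

        import numpy as np

        # Pairwise judgments, e.g. account turnover vs balance vs returned cheques
        A = np.array([[1,   3,   5],
                      [1/3, 1,   2],
                      [1/5, 1/2, 1]], float)

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                            # priority weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
        cr = ci / 0.58                          # random index for n = 3 is 0.58
        print("weights:", w.round(3), "consistency ratio: %.3f" % cr)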

  7. Applications of Geophysical and Geological Techniques to Identify Areas for Detailed Exploration in Black Mesa Basin, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    George, S.; Reeves, T.K.; Sharma, Bijon; Szpakiewicz, M.

    1999-04-29

    A recent report submitted to the U.S. Department of Energy (DOE) (NIPER/BDM-0226) discussed in considerable detail the geology, structure, tectonics, and history of oil production activities in the Black Mesa basin in Arizona. As part of the final phase of wrapping up research in the Black Mesa basin, the results of a few additional geophysical studies of structure, stratigraphy, petrophysical analysis, and oil and gas occurrences in the basin are presented here. A second objective of this study is to determine the effectiveness of relatively inexpensive, noninvasive techniques, such as gravity and magnetic surveys, in obtaining information on structure and tectonics in sufficient detail for hydrocarbon exploration, particularly by using the higher-resolution satellite data now becoming available to the industry.

  8. Comparative analysis of methods for identifying recurrent copy number alterations in cancer.

    Directory of Open Access Journals (Sweden)

    Xiguo Yuan

    Full Text Available Recurrent copy number alterations (CNAs) play an important role in cancer genesis. While a number of computational methods have been proposed for identifying such CNAs, their relative merits remain largely unknown in practice, since very few efforts have been focused on comparative analysis of the methods. To facilitate studies of recurrent CNA identification in the cancer genome, it is imperative to conduct a comprehensive comparison of performance and limitations among existing methods. In this paper, six representative methods proposed in the latest six years are compared. These include one-stage and two-stage approaches, working with raw intensity ratio data and discretized data respectively. They are based on various techniques such as kernel regression, correlation matrix diagonal segmentation, semi-parametric permutation and cyclic permutation schemes. We explore multiple criteria, including type I error rate, detection power, Receiver Operating Characteristic (ROC) curve and the area under the curve (AUC), and computational complexity, to evaluate performance of the methods under multiple simulation scenarios. We also characterize their abilities on applications to two real datasets obtained from cancers with lung adenocarcinoma and glioblastoma. This comparison study reveals general characteristics of the existing methods for identifying recurrent CNAs, and further provides new insights into their strengths and weaknesses. It is expected to help accelerate the development of novel and improved methods.

  9. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  10. Dietary separation of sympatric carnivores identified by molecular analysis of scats.

    Science.gov (United States)

    Farrell, L E; Roman, J; Sunquist, M E

    2000-10-01

    We studied the diets of four sympatric carnivores in the flooding savannas of western Venezuela by analysing predator DNA and prey remains in faeces. DNA was isolated and a portion of the cytochrome b gene of the mitochondrial genome amplified and sequenced from 20 of 34 scats. Species were diagnosed by comparing the resulting sequences to reference sequences generated from the blood of puma (Puma concolor), jaguar (Panthera onca), ocelot (Leopardus pardalus) and crab-eating fox (Cerdocyon thous). Scat size has previously been used to identify predators, but DNA data show that puma and jaguar scats overlap in size, as do those of puma, ocelot and fox. Prey-content analysis suggests minimal prey partitioning between pumas and jaguars. In field testing this technique for large carnivores, two potential limitations emerged: locating intact faecal samples and recovering DNA sequences from samples obtained in the wet season. Nonetheless, this study illustrates the tremendous potential of DNA faecal studies. The presence of domestic dog (Canis familiaris) in one puma scat and of wild pig (Sus scrofa), set as bait, in one jaguar sample exemplifies the forensic possibilities of this noninvasive analysis. In addition to defining the dietary habits of similar size sympatric mammals, DNA identifications from faeces allow wildlife managers to detect the presence of endangered taxa and manage prey for their conservation. PMID:11050553

  11. Review of geographic processing techniques applicable to regional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, R.C.

    1988-02-01

    Since the early 1970s regional environmental studies have been carried out at the Oak Ridge National Laboratory using computer-assisted techniques. This paper presents an overview of some of these past experiences and the capabilities developed at the Laboratory for processing, analyzing, and displaying geographic data. A variety of technologies have resulted such as computer cartography, image processing, spatial modeling, computer graphics, data base management, and geographic information systems. These tools have been used in a wide range of spatial applications involving facility siting, transportation routing, coal resource analysis, environmental impacts, terrain modeling, inventory development, demographic studies, water resource analyses, etc. The report discusses a number of topics dealing with geographic data bases and structures, software and processing techniques, hardware systems, models and analysis tools, data acquisition techniques, and graphical display methods. Numerous results from many different applications are shown to aid the reader interested in using geographic information systems for environmental analyses. 15 refs., 64 figs., 2 tabs.

  12. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  13. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  14. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.
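
    The radial density measurement described above can be sketched as follows: segment the nucleus, use a distance transform to express each detected bright feature's position on a normalized scale from the nuclear perimeter (0) to the center (1), and histogram the result. The synthetic nucleus, the assumed feature coordinates, and the bin count are illustrative assumptions.

        import numpy as np
        from scipy import ndimage

        # Synthetic DAPI-like nucleus: a filled disk in a 200x200 image
        yy, xx = np.mgrid[:200, :200]
        nucleus = (yy - 100) ** 2 + (xx - 100) ** 2 < 80 ** 2

        # Distance from the perimeter, normalized so 0 = edge and 1 = center
        dist = ndimage.distance_transform_edt(nucleus)
        dist /= dist.max()

        # Pretend bright NuMA features were detected at these pixel coordinates
        rng = np.random.default_rng(7)
        feats = rng.integers(30, 170, size=(50, 2))
        feats = feats[nucleus[feats[:, 0], feats[:, 1]]]   # keep features inside

        radial = dist[feats[:, 0], feats[:, 1]]
        density, edges = np.histogram(radial, bins=5, range=(0, 1), density=True)
        print("normalized radial density (edge -> center):", density.round(2))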

  15. Neutron noise analysis techniques in nuclear power reactors

    International Nuclear Information System (INIS)

    The main techniques used in neutron noise analysis of BWR and PWR nuclear reactors are reviewed. Several applications, such as control of vibrations in both reactor types, determination of two-phase flow parameters in BWRs, and stability control in BWRs, are discussed in some detail. The paper contains many experimental results obtained by the main author. (author)

  16. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures, sponsored by the Danish Technical Research Council. The planned...

  17. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    Science.gov (United States)

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (pUC19 plasmid from E. coli). DN...

  18. Differential analysis of ovarian and endometrial cancers identifies a methylator phenotype.

    Directory of Open Access Journals (Sweden)

    Diana L Kolbe

    Full Text Available Despite improved outcomes in the past 30 years, less than half of all women diagnosed with epithelial ovarian cancer live five years beyond their diagnosis. Although typically treated as a single disease, epithelial ovarian cancer includes several distinct histological subtypes, such as papillary serous and endometrioid carcinomas. To address whether the morphological differences seen in these carcinomas represent distinct characteristics at the molecular level we analyzed DNA methylation patterns in 11 papillary serous tumors, 9 endometrioid ovarian tumors, 4 normal fallopian tube samples and 6 normal endometrial tissues, plus 8 normal fallopian tube and 4 serous samples from TCGA. For comparison within the endometrioid subtype we added 6 primary uterine endometrioid tumors and 5 endometrioid metastases from uterus to ovary. Data was obtained from 27,578 CpG dinucleotides occurring in or near promoter regions of 14,495 genes. We identified 36 locations with significant increases or decreases in methylation in comparisons of serous tumors and normal fallopian tube samples. Moreover, unsupervised clustering techniques applied to all samples showed three major profiles comprising mostly normal samples, serous tumors, and endometrioid tumors including ovarian, uterine and metastatic origins. The clustering analysis identified 60 differentially methylated sites between the serous group and the normal group. An unrelated set of 25 serous tumors validated the reproducibility of the methylation patterns. In contrast, >1,000 genes were differentially methylated between endometrioid tumors and normal samples. This finding is consistent with a generalized regulatory disruption caused by a methylator phenotype. Through DNA methylation analyses we have identified genes with known roles in ovarian carcinoma etiology, whereas pathway analyses provided biological insight to the role of novel genes. Our finding of differences between serous and endometrioid

  19. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production.

    Science.gov (United States)

    Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás

    2015-01-01

    Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. Thus, a low-cost, non-destructive system was developed, consisting of a stainless steel double-needle electrode connected to electronic equipment that allows the implementation of EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Next, statistical data treatment enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their respective combinations. The obtained prediction models are robust and reliable, and they are considered statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, and in-situ alternative to the traditional laboratory methods for enzymatic hydrolysis monitoring. PMID:26378537

  20. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production

    Directory of Open Access Journals (Sweden)

    Claudia Conesa

    2015-09-01

    Full Text Available Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. Thus, a low-cost, non-destructive system was developed, consisting of a stainless steel double-needle electrode connected to electronic equipment that allows the implementation of EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Next, statistical data treatment enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their respective combinations. The obtained prediction models are robust and reliable, and they are considered statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, and in-situ alternative to the traditional laboratory methods for enzymatic hydrolysis monitoring.

  2. VIBRATION ANALYSIS ON A COMPOSITE BEAM TO IDENTIFY DAMAGE AND DAMAGE SEVERITY USING FINITE ELEMENT METHOD

    Directory of Open Access Journals (Sweden)

    E.V.V.Ramanamurthy

    2011-07-01

    Full Text Available The objective of this paper is to develop a damage detection method for a composite cantilever beam with an edge crack, studied using the finite element method. A number of analytical, numerical and experimental techniques are available for the study of damage identification in beams. Three different types of analysis were carried out on a composite cantilever beam with an edge crack as damage. The material used in this analysis is a glass-epoxy composite. The finite element formulation was carried out in the analysis section of the package ANSYS. The types of vibration analysis studied on the composite beam are modal, harmonic and transient analysis. The crack is modeled such that the cantilever beam is replaced with two intact beams, with the crack as an additional boundary condition. Damage algorithms are used to identify and locate the damage, and the damage index method is used to find the severity of the damage. The results obtained from modal analysis were compared with the transient analysis results. Vibration-based damage detection methods rest on the fact that changes in physical properties (stiffness, mass and damping) due to damage manifest themselves as changes in the structural modal parameters (natural frequencies, mode shapes and modal damping). The task is then to monitor selected indicators derived from the modal parameters to distinguish between undamaged and damaged states. However, the quantitative changes of global modal parameters are not sufficiently sensitive to local damage. The proposed approach, on the other hand, interprets the dynamic changes caused by damage in a different way. Although the basis for vibration-based damage detection appears intuitive, implementation in real structures may encounter many significant challenges. The most fundamental issue is the fact that damage typically is a local phenomenon and may not dramatically influence the global dynamic response of a...

  3. Developing techniques for cause-responsibility analysis of occupational accidents.

    Science.gov (United States)

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and role of the groups involved in work-related accidents. This study develops an occupational accidents cause tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel to determine the responsible groups and their rates of responsibility. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining detailed lists of tasks, responsibilities, and their rates, and are therefore helpful for preventing work-related accidents by focusing on the responsible groups' duties.

  4. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system, which uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the detector readings; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analyzing the TLD response involving dose values in this interval were investigated and compared. These techniques include thermal pre-treatment, and different methods of glow curve analysis were also investigated. The results showed the need to develop specific software that permits automatic background subtraction from the glow curve of each dosemeter. This software was developed and is being tested. Preliminary results show that the software increases the response reproducibility. (author)

  5. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    Science.gov (United States)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. The analysis of these phosphorylation cascades will provide new insights into their physiological roles in many biological processes. Unfortunately, existing methods are limited in their ability to analyze cascade activity. We therefore propose a microfluidic isoelectric focusing technique (μIEF) for the analysis of cascade activity. Using this technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, an inhibition assay for kinase activity and the analysis of a real sample have also been conducted. The results indicate that μIEF is an excellent means for studying phosphorylation cascade activity.

  6. Nondestructive analysis of oil shales with PGNAA technique

    International Nuclear Information System (INIS)

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by an LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show generally good linear proportionality of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used, where the response curve was not linear but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improved neutron self-moderation in sample boxes of rich oil shales, as compared to the poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by the microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed with progressively larger amounts of water.

  7. Nondestructive analysis of oil shales with PGNAA technique

    Energy Technology Data Exchange (ETDEWEB)

    Maly, J.; Bozorgmanesh, H.

    1984-02-01

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by an LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show generally good linear proportionality of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used, where the response curve was not linear but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improved neutron self-moderation in sample boxes of rich oil shales, as compared to the poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by the microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed with progressively larger amounts of water.
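
    The response-curve step reported in both records can be sketched as a simple linear calibration: fit peak area against the chemically determined weight percent and invert the fit for an unknown sample. All numbers below are invented, and the sketch assumes the linear response the abstracts report for most elements (hydrogen being the noted exception).

        # Hedged sketch of a PGNAA response curve: linear fit of gamma-line
        # peak area vs. elemental weight percent, then inversion for an unknown.
        import numpy as np

        wt_percent = np.array([0.5, 1.0, 1.8, 2.6, 3.4])             # chemical analysis
        peak_area = np.array([210.0, 405.0, 760.0, 1080.0, 1420.0])  # counts (invented)

        slope, intercept = np.polyfit(wt_percent, peak_area, deg=1)
        unknown_area = 900.0
        print("estimated wt%:", (unknown_area - intercept) / slope)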

  8. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques for conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as means, t-tests, ANOVA and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  9. A Review on Clustering and Outlier Analysis Techniques in Datamining

    Directory of Open Access Journals (Sweden)

    S. Koteeswaran

    2012-01-01

    Full Text Available Problem statement: The modern world relies on using physical, biological and social systems more effectively through advanced computerized techniques. A great amount of data is generated by such systems, leading to a paradigm shift from classical modeling and analysis based on first principles to developing models and the corresponding analyses directly from data. The ability to extract useful hidden knowledge from these data and to act on that knowledge is becoming increasingly important in today's competitive world. Approach: The entire process of applying a computer-based methodology, including new techniques, for discovering knowledge from data is called data mining. There are two primary goals in data mining: prediction and classification. The large data volumes involved in data mining require clustering and outlier analysis for reducing the data as well as collecting only useful data sets. Results: This study reviews implementation techniques and recent research on clustering and outlier analysis. Conclusion: The study aims to provide a review of clustering and outlier analysis techniques, and the discussion should guide researchers in improving their research direction.
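
    As a concrete illustration of the two activities the review covers, the sketch below clusters synthetic points and flags outliers in a single pass with DBSCAN; it is a generic example, not a method drawn from the reviewed papers.

        # Minimal clustering-plus-outlier sketch: DBSCAN labels points in no
        # cluster as -1, which we treat as outliers.
        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(1)
        blob_a = rng.normal(loc=(0, 0), scale=0.3, size=(50, 2))
        blob_b = rng.normal(loc=(4, 4), scale=0.3, size=(50, 2))
        noise = rng.uniform(-2, 6, size=(5, 2))      # scattered outliers
        X = np.vstack([blob_a, blob_b, noise])

        labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
        print("clusters found:", sorted(set(labels) - {-1}))
        print("outliers flagged:", int(np.sum(labels == -1)))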

  10. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structural analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  11. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor and removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle induced x-ray emission and synchrotron produced x-ray fluorescence, are also briefly discussed.

  12. Neural network technique for identifying prognostic anomalies from low-frequency electromagnetic signals in the Kuril-Kamchatka region

    Science.gov (United States)

    Popova, I.; Rozhnoi, A.; Solovieva, M.; Levin, B.; Chebrov, V.

    2016-03-01

    In this paper, we suggest a technique for forecasting seismic events based on very low frequency and low frequency (VLF/LF) signals in the 10 to 50 kHz band using the neural network approach, specifically, the error back-propagation method (EBPM). In this method, the solution of the problem has two main stages: training and recognition (forecasting). The training set is constructed from combined data, including the amplitudes and phases of the VLF/LF signals measured in the monitoring of the Kuril-Kamchatka region and the corresponding parameters of regional seismicity. Training the neural network establishes the internal relationship between the characteristic changes in the VLF/LF signals a few days before a seismic event and the corresponding level of seismicity. The trained neural network is then applied in a prognostic mode for automated detection of the anomalous changes in the signal that are associated with seismic activity exceeding the assumed threshold level. Using several time intervals in 2004, 2005, 2006, and 2007 as examples, we demonstrate the efficiency of the neural network approach in the short-term forecasting of earthquakes with magnitudes M ≥ 5.5 from the nighttime variations in the amplitudes and phases of the LF signals on one radio path. We also discuss the results of the simultaneous analysis of the VLF/LF data measured on two partially overlapping paths, aimed at revealing correlations between the nighttime variations in the amplitude of the signal and seismic activity.
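
    The training/recognition split described above can be pictured with a minimal back-propagation sketch. Everything below is synthetic: the two features standing in for nighttime amplitude and phase anomalies, the labelling rule, and scikit-learn's MLPClassifier in place of the authors' EBPM network.

        # Hypothetical illustration of the EBPM stages: a small back-propagation
        # network mapping VLF/LF signal features to "seismicity above threshold".
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(2)
        X = rng.normal(size=(300, 2))                    # amplitude/phase anomalies
        y = (X[:, 0] - 0.5 * X[:, 1] > 0.8).astype(int)  # synthetic stand-in rule

        net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
        net.fit(X[:250], y[:250])                        # training stage
        print("held-out accuracy:", net.score(X[250:], y[250:]))  # recognition stage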

  13. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Charlton, William S

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.

  14. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
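
    The Bayesian comparison step that NOVA performs can be sketched on a toy grid. The burnup grid, stand-in database ratios, and measurement below are all invented; the sketch shows only the mechanics of turning a measured isotopic ratio plus its uncertainty into a posterior over a fuel parameter.

        # Conceptual sketch: posterior over candidate burnups given one measured
        # noble gas isotopic ratio, assuming Gaussian errors and a flat prior.
        import numpy as np

        burnup_grid = np.linspace(5, 45, 9)            # GWd/tU candidates (invented)
        predicted_ratio = 0.02 + 0.001 * burnup_grid   # stand-in for database values

        measured, sigma = 0.045, 0.002                 # measured ratio +/- 1 sigma
        likelihood = np.exp(-0.5 * ((measured - predicted_ratio) / sigma) ** 2)
        posterior = likelihood / likelihood.sum()

        best = burnup_grid[np.argmax(posterior)]
        print(f"most probable burnup: {best:.1f} GWd/tU")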

  15. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  16. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of the models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost of dealing with high-order/many-degree-of-freedom models can be met by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: it compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order models...

  17. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified through searches of various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  18. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to detect the occurrence of any crack growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to these phenomena. With various filtering/thresholding techniques, it was found that the original signals were getting filtered out along with the noise. The wavelet transformation technique was found to be more appropriate for analysing AE signals under such situations. The wavelet transformation technique is used to de-noise the AE data, and the de-noised signal is classified to identify a signature based on the type of phenomenon. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
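
    A minimal wavelet de-noising sketch in the spirit of the paper is given below, assuming the PyWavelets package; the AE-like burst, wavelet choice (db4) and universal threshold are illustrative assumptions, not the authors' settings.

        # Sketch: soft-threshold the detail coefficients of a noisy burst and
        # reconstruct, keeping the approximation coefficients untouched.
        import numpy as np
        import pywt

        rng = np.random.default_rng(3)
        t = np.linspace(0, 1, 1024)
        burst = np.exp(-((t - 0.5) ** 2) / 0.001) * np.sin(2 * np.pi * 150 * t)
        noisy = burst + rng.normal(0, 0.2, t.size)

        coeffs = pywt.wavedec(noisy, "db4", level=5)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
        thresh = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, "db4")
        print("residual noise std:", np.std(denoised - burst))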

  19. RCAUSE – A ROOT CAUSE ANALYSIS MODEL TO IDENTIFY THE ROOT CAUSES OF SOFTWARE REENGINEERING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Er. Anand Rajavat

    2011-01-01

    Full Text Available Organizations that wish to modernize their legacy systems must adopt a financially viable evolution strategy to satisfy the needs of the modern business environment. There are various options available for modernizing a legacy system into a more contemporary system. Over the last few years, legacy system reengineering has emerged as a popular system modernization technique. Reengineering generally focuses on increasing the productivity and quality of the system. However, many of these efforts are often less than successful because they concentrate only on the symptoms of software reengineering risks without targeting the root causes of those risks. A subjective assessment (diagnosis) of software reengineering risks from the different domains of a legacy system is required to identify their root causes. The goal of this paper is to highlight the root causes of software reengineering risks. We propose a root cause analysis model, RCause, that classifies the root causes of software reengineering risks into three distinct but connected areas of interest: the system domain, the managerial domain and the technical domain.

  20. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  1. Analysis of questioning technique during classes in medical education

    Directory of Open Access Journals (Sweden)

    Cho Young

    2012-06-01

    Full Text Available Abstract Background Questioning is one of the essential techniques used by lecturers to make lectures more interactive and effective. This study surveyed the perception of questioning techniques by medical school faculty members and analyzed how questioning techniques are used in actual classes. Methods Data on the perception of questioning skills used during lectures were collected using a self-questionnaire for faculty members (N = 33) during the second semester of 2008. The questionnaire consisted of 18 items covering the awareness and characteristics of questioning skills. Recorded video tapes were used to observe the faculty members' questioning skills. Results Most faculty members regarded questioning during classes as important and expected positive outcomes in terms of the students' participation in class, concentration in class and understanding of the class contents. In the 99 classes analyzed, the median number of questions per class was 1 (range, 0-29). Among them, 40 classes (40.4%) did not use questioning techniques. The frequency of questioning per lecture was similar regardless of the faculty members' perception. On the other hand, faculty members perceived that their usual wait time after a question was approximately 10 seconds, compared to only 2.5 seconds measured from the video analysis. More lecture-experienced faculty members tended to ask more questions in class. Conclusions There were some discrepancies between the faculty members' perceptions and the reality of their questioning technique, even though they had positive opinions of the technique. Questioning skills during lectures need to be emphasized to faculty members.

  2. Analysis of dynamic conflicts by techniques of artificial intelligence

    OpenAIRE

    Shinar, Josef

    1989-01-01

    Dynamic conflicts exhibit differential game characteristics, and their analysis by any method which disregards this feature may be, by definition, futile. Unfortunately, realistic conflicts may have an intricate information structure and a complex hierarchy which do not fit the classical differential game formulation. Moreover, in many cases even well formulated differential games are not solvable. In recent years great progress has been made in artificial intelligence techniques, put in...

  3. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Full Text Available The calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-ray Fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.

  4. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282
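
    A toy version of automated population identification might look like the following: a Gaussian mixture fitted to two-marker events replaces manual gating. The events and the two-population setup are invented; real FlowCAP pipelines involve far more preprocessing.

        # Conceptual sketch: identify cell populations in 2-marker data by
        # fitting a Gaussian mixture instead of drawing manual gates.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        pop_a = rng.normal(loc=(2.0, 5.0), scale=0.4, size=(500, 2))
        pop_b = rng.normal(loc=(5.5, 2.5), scale=0.6, size=(300, 2))
        events = np.vstack([pop_a, pop_b])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(events)
        labels = gmm.predict(events)
        for k in range(2):
            print(f"population {k}: {int(np.sum(labels == k))} events")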

  5. Design and Testing Analysis of Requirement Prioritizations Technique

    Directory of Open Access Journals (Sweden)

    Dinesh Singh

    2015-11-01

    Full Text Available With the growing need for software in our day-to-day life, the complexity of software is increasing as well, along with the number of requirements associated with modern software projects. To cope with the increasing demands and the pressure on software engineers and program managers to deliver software to customers on time and within budget, there is a great need to identify the most important requirements and establish their relative importance for implementation according to certain criteria. The existing techniques for requirement prioritization, although they provide consistent results, are difficult to use and implement, whereas some existing techniques that are easy to apply lack the structure to analyze complex requirements. Moreover, the available techniques lack user friendliness in the prioritization process. To overcome these problems, a hybrid approach combining two available techniques was proposed in our earlier work. In this paper we analyze the design of the proposed system and the testing plan of the system. A use case diagram and a control flow diagram are used to explain the structure of the approach.

  6. Design and Testing Analysis of Requirement Prioritizations Technique

    Directory of Open Access Journals (Sweden)

    Dinesh Singh

    2014-06-01

    Full Text Available With the growing need for software in our day-to-day life, the complexity of software is increasing as well, along with the number of requirements associated with modern software projects. To cope with the increasing demands and the pressure on software engineers and program managers to deliver software to customers on time and within budget, there is a great need to identify the most important requirements and establish their relative importance for implementation according to certain criteria. The existing techniques for requirement prioritization, although they provide consistent results, are difficult to use and implement, whereas some existing techniques that are easy to apply lack the structure to analyze complex requirements. Moreover, the available techniques lack user friendliness in the prioritization process. To overcome these problems, a hybrid approach combining two available techniques was proposed in our earlier work. In this paper we analyze the design of the proposed system and the testing plan of the system. A use case diagram and a control flow diagram are used to explain the structure of the approach.

  7. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  8. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
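
    The deconvolution idea behind the transfer function technique can be sketched with FFTs. The thermal kernel, pulse and regularisation constant below are invented stand-ins; the sketch shows only the frequency-domain mechanics, not the diagnostic's actual transfer function.

        # Conceptual sketch: recover an input flux from a smoothed rear-side
        # response by regularised deconvolution in the frequency domain.
        import numpy as np

        n = 1024
        t = np.arange(n) * 1e-3                       # s
        flux = np.zeros(n)
        flux[100:150] = 1.0                           # true beam pulse (invented)

        kernel = np.exp(-t / 0.02)                    # stand-in thermal response
        kernel /= kernel.sum()
        measured = np.real(np.fft.ifft(np.fft.fft(flux) * np.fft.fft(kernel)))

        H = np.fft.fft(kernel)
        eps = 1e-3                                    # regularisation vs. noise
        recovered = np.real(np.fft.ifft(np.fft.fft(measured) * np.conj(H)
                                        / (np.abs(H) ** 2 + eps)))
        print("recovered pulse peaks near sample", int(np.argmax(recovered)))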

  9. Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics

    Science.gov (United States)

    Chan, Julia Y. K.; Bauer, Christopher F.

    2014-01-01

    The purpose of this study is to identify academically at-risk students in first-semester general chemistry using affective characteristics via cluster analysis. Through the clustering of six preselected affective variables, three distinct affective groups were identified: low (at-risk), medium, and high. Students in the low affective group…

  10. Identifying Patients Who Are Unsuitable for Accelerated Partial Breast Irradiation Using Three-dimensional External Beam Conformal Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Shikama, Naoto, E-mail: nshikama0525@gmail.com [Department of Radiation Oncology, Saitama Medical University International Medical Center, Saitama (Japan); Nakamura, Naoki; Kunishima, Naoaki; Hatanaka, Shogo; Sekiguchi, Kenji [Department of Radiation Oncology, St. Luke's International Hospital, Tokyo (Japan)

    2012-07-01

    Purpose: Several recent studies reported that severe late toxicities, including soft-tissue fibrosis and fat necrosis, are present in patients treated with accelerated partial breast irradiation (APBI) and that these toxicities are associated with the large volume of tissue targeted by high-dose irradiation. The present study was performed to clarify which patients are unsuitable for APBI in order to avoid severe late toxicities. Methods and Materials: Study subjects comprised 50 consecutive patients with Stage 0-II unilateral breast cancer who underwent breast-conserving surgery and in whom five or six surgical clips were placed during surgery. All patients were subsequently replanned using three-dimensional conformal radiotherapy (3D-CRT) APBI techniques according to the National Surgical Adjuvant Breast and Bowel Project (NSABP) B-39 and Radiation Therapy Oncology Group (RTOG) 0413 protocol. The beam arrangements included mainly noncoplanar four- or five-field beams using 6-MV photons alone. Results: Dose-volume histogram (DVH) constraints for normal tissues according to the NSABP/RTOG protocol were satisfied in 39 patients (78%). Multivariate analysis revealed that only a long craniocaudal clip distance (CCD) was correlated with nonoptimal DVH constraints (p = 0.02), whereas pathological T stage, anteroposterior clip distance (APD), site of the ipsilateral breast (IB) (right/left), location of the tumor (medial/lateral), and IB reference volume were not. DVH constraints were satisfied in 20% of patients with a long CCD (≥5.5 cm) and 92% of those with a short CCD (p < 0.0001). The median IB reference volume receiving ≥50% of the prescribed dose (IB-V50) across all patients was 49.0% (range, 31.4-68.6). Multivariate analysis revealed that only a long CCD was correlated with a large IB-V50 (p < 0.0001), whereas the other factors were not. Conclusion: Patients with long CCDs (≥5.5 cm) might be unsuitable for 3D-CRT APBI because of nonoptimal DVH constraints and a large IB-V50.

  11. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques are not appropriate. Web data mining is an interesting field with a vast range of applications. The growth of online social networks has significantly increased the amount of data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  12. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques are not appropriate. Web data mining is an interesting field with a vast range of applications. The growth of online social networks has significantly increased the amount of data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  13. Statistical techniques to construct assays for identifying likely responders to a treatment under evaluation from cell line genomic data

    Directory of Open Access Journals (Sweden)

    Shi Xiaoyan

    2010-10-01

    Full Text Available Abstract Background Developing the right drugs for the right patients has become a mantra of drug development. In practice, it is very difficult to identify subsets of patients who will respond to a drug under evaluation. Most of the time, no single diagnostic will be available, and more complex decision rules will be required to define a sensitive population, using, for instance, mRNA expression, protein expression or DNA copy number. Moreover, diagnostic development will often begin with in-vitro cell-line data and a high-dimensional exploratory platform, only later to be transferred to a diagnostic assay for use with patient samples. In this manuscript, we present a novel approach to developing robust genomic predictors that are not only capable of generalizing from in-vitro to patient, but are also amenable to clinically validated assays such as qRT-PCR. Methods Using our approach, we constructed a predictor of sensitivity to dacetuzumab, an investigational drug for CD40-expressing malignancies such as lymphoma, using genomic measurements of cell lines treated with dacetuzumab. Additionally, we evaluated several state-of-the-art prediction methods by independently pairing the feature selection and classification components of the predictor. In this way, we constructed several predictors that we validated on an independent DLBCL patient dataset. Similar analyses were performed on genomic measurements of breast cancer cell lines and patients to construct a predictor of estrogen receptor (ER) status. Results The best dacetuzumab sensitivity predictors involved ten or fewer genes and accurately classified lymphoma patients by their survival and known prognostic subtypes. The best ER status classifiers involved one or two genes and led to accurate ER status predictions more than 85% of the time. The novel method we proposed performed as well as or better than the other methods evaluated. Conclusions We demonstrated the feasibility of combining feature
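
    The independent pairing of feature selection and classification components described in the Methods can be mimicked with a generic pipeline. The expression matrix, labels and gene count below are synthetic, and scikit-learn components stand in for the methods actually evaluated.

        # Hedged sketch: a small-gene-count filter paired with a linear
        # classifier, evaluated by cross-validation on synthetic expression data.
        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import Pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(5)
        X = rng.normal(size=(80, 500))          # 80 cell lines x 500 genes
        y = rng.integers(0, 2, size=80)         # sensitive vs. resistant
        X[y == 1, :5] += 1.5                    # five truly informative genes

        clf = Pipeline([
            ("select", SelectKBest(f_classif, k=10)),   # assay-friendly gene count
            ("model", LogisticRegression(max_iter=1000)),
        ])
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())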

  14. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for the identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of the obtained peptide-spectrum matches (PSMs) also needs to be evaluated by algorithms, as manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. Conclusion Our approach not only enhances the computational performance, and

  15. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...... to analyze the uncertainty of model predictions. This allows judging the fitness of the model to the purpose under uncertainty. Hence we recommend uncertainty analysis as a proactive solution when faced with model uncertainty, which is the case for biofuel process development research....

  16. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of their potential for the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed, stimulated by the power of computers and microprocessors, which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples, provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. In order to increase the responses and improve selectivity, solid electrodes are the focus of intensive research dedicated to surface modifications. Perm-selectivity, chelation, catalysis, etc. may be considered appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single-cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  17. Using Quantitative Data Analysis Techniques for Bankruptcy Risk Estimation for Corporations

    Directory of Open Access Journals (Sweden)

    Ştefan Daniel ARMEANU

    2012-01-01

    Full Text Available The diversification of methods and techniques for the quantification and management of risk has led to the development of many mathematical models, a large part of which focus on measuring the bankruptcy risk of businesses. In financial analysis there are many indicators which can be used to assess the risk of bankruptcy of enterprises, but to make an assessment the number of indicators must be reduced, and this can be achieved through principal component, cluster and discriminant analysis techniques. In this context, the article aims to build a scoring function for identifying bankrupt companies, using a sample of companies listed on the Bucharest Stock Exchange.
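
    A scoring function of the kind the article builds can be sketched with discriminant analysis on a few financial ratios. The ratios, class sizes and firms below are simulated, not Bucharest Stock Exchange data.

        # Illustrative sketch: a linear discriminant scoring function separating
        # healthy from bankrupt firms on three synthetic financial ratios.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(6)
        # Columns: liquidity ratio, leverage, return on assets (invented)
        healthy = rng.normal(loc=(1.8, 0.4, 0.08), scale=0.2, size=(60, 3))
        bankrupt = rng.normal(loc=(0.9, 0.9, -0.02), scale=0.2, size=(20, 3))
        X = np.vstack([healthy, bankrupt])
        y = np.array([0] * 60 + [1] * 20)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("score coefficients:", lda.coef_[0])
        print("P(bankrupt) for a weak firm:", lda.predict_proba([[1.0, 0.8, 0.0]])[0, 1])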

  18. Crime Analysis Using Geoinformatics Technique and Hotspot Detection for Akola City, Maharashtra State, India

    OpenAIRE

    Khadri; S.F.R; Chaitanya Pande; Kanak Moharir

    2013-01-01

    The need for effective utilization of geoinformatics techniques has provided city safety agencies with tools to analyze and interpret crime-related relations through GIS software. Recently, there has been an increase in crimes of various types in Akola city. Maps offer crime analysts graphic representations of crime-related issues, and an understanding of where crimes occur can improve attempts to fight crime. The present study identified various crime patterns in Akola city and covers aspects of ...

  19. Biomechanical energetic analysis of technique during learning the longswing on the high bar.

    Science.gov (United States)

    Williams, Genevieve Kate Roscoe; Irwin, Gareth; Kerwin, David George; Newell, Karl Maxim

    2015-01-01

    Biomechanical energetic analysis of technique can be performed to identify limits or constraints to performance outcome at the level of joint work, and to assess the mechanical efficiency of techniques. The aim of this study was to investigate the biomechanical energetic processes during learning the longswing on the high bar. Twelve male, novice participants took part in a training study. Kinematic and kinetic data were collected during swing attempts in eight weekly testing sessions. Inverse dynamics analysis was performed from known zero forces at the toes. Joint work, total energy, and bar energy were calculated. Biomechanical constraints to action, that is, limits to novice performance, were identified as "total work" and "shoulder work". The most biomechanically efficient technique was associated with an onset of the hip functional phase and joint work occurring between 10° and 45° before the bottom of the swing. The learning of gross motor skills is realised through the establishment of a set of techniques with task-specific biomechanical constraints. Knowledge of the biomechanical constraints to action associated with more effective and efficient techniques will be useful both for assessing learning and for establishing effective learning interventions. PMID:25535648

  20. Network analysis of translocated Takahe populations to identify disease surveillance targets.

    Science.gov (United States)

    Grange, Zoë L; VAN Andel, Mary; French, Nigel P; Gartrell, Brett D

    2014-04-01

    Social network analysis is being increasingly used in epidemiology and disease modeling in humans, domestic animals, and wildlife. We investigated this tool for describing a translocation network (an arrangement that allows movement of animals between geographically isolated locations) used for the conservation of an endangered flightless rail, the Takahe (Porphyrio hochstetteri). We collated records of Takahe translocations within New Zealand and used social network principles to describe the connectivity of the translocation network. That is, networks were constructed and analyzed using adjacency matrices with values based on the tie weights between nodes. Five annual network matrices were created using the Takahe data set; each incremental year included the records of previous years. Weights of movements between connected locations were assigned by the number of Takahe moved. We calculated the number of nodes (i_total) and the number of ties (t_total) between the nodes. To quantify the small-world character of the networks, we compared the real networks to random graphs of equivalent size, weighting, and node strength. Descriptive analysis of the cumulative annual Takahe movement networks involved determining node-level characteristics, including centrality descriptors of relevance to disease modeling such as weighted measures of in-degree (k_i^in), out-degree (k_i^out), and betweenness (B_i). Key players were assigned according to the highest node measures of k_i^in, k_i^out, and B_i per network. Networks increased in size throughout the time frame considered. The network had some degree of small-world character. The nodes with the highest cumulative tie weights connecting them were the captive breeding center, the Murchison Mountains, and two offshore islands. The key player fluctuated between the captive breeding center and the Murchison Mountains. The cumulative networks identified the captive breeding center every year as the hub of the network until the final
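
    The node-level measures named above are straightforward to compute with a graph library. The sketch below uses networkx on an invented miniature translocation network; the site names and tie weights are hypothetical.

        # Minimal sketch: weighted in/out degree and betweenness on a toy
        # directed translocation network (edge weight = birds moved).
        import networkx as nx

        moves = [("captive_centre", "island_A", 12),
                 ("captive_centre", "murchison", 20),
                 ("island_A", "island_B", 4),
                 ("murchison", "captive_centre", 8)]

        G = nx.DiGraph()
        G.add_weighted_edges_from(moves)

        print("weighted in-degree:", dict(G.in_degree(weight="weight")))
        print("weighted out-degree:", dict(G.out_degree(weight="weight")))
        print("betweenness:", nx.betweenness_centrality(G, weight="weight"))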

  1. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Daele, Timothy, Van; Van Hoey, Stijn; Gernaey, Krist;

    2015-01-01

    structure evaluation by assessing the local identifiability characteristics of the parameters. Moreover, such a procedure should be generic to make sure it can be applied independently of the structure of the model. We hereby apply a numerical identifiability approach which is based on the work of Walter and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring. In contrast, the practical identifiability analysis revealed that high values of the forward rate parameter Vf led to identifiability problems. These problems were even more pronounced at higher substrate concentrations, which illustrates the importance of a proper experimental design to avoid...

  2. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Full Text Available Sentiment analysis is concerned with the analysis of emotions and opinions in text; it is also referred to as opinion mining. Sentiment analysis determines the sentiment of a person with respect to a given source of content. Social media contain huge amounts of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this mass of generated data is very useful for gauging the opinion of the public. Twitter sentiment analysis is tricky compared to broad sentiment analysis because of slang words, misspellings and repeated characters, and because the maximum length of each tweet on Twitter is 140 characters, it is very important to identify the correct sentiment of each word. In this project we propose a highly accurate model for sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of a feature vector and classifiers such as Support Vector Machine and Naïve Bayes, we classify these tweets as positive, negative or neutral to give the sentiment of each tweet.
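
    The classification stage can be pictured with a tiny pipeline using one of the two classifiers the abstract names. The tweets and labels below are made up, and TF-IDF stands in for whatever feature vector the authors construct.

        # Rough sketch: TF-IDF features feeding a linear SVM for tweet polarity.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline

        tweets = ["loved the movie, brilliant acting",
                  "worst film this year, total waste",
                  "soundtrack was amazing and the plot gripping",
                  "boring, predictable and far too long"]
        labels = ["positive", "negative", "positive", "negative"]

        clf = make_pipeline(TfidfVectorizer(), LinearSVC())
        clf.fit(tweets, labels)
        print(clf.predict(["what a gripping, brilliant thriller"]))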

  3. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.

  4. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    One of the difficulties that goal-oriented requirements analysis encounters is that the efficiency of goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systematized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integrating goal and problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  5. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail, and simulation techniques relevant to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single-error-source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified by (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA-processor-computed data, and (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
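
    A minimal sketch of the RSS combination step (the numbers are illustrative; the real processor works on full trajectory state vectors):

        import numpy as np

        # deviations[i, :] = trajectory-state deviation (e.g. altitude, velocity)
        # produced by the i-th 3-sigma single-error-source run
        deviations = np.array([[120.0, 0.8],
                               [ 45.0, 0.3],
                               [ 60.0, 0.5]])

        rss = np.sqrt((deviations ** 2).sum(axis=0))    # combined 3-sigma deviation
        cov = sum(np.outer(d, d) for d in deviations)   # combined covariance matrix
        print(rss)
        print(cov)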

  6. Consolidating metabolite identifiers to enable contextual and multi-platform metabolomics data analysis

    Directory of Open Access Journals (Sweden)

    Saito Kazuki

    2010-04-01

    Full Text Available Abstract Background Analysis of data from high-throughput experiments depends on the availability of well-structured data that describe the assayed biomolecules. Procedures for obtaining and organizing such meta-data on genes, transcripts and proteins have been streamlined in many data analysis packages, but are still lacking for metabolites. Chemical identifiers are notoriously incoherent, encompassing a wide range of different referencing schemes with varying scope and coverage. Online chemical databases use multiple types of identifiers in parallel but lack a common primary key for reliable database consolidation. Connecting identifiers of analytes found in experimental data with the identifiers of their parent metabolites in public databases can therefore be very laborious. Results Here we present a strategy and a software tool for integrating metabolite identifiers from local reference libraries and public databases that do not depend on a single common primary identifier. The program constructs groups of interconnected identifiers of analytes and metabolites to obtain a local metabolite-centric SQLite database. The created database can be used to map in-house identifiers and synonyms to external resources such as the KEGG database. New identifiers can be imported and directly integrated with existing data. Queries can be performed in a flexible way, both from the command line and from the statistical programming environment R, to obtain data-set-tailored identifier mappings. Conclusions Efficient cross-referencing of metabolite identifiers is a key technology for metabolomics data analysis. We provide a practical and flexible solution to this task in an open-source program, the metabolite masking tool (MetMask), available at http://metmask.sourceforge.net, that implements our ideas.
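
    A minimal sketch of the grouping idea (not MetMask itself): union-find over identifier pairs yields the clusters of interconnected identifiers; the identifiers shown are illustrative:

        # Group identifiers that are linked by any shared reference.
        pairs = [("CID 5793", "C00031"), ("C00031", "glucose"),
                 ("CID 107526", "C00089"), ("C00089", "sucrose")]

        parent = {}

        def find(x):
            parent.setdefault(x, x)
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path compression
                x = parent[x]
            return x

        def union(a, b):
            parent[find(a)] = find(b)

        for a, b in pairs:
            union(a, b)

        groups = {}
        for ident in parent:
            groups.setdefault(find(ident), set()).add(ident)
        print(list(groups.values()))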

  7. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  8. Comparative Analysis of Partial Occlusion Using Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    N.Nallammal

    2013-04-01

    Full Text Available This paper presents a comparison of face recognition techniques under partial occlusion, identifying which technique produces the better total success rate. Partial-occlusion face recognition is especially useful for people part of whose face is scarred or disfigured and thus needs to be covered; accordingly, either the top/eye region or the bottom part of the face is recognized. The partial face information is tested with Principal Component Analysis (PCA), Non-negative Matrix Factorization (NMF), Local NMF (LNMF) and Spatially Confined NMF (SFNMF). The comparative results show a recognition rate of 95.17% with r = 80 using SFNMF for the bottom face region; the eye region achieves 95.12% with r = 10 using LNMF.
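
    A minimal sketch of subspace-based matching of an occluded face (assuming scikit-learn; synthetic data stands in for a face database, and the NMF rank n_components plays the role of r):

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        gallery = rng.random((20, 1024))          # 20 "faces", 32x32 pixels flattened

        model = NMF(n_components=10, init="nndsvda", max_iter=500)  # rank r = 10
        codes = model.fit_transform(gallery)      # per-face coefficient vectors

        probe = gallery[3].copy()
        probe[512:] = 0.0                         # occlude the bottom half
        h = model.transform(probe[None, :])       # encode the occluded probe

        match = int(np.argmin(np.linalg.norm(codes - h, axis=1)))
        print("best match:", match)               # ideally identity 3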

  9. Application of thermal analysis techniques in activated carbon production

    Science.gov (United States)

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  10. Gamma absorption technique in elemental analysis of composite materials

    International Nuclear Information System (INIS)

    Highlights: ► Application of the gamma-ray absorption technique in elemental analysis. ► Determination of the elemental composition of some bronze and gold alloys. ► Determination of some heavy elements in water. - Abstract: Expressions for calculating the elemental concentrations of composite materials based on a gamma absorption technique are derived. These expressions provide quantitative information about the elemental concentrations of materials. Calculations are carried out to estimate the concentrations of copper and gold in some bronze and gold alloys. The method was also applied to estimate the concentrations of some heavy elements in a water matrix, highlighting the differences with photon attenuation measurements. Theoretical mass attenuation coefficient values were obtained using the WinXCom program. High-resolution gamma-ray spectrometry based on a high-purity germanium (HPGe) detector was employed to measure the attenuation of a strongly collimated monoenergetic gamma beam through samples.
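
    A minimal sketch of the mixture rule that underlies such expressions for a binary alloy (the attenuation coefficients and measured transmission below are placeholders, not WinXCom or experimental values):

        # The measured mass attenuation coefficient of a binary alloy is a
        # weight-fraction average of its elemental coefficients, so one
        # transmission measurement yields the composition.
        import numpy as np

        mu_cu, mu_au = 0.0589, 0.0740   # cm^2/g at the chosen gamma energy (illustrative)
        rho_t = 8.5                     # areal density of the sample, g/cm^2

        transmission = 0.57             # measured I/I0 through the sample
        mu_meas = -np.log(transmission) / rho_t

        w_cu = (mu_meas - mu_au) / (mu_cu - mu_au)
        print(f"Cu fraction: {w_cu:.2f}, Au fraction: {1 - w_cu:.2f}")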

  11. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and taking an action based on the ‘category’ of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work on web log data taken from the “NASA” web server and analyzed with “Web Log Explorer”, a web usage mining tool which plays the vital role in carrying out this work.

  12. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take into account certain requirements. The complexity of these historical materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, owing to the precious nature of artworks it is also necessary to use non-destructive methods, respecting the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  13. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  14. Burnout prediction using advanced image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, those predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout to microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  15. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 Photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab

  16. Domain-driven specification techniques simplify the analysis of requirements for the KAON factory central control system

    Energy Technology Data Exchange (ETDEWEB)

    Inwood, C. (Inwood Real-Time Systems Associates, Kinburn, ON (Canada)); Ludgate, G.A.; Dohan, D.A.; Osberg, E.A.; Koscielniak, S. (British Columbia Univ., Vancouver (Canada). TRIUMF Facility)

    1990-08-01

    Domain-driven modelling, outlined in this paper, has been successfully applied to the analysis, specification and design of the KAON Factory central control system (KF-CCS). This advanced object-oriented technique is especially suited to the development of complex systems. Early in the project, four very natural domains were identified which simplified the analysis of requirements. (orig.).

  17. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  18. Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse

    Science.gov (United States)

    Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa

    1994-01-01

    Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) spectroscopy can be successfully employed in urinalysis for drugs of abuse, with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence in the data that infrared analysis can offer has been helpful in resolving ambiguous results, particularly in the case of amphetamines, where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been in question. Functional group mapping can further assist the analysis and track drug use versus time.

  19. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  20. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  1. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  2. Multi-element study in aluminium by activation analysis technique

    International Nuclear Information System (INIS)

    Instrumental activation analysis is a relatively quick technique that helps determine the elemental composition of materials. It is used mainly for trace element determination, but in the case of major elements some considerations are necessary, such as the different nuclear reactions that occur because the neutron flux is a mixture of thermal and fast neutrons. These can lead to the apparent presence and/or erroneous quantification of some elements. This work describes the analysis of a container piece consisting of approximately 85% aluminium. The elements Zn, Mn, Sb, Ga, Cu, Cl and Sm were determined. (Author)

  3. Acceleration of multivariate analysis techniques in TMVA using GPUs

    CERN Document Server

    Hoecker, A; Therhaag, J; Washbrook, A

    2012-01-01

    A feasibility study into the acceleration of multivariate analysis techniques using Graphics Processing Units (GPUs) will be presented. The MLP-based Artificial Neural Network method contained in the TMVA framework has been chosen as a focus for investigation. It was found that the network training time on a GPU was lower than for CPU execution as the complexity of the network was increased. In addition, multiple neural networks can be trained simultaneously on a GPU within the same time taken for single network training on a CPU. This could be potentially leveraged to provide a qualitative performance gain in data classification.

  4. Potential Coastal Pumped Hydroelectric Energy Storage Locations Identified using GIS-based Topographic Analysis

    Science.gov (United States)

    Parsons, R.; Barnhart, C. J.; Benson, S. M.

    2013-12-01

    Large-scale electrical energy storage could accommodate variable, weather-dependent energy resources such as wind and solar. Pumped hydroelectric energy storage (PHS) and compressed air energy storage (CAES) have life cycle energy and financial costs that are an order of magnitude lower than conventional electrochemical storage technologies. However, PHS and CAES storage technologies require specific geologic conditions. Conventional PHS requires an upper and lower reservoir separated by at least 100 m of head, but no more than 10 km in horizontal distance. Conventional PHS also impacts fresh water supplies, riparian ecosystems, and hydrologic environments. A PHS facility that uses the ocean as the lower reservoir benefits from a smaller footprint, minimal freshwater impact, and the potential to be located near offshore wind resources and population centers. Although the technology is nascent, one coastal PHS facility exists today. The storage potential for coastal PHS is unknown. Can coastal PHS play a significant role in augmenting future power grids with a high fraction of renewable energy supply? In this study we employ GIS-based topographic analysis to quantify the coastal PHS potential of several geographic locations, including California, Chile and Peru. We developed automated techniques that seek local topographic minima in 90 m spatial resolution Shuttle Radar Topography Mission (SRTM) digital elevation models (DEMs) that satisfy the following criteria conducive to PHS: within 10 km of the sea; minimum elevation 150 m; maximum elevation 1000 m. Preliminary results suggest the global potential for coastal PHS could be very significant. For example, in northern Chile we have identified over 60 locations that satisfy the above criteria. Two of these locations could store over 10 million cubic meters of water, or several GWh of energy. We plan to report a global database of candidate coastal PHS locations and to estimate their energy storage capacity.
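
    A minimal sketch of the screening criteria applied to a DEM grid (a synthetic DEM and a simplistic coast-distance grid stand in for SRTM data and a real coastline):

        import numpy as np

        rng = np.random.default_rng(1)
        dem = rng.uniform(0, 1500, size=(200, 200))       # synthetic elevation grid, m
        col_km = np.arange(200) * 0.09                    # ~90 m SRTM cells
        dist_to_sea = np.broadcast_to(col_km, dem.shape)  # coastline along the west edge

        # Flag cells in the 150-1000 m window within 10 km of the sea
        candidates = (dem >= 150) & (dem <= 1000) & (dist_to_sea <= 10.0)
        print("candidate cells:", int(candidates.sum()))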

  5. Transcriptome Analysis of Syringa oblata Lindl. Inflorescence Identifies Genes Associated with Pigment Biosynthesis and Scent Metabolism.

    Directory of Open Access Journals (Sweden)

    Jian Zheng

    Full Text Available Syringa oblata Lindl. is a woody ornamental plant with high economic value and characteristics that include early flowering, multiple flower colors, and strong fragrance. Despite a long history of cultivation, the genetics and molecular biology of S. oblata are poorly understood. Transcriptome and expression profiling data are needed to identify genes and to better understand the biological mechanisms of floral pigments and scents in this species. Nine cDNA libraries were obtained from three replicates of three developmental stages: inflorescence with enlarged flower buds not protruded, inflorescence with corolla lobes not displayed, and inflorescence with flowers fully opened and emitting strong fragrance. Using the Illumina RNA-Seq technique, 319,425,972 clean reads were obtained and assembled into 104,691 final unigenes (average length of 853 bp), 41.75% of which were annotated in the NCBI non-redundant protein database. Among the annotated unigenes, 36,967 were assigned to gene ontology categories and 19,956 were assigned to eukaryotic orthologous groups. Using the Kyoto Encyclopedia of Genes and Genomes pathway database, 12,388 unigenes were sorted into 286 pathways. Based on these transcriptomic data, we obtained a large number of candidate genes that were differentially expressed at different flower stages and related to floral pigment biosynthesis and fragrance metabolism. This comprehensive transcriptomic analysis provides fundamental information on the genes and pathways involved in flower secondary metabolism and development in S. oblata, providing a useful database for further research on S. oblata and other plants of the genus Syringa.

  6. Friction force microscopy: a simple technique for identifying graphene on rough substrates and mapping the orientation of graphene grains on copper

    OpenAIRE

    Marsden, Alexander J.; Phillips, Mick; Wilson, Neil R.

    2013-01-01

    At a single atom thick, it is challenging to distinguish graphene from its substrate using conventional techniques. In this paper we show that friction force microscopy (FFM) is a simple and quick technique for identifying graphene on a range of samples, from growth substrates to rough insulators. We show that FFM is particularly effective for characterising graphene grown on copper where it can correlate the graphene growth to the three-dimensional surface topography and map the crystallogra...

  7. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Recovery of ancient woodworking skills can be achieved through the simultaneous documentation and analysis of tangible evidence such as the geometry parameters of prehistoric hand tools or the fine morphological characteristics of well-preserved wooden archaeological finds. In this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed within this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these objects was also inferred, and these woodworking skills could be quantified for a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  8. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, both on the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a boon to markets if its semantic orientation is deliberated. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for recording huge volumes of opinionated data. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.

  9. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the instantaneous granite rockburst process. Based on the PIV (Particle Image Velocimetry) technique, a rockburst can be analyzed quantitatively: images of tracer particles, displacement and strain fields can be obtained, and debris trajectories described. According to observations from on-site tests, the dynamic rockburst is actually a gas–solid high-speed flow process caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high-speed video and PIV images, the granite rockburst failure process is decomposed into six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.

  10. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    at different operating conditions, and the application of metabolic engineering to process optimization is, therefore, expected mainly to have an impact on the improvement of processes where yield, productivity, and titer are important design factors, i.e., in the production of metabolites and industrial...... enzymes. Despite the prospect of obtaining major improvement through metabolic engineering, this approach is, however, not expected to completely replace the classical approach to strain improvement-random mutagenesis followed by screening. Identification of the optimal genetic changes for improvement...... analysis of pathways, and (5) kinetic modeling. In this article, these different techniques are discussed and their applications to the analysis of different processes are illustrated. (C) 1998 John Wiley & Sons, Inc....

  11. Comparative analysis of Salmonella genomes identifies a metabolic network for escalating growth in the inflamed gut.

    Science.gov (United States)

    Nuccio, Sean-Paul; Bäumler, Andreas J

    2014-03-18

    The Salmonella genus comprises a group of pathogens associated with illnesses ranging from gastroenteritis to typhoid fever. We performed an in silico analysis of comparatively reannotated Salmonella genomes to identify genomic signatures indicative of disease potential. By removing numerous annotation inconsistencies and inaccuracies, the process of reannotation identified a network of 469 genes involved in central anaerobic metabolism, which was intact in genomes of gastrointestinal pathogens but degrading in genomes of extraintestinal pathogens. This large network contained pathways that enable gastrointestinal pathogens to utilize inflammation-derived nutrients as well as many of the biochemical reactions used for the enrichment and biochemical discrimination of Salmonella serovars. Thus, comparative genome analysis identifies a metabolic network that provides clues about the strategies for nutrient acquisition and utilization that are characteristic of gastrointestinal pathogens. IMPORTANCE While some Salmonella serovars cause infections that remain localized to the gut, others disseminate throughout the body. Here, we compared Salmonella genomes to identify characteristics that distinguish gastrointestinal from extraintestinal pathogens. We identified a large metabolic network that is functional in gastrointestinal pathogens but decaying in extraintestinal pathogens. While taxonomists have used traits from this network empirically for many decades for the enrichment and biochemical discrimination of Salmonella serovars, our findings suggest that it is part of a "business plan" for growth in the inflamed gastrointestinal tract. By identifying a large metabolic network characteristic of Salmonella serovars associated with gastroenteritis, our in silico analysis provides a blueprint for potential strategies to utilize inflammation-derived nutrients and edge out competing gut microbes.

  12. Application of the INAA technique for elemental analysis of metallic biomaterials used in dentistry

    Energy Technology Data Exchange (ETDEWEB)

    Cincu, Em [' Horia Hulubei' National Institute for Research and Development in Physics and Nuclear Engineering (IFIN-HH), Bucharest-Magurele, 407 Atomistilor Street, P. O. Box MG-6, Bucharest 077125 (Romania)], E-mail: cincue@nipne.ro; Craciun, L.; Manea-Grigore, Ioana; Cazan, I.L.; Manu, V. [' Horia Hulubei' National Institute for Research and Development in Physics and Nuclear Engineering (IFIN-HH), Bucharest-Magurele, 407 Atomistilor Street, P. O. Box MG-6, Bucharest 077125 (Romania); Barbos, D. [Institute for Nuclear Research (INR) Mioveni, 1Campului Street, P. O. Box 78, Bucharest 115400 (Romania); Cocis, A. [Dental Surgery Clinic PANA-DANIELA, Bucharest, 6 Intrarea Buzesti Street (Romania)

    2009-12-15

    The sensitive nuclear analytical technique Instrumental Neutron Activation Analysis (INAA) has been applied to several types of metallic biomaterials (Heraenium CE, Ventura Nibon, Wiron 99 and Ducinox, which are currently used for restorations in dental clinics) to study its performance in elemental analysis and to identify possible limitations. The investigation was performed by two NAA laboratories and aimed to answer the question of how the biomaterials' compositions influence patients' health over the course of time, taking into account the EC Directive 94/27/EC recommendations concerning Ni toxicity.

  13. Validation of Design and Analysis Techniques of Tailored Composite Structures

    Science.gov (United States)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed form analysis based on a theoretical model of a single cell tailored box beam and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results show that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed form code is consistently able to predict the wing box bending to 25% of the measured value. This error is expected due to simplifying assumptions in the closed form analysis. Differences between the closed form code representation and the wing box specimen caused large errors in the twist prediction. The closed form analysis prediction of twist has not been validated from this test.

  14. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Directory of Open Access Journals (Sweden)

    Amin Torabipour

    2014-11-01

    Full Text Available This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study using panel data over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using SPSS 18 and DEAP 2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996 respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. The productivity of the hospitals generally showed an increasing trend. However, the average total productivity decreased, and among the components of total productivity, variation in technological efficiency had the greatest impact on this reduction.
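
    A minimal sketch of the input-oriented efficiency score underlying DEA (assuming SciPy; the hospital inputs and outputs are illustrative):

        # Input-oriented CCR efficiency: for each hospital, minimise theta such
        # that a composite of peers matches its outputs while using no more
        # than theta times its inputs.
        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[100.0, 30.0], [120.0, 50.0], [90.0, 40.0]])  # inputs (e.g. beds, staff)
        Y = np.array([[500.0], [450.0], [480.0]])                   # outputs (e.g. admissions)
        n, m, s = X.shape[0], X.shape[1], Y.shape[1]

        for o in range(n):
            c = np.r_[1.0, np.zeros(n)]              # variables: [theta, lambda_1..lambda_n]
            A_in = np.c_[-X[o][:, None], X.T]        # sum_j lam_j x_ij <= theta x_io
            A_out = np.c_[np.zeros((s, 1)), -Y.T]    # sum_j lam_j y_rj >= y_ro
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[o]],
                          bounds=[(0, None)] * (1 + n))
            print(f"hospital {o}: efficiency = {res.x[0]:.3f}")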

  15. Use of statistical techniques in analysis of biological data

    Directory of Open Access Journals (Sweden)

    Farzana Perveen

    2012-07-01

    Full Text Available From ancient times to the modern day, not a single area can be found where statistics does not play a vital role. Statistics has now been recognized and universally accepted as an essential component of research in every branch of science. From agriculture, biology, education, economics, business, management, medicine, engineering, psychology, the environment and space, statistics plays a significant role. Statistics is extensively used in the biological sciences; specifically, biostatistics is the branch of applied statistics that concerns the application of statistical methods to medical, genetic and biological problems. One important step is the appropriate and careful analysis of statistical data to get precise results. It is pertinent to mention that the majority of statistical tests and techniques are applied under certain mathematical assumptions, so it is necessary to appreciate the relevance of those assumptions. In this connection, among others, the assumption of normality (normal distribution of the population(s)) and variance homogeneity are the most important. If these assumptions are not satisfied, the results may be potentially misleading. It is, therefore, suggested to check the relevant assumption(s) about the data before applying statistical test(s) to get valid results. In this study, a few techniques/tests are described for checking the normality of a given set of data. Since Analysis of Variance (ANOVA) models are extensively used in biological research, the assumptions underlying ANOVA are also discussed. Non-parametric statistics is also described to some extent.
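
    For example, checking normality and variance homogeneity before an ANOVA might look like this (assuming SciPy; the sample data is simulated):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        group_a = rng.normal(10.0, 2.0, size=30)
        group_b = rng.normal(12.0, 2.0, size=30)

        for name, g in (("A", group_a), ("B", group_b)):
            w, p = stats.shapiro(g)                       # Shapiro-Wilk normality test
            print(f"group {name}: Shapiro-Wilk p = {p:.3f}")  # p > 0.05: no evidence of non-normality

        stat, p = stats.levene(group_a, group_b)          # Levene's test for equal variances
        print(f"Levene p = {p:.3f}")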

  16. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    Science.gov (United States)

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckled intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their cores' structural properties. The second part of the procedure obtains Young's interference fringes by Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multi-aperture set. Each aperture in the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography over an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.

  17. Homogenization techniques for the analysis of porous SMA

    Science.gov (United States)

    Sepe, V.; Auricchio, F.; Marfia, S.; Sacco, E.

    2016-05-01

    In this paper the mechanical response of porous Shape Memory Alloy (SMA) is modeled. The porous SMA is considered as a composite medium made of a dense SMA matrix with voids treated as inclusions. The overall response of this very special composite is deduced by performing a micromechanical and homogenization analysis. In particular, the incremental Mori-Tanaka averaging scheme is provided; then, the Transformation Field Analysis procedure in its uniform and nonuniform approaches, UTFA and NUTFA respectively, is presented. In particular, the extension of the NUTFA technique proposed by Sepe et al. (Int J Solids Struct 50:725-742, 2013) is presented to investigate the response of porous SMA characterized by closed and open porosity. A detailed comparison of the outcomes provided by the Mori-Tanaka, UTFA and proposed NUTFA procedures for porous SMA is presented through numerical examples for two- and three-dimensional problems. In particular, several values of porosity and different loading conditions, inducing the pseudoelastic effect in the SMA matrix, are investigated. The predictions assessed by the Mori-Tanaka, UTFA and NUTFA techniques are compared with the results obtained by nonlinear finite element analyses. A comparison with experimental data available in the literature is also presented.

  18. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on the normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory, so the performance of LDA is likewise checked on the normalized images. Experimental results on the Yale B and ORL databases show that applying PCA and LDA to the normalized dataset improves performance significantly for face images with large illumination variations.
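
    A minimal sketch of the first technique (assuming SciPy; the image is synthetic and the number of truncated coefficients is illustrative):

        # DCT-based illumination compensation: take the 2D DCT of the log
        # image, discard low-frequency coefficients, and invert.
        import numpy as np
        from scipy.fft import dctn, idctn

        rng = np.random.default_rng(3)
        img = rng.uniform(0.1, 1.0, size=(64, 64))     # stand-in face image

        log_img = np.log(img + 1e-6)                   # logarithm domain
        coeffs = dctn(log_img, norm="ortho")

        k = 5                                          # low-frequency block to truncate
        coeffs[:k, :k] = 0.0                           # illumination lies mostly here
        # (a common variant restores the DC term to preserve mean brightness)
        normalized = idctn(coeffs, norm="ortho")
        print(normalized.shape, normalized.mean())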

  19. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40% lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60%. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large-scale application to paleo-data.
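
    A minimal sketch of the Gaussian-kernel estimator for two irregularly sampled series (data simulated with a known 2.0 time-unit shift; the kernel width h is illustrative):

        import numpy as np

        rng = np.random.default_rng(4)
        tx = np.sort(rng.uniform(0, 100, 120))       # irregular sampling times of x
        ty = np.sort(rng.uniform(0, 100, 150))       # irregular sampling times of y
        x = np.sin(0.3 * tx) + 0.2 * rng.standard_normal(tx.size)
        y = np.sin(0.3 * (ty - 2.0)) + 0.2 * rng.standard_normal(ty.size)  # y lags x by 2.0

        def kernel_xcorr(tx, x, ty, y, lag, h=1.0):
            """Gaussian-kernel CCF estimate at one lag, without interpolation."""
            x = (x - x.mean()) / x.std()
            y = (y - y.mean()) / y.std()
            dt = ty[None, :] - tx[:, None] - lag     # pairwise offsets from the lag
            w = np.exp(-0.5 * (dt / h) ** 2)         # Gaussian pair weights
            return np.sum(w * np.outer(x, y)) / np.sum(w)

        print(kernel_xcorr(tx, x, ty, y, lag=2.0))   # near the CCF peak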

  20. Radio & Optical Interferometry: Basic Observing Techniques and Data Analysis

    CERN Document Server

    Monnier, John D

    2012-01-01

    Astronomers usually need the highest angular resolution possible, but the blurring effect of diffraction imposes a fundamental limit on the image quality from any single telescope. Interferometry allows light collected at widely-separated telescopes to be combined in order to synthesize an aperture much larger than an individual telescope thereby improving angular resolution by orders of magnitude. Radio and millimeter wave astronomers depend on interferometry to achieve image quality on par with conventional visible and infrared telescopes. Interferometers at visible and infrared wavelengths extend angular resolution below the milli-arcsecond level to open up unique research areas in imaging stellar surfaces and circumstellar environments. In this chapter the basic principles of interferometry are reviewed with an emphasis on the common features for radio and optical observing. While many techniques are common to interferometers of all wavelengths, crucial differences are identified that will help new practi...

  1. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  2. Practice patterns in FNA technique: A survey analysis

    Institute of Scientific and Technical Information of China (English)

    Christopher J DiMaio; Jonathan M Buscaglia; Seth A Gross; Harry R Aslanian; Adam J Goodman; Sammy Ho; Michelle K Kim; Shireen Pais; Felice Schnoll-Sussman; Amrita Sethi; Uzma D Siddiqui; David H Robbins; Douglas G Adler; Satish Nagula

    2014-01-01

    AIM: To ascertain fine needle aspiration (FNA) techniques used by endosonographers with varying levels of experience and in varying environments. METHODS: A survey study was performed on United States-based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers, and how these relate to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents identified themselves as high-volume endoscopic ultrasound (EUS) performers (> 150 EUS/year; 77.1%) and high-volume FNA performers (> 75 FNA/year; 73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low-volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low-volume physicians were more likely to refer patients for either surgical or percutaneous biopsy (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment.

  3. Neutron activation analysis techniques for identifying elemental status in Alzheimer's disease

    International Nuclear Information System (INIS)

    Brain tissue (hippocampus and cerebral cortex) from Alzheimer's disease and control individuals sampled from Eastern Canada and the United Kingdom was analyzed for Ag, Al, As, B, Br, Ca, Cd, Co, Cr, Cs, Cu, Fe, Hg, I, K, La, Mg, Mn, Mo, Ni, Rb, S, Sb, Sc, Se, Si, Sn, Sr, Ti, V and Zn. NAA (thermal and prompt gamma-ray) methods were used. Highly significant differences (probability less than 0.005) for both study areas were shown between Alzheimer's disease and control individuals. No statistical evidence of aluminium accumulation with age was noted. Possible zinc deficiency was observed. (author) 21 refs.; 5 tables

  4. Application of techniques to identify coal-mine and power-generation effects on surface-water quality, San Juan River basin, New Mexico and Colorado

    Science.gov (United States)

    Goetz, C.L.; Abeyta, Cynthia G.; Thomas, E.V.

    1987-01-01

    Numerous analytical techniques were applied to determine water-quality changes in the San Juan River basin upstream of Shiprock, New Mexico. Eight techniques were used to analyze hydrologic data such as precipitation, water quality, and streamflow: (1) Piper diagram, (2) time-series plot, (3) frequency distribution, (4) box-and-whisker plot, (5) seasonal Kendall test, (6) Wilcoxon rank-sum test, (7) SEASRS procedure, and (8) analysis of flow-adjusted specific-conductance data and smoothing. Post-1963 changes in dissolved solids concentration, dissolved potassium concentration, specific conductance, suspended sediment concentration, or suspended sediment load in the San Juan River downstream from the surface coal mines were examined to determine if coal mining was having an effect on the quality of surface water. None of the analytical methods showed any increase in dissolved solids concentration, dissolved potassium concentration, or specific conductance in the river downstream from the mines; some of the methods showed a decrease in dissolved solids concentration and specific conductance. The Chaco River, an ephemeral stream tributary to the San Juan River, undergoes changes in water quality due to effluent from a power generation facility. The discharge of the Chaco River contributes about 1.9% of the average annual discharge at the downstream station, San Juan River at Shiprock, NM. The changes in water quality detected at the Chaco River station were not detected at the downstream Shiprock station. It was not possible, with the available data, to identify any effects of the surface coal mines on water quality that were separable from those of urbanization, agriculture, and other cultural and natural changes. In order to determine the specific causes of changes in water quality, it would be necessary to collect additional data at strategically located stations. (Author's abstract)

  5. Identifying disease candidate genes via large-scale gene network analysis.

    Science.gov (United States)

    Kim, Haseong; Park, Taesung; Gelenbe, Erol

    2014-01-01

    Gene Regulatory Networks (GRNs) provide systematic views of complex living systems, and reliable, large-scale GRNs offer a way to identify disease candidate genes. We introduce a reverse-engineering technique, Bayesian Model Averaging-based Networks (BMAnet), which ensembles all appropriate linear models to tackle uncertainty in model selection and integrates heterogeneous biological data sets. Using network evaluation metrics, we compare the networks thus identified. The metric 'Random walk with restart (Rwr)' is utilised to search for disease genes. In a simulation our method shows better performance than elastic-net and Gaussian graphical models, but topological quantities vary among the three methods. Using real data, brain tumour gene expression samples consisting of non-tumour, grade III and grade IV cases were analysed to estimate networks with a total of 4422 genes. Based on these networks, 169 brain tumour-related candidate genes were identified, some of which were found to relate to 'wound', 'apoptosis', and 'cell death' processes. PMID:25796737

  6. An efficient record linkage scheme using graphical analysis for identifier error detection

    Directory of Open Access Journals (Sweden)

    Peto Tim EA

    2011-02-01

    Full Text Available Abstract Background Integration of information on individuals (record linkage) is a key problem in healthcare delivery, epidemiology, and "business intelligence" applications. It is now common to be required to link very large numbers of records, often containing various combinations of theoretically unique identifiers, such as NHS numbers, which are both incomplete and error-prone. Methods We describe a two-step record linkage algorithm in which identifiers with high cardinality are identified or generated and used to perform an initial exact-match-based linkage. Subsequently, the resulting clusters are studied and, if appropriate, partitioned using a graph-based algorithm that detects erroneous identifiers. Results The system was used to cluster over 250 million health records from five data sources within a large UK hospital group. Linkage, which was completed in about 30 minutes, yielded 3.6 million clusters, of which about 99.8% contain, with high likelihood, records from one patient. Although computationally efficient, the algorithm's requirement that at least one identifier of each record exactly match another for cluster formation may be a limitation in databases containing records of low identifier quality. Conclusions The technique described offers a simple, fast and highly efficient two-step method for large-scale initial linkage of records commonly found in the UK's National Health Service.
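
    A minimal sketch of the two-step idea (assuming networkx; the records and identifiers are invented, and a real partitioning step would use richer checks than date of birth alone):

        import networkx as nx

        records = {
            "r1": {"nhs": "111", "dob": "1970-01-01"},
            "r2": {"nhs": "111", "dob": "1970-01-01"},
            "r3": {"nhs": "222", "dob": "1985-05-05"},
            "r4": {"nhs": "222", "dob": "1962-09-09"},   # same NHS number, different DoB
        }

        g = nx.Graph()
        by_id = {}
        for rec, fields in records.items():
            g.add_node(rec)
            by_id.setdefault(fields["nhs"], []).append(rec)
        for recs in by_id.values():                       # step 1: exact identifier match
            for a, b in zip(recs, recs[1:]):
                g.add_edge(a, b)

        for cluster in nx.connected_components(g):        # step 2: inspect each cluster
            dobs = {records[r]["dob"] for r in cluster}
            flag = " <- possible identifier error" if len(dobs) > 1 else ""
            print(sorted(cluster), flag)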

  7. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper were developed in the process industries, where they have been shown to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  8. Identifying the Drivers and Occurrence of Historical and Future Extreme Air-quality Events in the United States Using Advanced Statistical Techniques

    Science.gov (United States)

    Porter, W. C.; Heald, C. L.; Cooley, D. S.; Russell, B. T.

    2013-12-01

    Episodes of air-quality extremes are known to be heavily influenced by meteorological conditions, but traditional statistical analysis techniques focused on means and standard deviations may not capture important relationships at the tails of these two respective distributions. Using quantile regression (QR) and extreme value theory (EVT), methodologies specifically developed to examine the behavior of heavy-tailed phenomena, we analyze extremes in the multi-decadal record of ozone (O3) and fine particulate matter (PM2.5) in the United States. We investigate observations from the Air Quality System (AQS) and Interagency Monitoring of Protected Visual Environments (IMPROVE) networks for connections to meteorological drivers, as provided by the National Center for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) product. Through regional characterization by quantile behavior and EVT modeling of the meteorological covariates most responsible for extreme levels of O3 and PM2.5, we estimate pollutant exceedance frequencies and uncertainties in the United States under current and projected future climates, highlighting those meteorological covariates and interactions whose influence on air-quality extremes differs most significantly from the behavior of the bulk of the distribution. As current policy may be influenced by air-quality projections, we then compare these estimated frequencies to those produced by NCAR's Community Earth System Model (CESM) identifying regions, covariates, and species whose extreme behavior may not be adequately captured by current models.
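    A hedged sketch of the quantile-regression side of this approach, using statsmodels on synthetic temperature and ozone data; the covariate, units, and quantiles are illustrative, not the study's configuration.

```python
# Quantile regression sketch: fit the 95th percentile of ozone against daily
# maximum temperature and compare it with the median fit. Data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
tmax = rng.uniform(10, 40, size=500)                      # deg C (synthetic)
ozone = 20 + 1.2 * tmax + rng.gamma(2.0, 5.0, size=500)   # ppb, skewed noise

X = sm.add_constant(tmax)
fit_median = sm.QuantReg(ozone, X).fit(q=0.50)
fit_tail = sm.QuantReg(ozone, X).fit(q=0.95)

# A tail slope that exceeds the median slope indicates the covariate matters
# more for extremes than for typical days, the tail/bulk contrast of interest.
print("median slope:", fit_median.params[1])
print("95th-percentile slope:", fit_tail.params[1])
```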

  9. Identifying Effective Spelling Interventions Using a Brief Experimental Analysis and Extended Analysis

    Science.gov (United States)

    McCurdy, Merilee; Clure, Lynne F.; Bleck, Amanda A.; Schmitz, Stephanie L.

    2016-01-01

    Spelling is an important skill that is crucial to effective written communication. In this study, brief experimental analysis procedures were used to examine spelling instruction strategies (e.g., whole word correction; word study strategy; positive practice; and cover, copy, and compare) for four students. In addition, an extended analysis was…

  10. Use of nuclear techniques for coal analysis in exploration, mining and processing

    International Nuclear Information System (INIS)

    Nuclear techniques have a long history of application in the coal industry, during exploration and especially during coal preparation, for the measurement of ash content. The preferred techniques are based on X- and gamma-ray scattering and borehole logging, and on-line equipment incorporating these techniques is now in world-wide routine use. However, gamma-ray techniques are mainly restricted to density measurement and X-ray techniques are principally used for ash determinations. They have a limited range, and when used on-line some size reduction of the coal is usually required and a full elemental analysis is not possible. In particular, X- and gamma-ray techniques are insensitive to the principal elements in the combustible component and to many of the important elements in the mineral fraction. Neutron techniques, on the other hand, have a range which is compatible with on-line requirements, and all elements in the combustible component and virtually all elements in the mineral component can be observed. A complete elemental analysis of coal then allows the ash content and the calorific value to be determined on-line. This paper surveys the various nuclear techniques now in use and gives particular attention to the present state of development of neutron methods and to their advantages and limitations. Although it is shown that considerable further development and operational experience are still required, equipment now being introduced has a performance which matches many of the identified requirements and an early improvement in specification can be anticipated
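    The step from a full elemental analysis to an on-line calorific value can be made concrete with a small sketch. It uses one common form of the classical Dulong correlation, whose coefficients vary slightly between references; the composition values are invented for illustration.

```python
# Hedged sketch: estimating higher heating value (HHV) from an elemental
# analysis using one variant of the Dulong correlation (coefficients differ
# slightly across references). Composition values are illustrative only.
def dulong_hhv(c, h, o, s):
    """HHV in MJ/kg from mass percentages of C, H, O and S (dry basis)."""
    return 0.3383 * c + 1.443 * (h - o / 8.0) + 0.0942 * s

# Example coal: 72% C, 5% H, 10% O, 1% S (remainder ash, moisture, nitrogen)
print(f"{dulong_hhv(72.0, 5.0, 10.0, 1.0):.1f} MJ/kg")   # roughly 29.9 MJ/kg
```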

  11. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program when analyzing nanoparticles, and at the same time we compare it to more conventional nanoparticle analysis techniques. The techniques on which we concentrate here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program appeared to be a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information about the particles, than the manual technique. However, particle shapes that are very different from spherical proved to be problematic for the novel program as well. When compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample. PMID:27030469
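    A sketch of the baseline image-analysis step such programs build on: threshold, label connected components, and measure equivalent diameters. The image is synthetic, and the agglomerate problem noted above would need an extra splitting step (for example a watershed) that is not shown.

```python
# Baseline TEM particle analysis sketch: global threshold, connected-component
# labeling, and equivalent circular diameters. Touching particles end up in
# one label, which is exactly the agglomerate limitation discussed above.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
image = rng.normal(0.2, 0.05, size=(256, 256))   # synthetic background
image[100:120, 100:120] += 0.6                   # synthetic "particle" 1
image[30:40, 200:215] += 0.6                     # synthetic "particle" 2

binary = image > 0.5                             # illustrative threshold
labels, n = ndimage.label(binary)                # connected components
areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
diameters = 2.0 * np.sqrt(areas / np.pi)         # equivalent diameter, pixels
print(n, diameters)
```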

  13. Identifying Skill Requirements for GIS Positions: A Content Analysis of Job Advertisements

    Science.gov (United States)

    Hong, Jung Eun

    2016-01-01

    This study identifies the skill requirements for geographic information system (GIS) positions, including GIS analysts, programmers/developers/engineers, specialists, and technicians, through a content analysis of 946 GIS job advertisements from 2007-2014. The results indicated that GIS job applicants need to possess high levels of GIS analysis…

  14. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    NARCIS (Netherlands)

    B.F. Voight (Benjamin); L.J. Scott (Laura); V. Steinthorsdottir (Valgerdur); A.D. Morris (Andrew); C. Dina (Christian); R.P. Welch (Ryan); E. Zeggini (Eleftheria); C. Huth (Cornelia); Y.S. Aulchenko (Yurii); G. Thorleifsson (Gudmar); L.J. McCulloch (Laura); T. Ferreira (Teresa); H. Grallert (Harald); N. Amin (Najaf); G. Wu (Guanming); C.J. Willer (Cristen); S. Raychaudhuri (Soumya); S.A. McCarroll (Steven); C. Langenberg (Claudia); O.M. Hofmann (Oliver); J. Dupuis (Josée); L. Qi (Lu); A.V. Segrè (Ayellet); M. van Hoek (Mandy); P. Navarro (Pau); K.G. Ardlie (Kristin); B. Balkau (Beverley); R. Benediktsson (Rafn); A.J. Bennett (Amanda); R. Blagieva (Roza); E. Boerwinkle (Eric); L.L. Bonnycastle (Lori); K.B. Boström (Kristina Bengtsson); B. Bravenboer (Bert); S. Bumpstead (Suzannah); N.P. Burtt (Noël); G. Charpentier (Guillaume); P.S. Chines (Peter); M. Cornelis (Marilyn); D.J. Couper (David); G. Crawford (Gabe); A.S.F. Doney (Alex); K.S. Elliott (Katherine); M.R. Erdos (Michael); C.S. Fox (Caroline); C.S. Franklin (Christopher); M. Ganser (Martha); C. Gieger (Christian); N. Grarup (Niels); T. Green (Todd); S. Griffin (Simon); C.J. Groves (Christopher); C. Guiducci (Candace); S. Hadjadj (Samy); N. Hassanali (Neelam); C. Herder (Christian); B. Isomaa (Bo); A.U. Jackson (Anne); P.R.V. Johnson (Paul); T. Jørgensen (Torben); W.H.L. Kao (Wen); N. Klopp (Norman); A. Kong (Augustine); P. Kraft (Peter); J. Kuusisto (Johanna); T. Lauritzen (Torsten); M. Li (Man); A. Lieverse (Aloysius); C.M. Lindgren (Cecilia); V. Lyssenko (Valeriya); M. Marre (Michel); T. Meitinger (Thomas); K. Midthjell (Kristian); M.A. Morken (Mario); N. Narisu (Narisu); P. Nilsson (Peter); K.R. Owen (Katharine); F. Payne (Felicity); J.R.B. Perry (John); A.K. Petersen; C. Platou (Carl); C. Proença (Christine); I. Prokopenko (Inga); W. Rathmann (Wolfgang); N.W. Rayner (Nigel William); N.R. Robertson (Neil); G. Rocheleau (Ghislain); M. Roden (Michael); M.J. Sampson (Michael); R. Saxena (Richa); B.M. Shields (Beverley); P. Shrader (Peter); G. Sigurdsson (Gunnar); T. Sparsø (Thomas); K. Strassburger (Klaus); H.M. Stringham (Heather); Q. Sun (Qi); A.J. Swift (Amy); B. Thorand (Barbara); J. Tichet (Jean); T. Tuomi (Tiinamaija); R.M. van Dam (Rob); T.W. van Haeften (Timon); T.W. van Herpt (Thijs); J.V. van Vliet-Ostaptchouk (Jana); G.B. Walters (Bragi); M.N. Weedon (Michael); C. Wijmenga (Cisca); J.C.M. Witteman (Jacqueline); R.N. Bergman (Richard); S. Cauchi (Stephane); F.S. Collins (Francis); A.L. Gloyn (Anna); U. Gyllensten (Ulf); T. Hansen (Torben); W.A. Hide (Winston); G.A. Hitman (Graham); A. Hofman (Albert); D. Hunter (David); K. Hveem (Kristian); M. Laakso (Markku); K.L. Mohlke (Karen); C.N.A. Palmer (Colin); P.P. Pramstaller (Peter Paul); I. Rudan (Igor); E.J.G. Sijbrands (Eric); L.D. Stein (Lincoln); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); M. Walker (Mark); N.J. Wareham (Nick); G.R. Abecasis (Gonçalo); B.O. Boehm (Bernhard); H. Campbell (Harry); M.J. Daly (Mark); A.T. Hattersley (Andrew); F.B. Hu (Frank); J.B. Meigs (James); J.S. Pankow (James); O. Pedersen (Oluf); H.E. Wichmann (Erich); I. Barroso (Inês); J.C. Florez (Jose); T.M. Frayling (Timothy); L. Groop (Leif); R. Sladek (Rob); U. Thorsteinsdottir (Unnur); J.F. Wilson (James); T. Illig (Thomas); P. Froguel (Philippe); P. Tikka-Kleemola (Päivi); J-A. Zwart (John-Anker); D. Altshuler (David); M. Boehnke (Michael); M.I. McCarthy (Mark); R.M. Watanabe (Richard)

    2010-01-01

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals with combined P < 5 × 10^-8.

  15. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    NARCIS (Netherlands)

    Voight, Benjamin F.; Scott, Laura J.; Steinthorsdottir, Valgerdur; Morris, Andrew P.; Dina, Christian; Welch, Ryan P.; Zeggini, Eleftheria; Huth, Cornelia; Aulchenko, Yurii S.; Thorleifsson, Gudmar; McCulloch, Laura J.; Ferreira, Teresa; Grallert, Harald; Amin, Najaf; Wu, Guanming; Willer, Cristen J.; Raychaudhuri, Soumya; McCarroll, Steve A.; Langenberg, Claudia; Hofmann, Oliver M.; Dupuis, Josee; Qi, Lu; Segre, Ayellet V.; van Hoek, Mandy; Navarro, Pau; Ardlie, Kristin; Balkau, Beverley; Benediktsson, Rafn; Bennett, Amanda J.; Blagieva, Roza; Boerwinkle, Eric; Bonnycastle, Lori L.; Bostrom, Kristina Bengtsson; Bravenboer, Bert; Bumpstead, Suzannah; Burtt, Noisel P.; Charpentier, Guillaume; Chines, Peter S.; Cornelis, Marilyn; Couper, David J.; Crawford, Gabe; Doney, Alex S. F.; Elliott, Katherine S.; Elliott, Amanda L.; Erdos, Michael R.; Fox, Caroline S.; Franklin, Christopher S.; Ganser, Martha; Gieger, Christian; Grarup, Niels; Green, Todd; Griffin, Simon; Groves, Christopher J.; Guiducci, Candace; Hadjadj, Samy; Hassanali, Neelam; Herder, Christian; Isomaa, Bo; Jackson, Anne U.; Johnson, Paul R. V.; Jorgensen, Torben; Kao, Wen H. L.; Klopp, Norman; Kong, Augustine; Kraft, Peter; Kuusisto, Johanna; Lauritzen, Torsten; Li, Man; Lieverse, Aloysius; Lindgren, Cecilia M.; Lyssenko, Valeriya; Marre, Michel; Meitinger, Thomas; Midthjell, Kristian; Morken, Mario A.; Narisu, Narisu; Nilsson, Peter; Owen, Katharine R.; Payne, Felicity; Perry, John R. B.; Petersen, Ann-Kristin; Platou, Carl; Proenca, Christine; Prokopenko, Inga; Rathmann, Wolfgang; Rayner, N. William; Robertson, Neil R.; Rocheleau, Ghislain; Roden, Michael; Sampson, Michael J.; Saxena, Richa; Shields, Beverley M.; Shrader, Peter; Sigurdsson, Gunnar; Sparso, Thomas; Strassburger, Klaus; Stringham, Heather M.; Sun, Qi; Swift, Amy J.; Thorand, Barbara; Tichet, Jean; Tuomi, Tiinamaija; van Dam, Rob M.; van Haeften, Timon W.; van Herpt, Thijs; van Vliet-Ostaptchouk, Jana V.; Walters, G. Bragi; Weedon, Michael N.; Wijmenga, Cisca; Witteman, Jacqueline; Bergman, Richard N.; Cauchi, Stephane; Collins, Francis S.; Gloyn, Anna L.; Gyllensten, Ulf; Hansen, Torben; Hide, Winston A.; Hitman, Graham A.; Hofman, Albert; Hunter, David J.; Hveem, Kristian; Laakso, Markku; Mohlke, Karen L.; Morris, Andrew D.; Palmer, Colin N. A.; Pramstaller, Peter P.; Rudan, Igor; Sijbrands, Eric; Stein, Lincoln D.; Tuomilehto, Jaakko; Uitterlinden, Andre; Walker, Mark; Wareham, Nicholas J.; Watanabe, Richard M.; Abecasis, Goncalo R.; Boehm, Bernhard O.; Campbell, Harry; Daly, Mark J.; Hattersley, Andrew T.; Hu, Frank B.; Meigs, James B.; Pankow, James S.; Pedersen, Oluf; Wichmann, H-Erich; Barroso, Ines; Florez, Jose C.; Frayling, Timothy M.; Groop, Leif; Sladek, Rob; Thorsteinsdottir, Unnur; Wilson, James F.; Illig, Thomas; Froguel, Philippe; van Duijn, Cornelia M.; Stefansson, Kari; Altshuler, David; Boehnke, Michael; McCarthy, Mark I.

    2010-01-01

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals with combined P < 5 × 10^-8.

  16. Genome-wide association study meta-analysis identifies seven new rheumatoid arthritis risk loci

    NARCIS (Netherlands)

    Stahl, Eli A.; Raychaudhuri, Soumya; Remmers, Elaine F.; Xie, Gang; Eyre, Stephen; Thomson, Brian P.; Li, Yonghong; Kurreeman, Fina A. S.; Zhernakova, Alexandra; Hinks, Anne; Guiducci, Candace; Chen, Robert; Alfredsson, Lars; Amos, Christopher I.; Ardlie, Kristin G.; Barton, Anne; Bowes, John; Brouwer, Elisabeth; Burtt, Noel P.; Catanese, Joseph J.; Coblyn, Jonathan; Coenen, Marieke J. H.; Costenbader, Karen H.; Criswell, Lindsey A.; Crusius, J. Bart A.; Cui, Jing; de Bakker, Paul I. W.; De Jager, Philip L.; Ding, Bo; Emery, Paul; Flynn, Edward; Harrison, Pille; Hocking, Lynne J.; Huizinga, Tom W. J.; Kastner, Daniel L.; Ke, Xiayi; Lee, Annette T.; Liu, Xiangdong; Martin, Paul; Morgan, Ann W.; Padyukov, Leonid; Posthumus, Marcel D.; Radstake, Timothy R. D. J.; Reid, David M.; Seielstad, Mark; Seldin, Michael F.; Shadick, Nancy A.; Steer, Sophia; Tak, Paul P.; Thomson, Wendy; van der Helm-van Mil, Annette H. M.; van der Horst-Bruinsma, Irene E.; van der Schoot, C. Ellen; van Riel, Piet L. C. M.; Weinblatt, Michael E.; Wilson, Anthony G.; Wolbink, Gert Jan; Wordsworth, B. Paul; Wijmenga, Cisca; Karlson, Elizabeth W.; Toes, Rene E. M.; de Vries, Niek; Begovich, Ann B.; Worthington, Jane; Siminovitch, Katherine A.; Gregersen, Peter K.; Klareskog, Lars; Plenge, Robert M.

    2010-01-01

    To identify new genetic risk factors for rheumatoid arthritis, we conducted a genome-wide association study meta-analysis of 5,539 autoantibody-positive individuals with rheumatoid arthritis (cases) and 20,169 controls of European descent, followed by replication in an independent set of 6,768 rheumatoid arthritis cases and 8,806 controls.

  17. Bioinformatics analysis identifies several intrinsically disordered human E3 ubiquitin-protein ligases

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Nielsen, Sofie V; Lindorff-Larsen, Kresten;

    2016-01-01

    We conduct a bioinformatics analysis to examine >600 human and S. cerevisiae E3 ligases to identify enzymes that are similar to San1 in terms of function and/or mechanism of substrate recognition. An initial sequence-based database search was found to detect candidates primarily based on the homology...

  18. Clinical Trial Registries Are of Minimal Use for Identifying Selective Outcome and Analysis Reporting

    Science.gov (United States)

    Norris, Susan L.; Holmer, Haley K.; Fu, Rongwei; Ogden, Lauren A.; Viswanathan, Meera S.; Abou-Setta, Ahmed M.

    2014-01-01

    Objective: This study aimed to examine selective outcome reporting (SOR) and selective analysis reporting (SAR) in randomized controlled trials (RCTs) and to explore the usefulness of trial registries for identifying SOR and SAR. Study Design and Setting: We selected one "index outcome" for each of three comparative effectiveness reviews…

  19. Genome-wide association scan meta-analysis identifies three loci influencing adiposity and fat distribution

    NARCIS (Netherlands)

    C.M. Lindgren (Cecilia); I.M. Heid (Iris); J.C. Randall (Joshua); C. Lamina (Claudia); V. Steinthorsdottir (Valgerdur); L. Qi (Lu); E.K. Speliotes (Elizabeth); G. Thorleifsson (Gudmar); C.J. Willer (Cristen); B.M. Herrera (Blanca); A.U. Jackson (Anne); N. Lim (Noha); P. Scheet (Paul); N. Soranzo (Nicole); N. Amin (Najaf); Y.S. Aulchenko (Yurii); J.C. Chambers (John); A. Drong (Alexander); J. Luan; H.N. Lyon (Helen); F. Rivadeneira Ramirez (Fernando); S. Sanna (Serena); N. Timpson (Nicholas); M.C. Zillikens (Carola); H.Z. Jing; P. Almgren (Peter); S. Bandinelli (Stefania); A.J. Bennett (Amanda); R.N. Bergman (Richard); L.L. Bonnycastle (Lori); S. Bumpstead (Suzannah); S.J. Chanock (Stephen); L. Cherkas (Lynn); P.S. Chines (Peter); L. Coin (Lachlan); C. Cooper (Charles); G. Crawford (Gabe); A. Doering (Angela); A. Dominiczak (Anna); A.S.F. Doney (Alex); S. Ebrahim (Shanil); P. Elliott (Paul); M.R. Erdos (Michael); K. Estrada Gil (Karol); L. Ferrucci (Luigi); G. Fischer (Guido); N.G. Forouhi (Nita); C. Gieger (Christian); H. Grallert (Harald); C.J. Groves (Christopher); S.M. Grundy (Scott); C. Guiducci (Candace); D. Hadley (David); A. Hamsten (Anders); A.S. Havulinna (Aki); A. Hofman (Albert); R. Holle (Rolf); J.W. Holloway (John); T. Illig (Thomas); B. Isomaa (Bo); L.C. Jacobs (Leonie); K. Jameson (Karen); P. Jousilahti (Pekka); F. Karpe (Fredrik); J. Kuusisto (Johanna); J. Laitinen (Jaana); G.M. Lathrop (Mark); D.A. Lawlor (Debbie); M. Mangino (Massimo); W.L. McArdle (Wendy); T. Meitinger (Thomas); M.A. Morken (Mario); A.P. Morris (Andrew); P. Munroe (Patricia); N. Narisu (Narisu); A. Nordström (Anna); B.A. Oostra (Ben); C.N.A. Palmer (Colin); F. Payne (Felicity); J. Peden (John); I. Prokopenko (Inga); F. Renström (Frida); A. Ruokonen (Aimo); V. Salomaa (Veikko); M.S. Sandhu (Manjinder); L.J. Scott (Laura); A. Scuteri (Angelo); K. Silander (Kaisa); K. Song (Kijoung); X. Yuan (Xin); H.M. Stringham (Heather); A.J. Swift (Amy); T. Tuomi (Tiinamaija); M. Uda (Manuela); P. Vollenweider (Peter); G. Waeber (Gérard); C. Wallace (Chris); G.B. Walters (Bragi); M.N. Weedon (Michael); J.C.M. Witteman (Jacqueline); C. Zhang (Cuilin); M. Caulfield (Mark); F.S. Collins (Francis); G.D. Smith; I.N.M. Day (Ian); P.W. Franks (Paul); A.T. Hattersley (Andrew); F.B. Hu (Frank); M.R. Jarvelin; A. Kong (Augustine); J.S. Kooner (Jaspal); M. Laakso (Markku); E. Lakatta (Edward); V. Mooser (Vincent); L. Peltonen (Leena Johanna); N.J. Samani (Nilesh); T.D. Spector (Timothy); D.P. Strachan (David); T. Tanaka (Toshiko); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); P. Tikka-Kleemola (Päivi); N.J. Wareham (Nick); H. Watkins (Hugh); D. Waterworth (Dawn); M. Boehnke (Michael); P. Deloukas (Panagiotis); L. Groop (Leif); D.J. Hunter (David); U. Thorsteinsdottir (Unnur); D. Schlessinger (David); H.E. Wichmann (Erich); T.M. Frayling (Timothy); G.R. Abecasis (Gonçalo); J.N. Hirschhorn (Joel); R.J.F. Loos (Ruth); J-A. Zwart (John-Anker); K.L. Mohlke (Karen); I. Barroso (Inês); M.I. McCarthy (Mark)

    2009-01-01

    To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence...

  20. Using Latent Class Analysis to Identify Academic and Behavioral Risk Status in Elementary Students

    Science.gov (United States)

    King, Kathleen R.; Lembke, Erica S.; Reinke, Wendy M.

    2016-01-01

    Identifying classes of children on the basis of academic and behavior risk may have important implications for the allocation of intervention resources within Response to Intervention (RTI) and Multi-Tiered System of Support (MTSS) models. Latent class analysis (LCA) was conducted with a sample of 517 third grade students. Fall screening scores in…

  1. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  2. Identifying sustainability issues using participatory SWOT analysis - A case study of egg production in the Netherlands

    NARCIS (Netherlands)

    Mollenhorst, H.; Boer, de I.J.M.

    2004-01-01

    The aim of this paper was to demonstrate how participatory strengths, weaknesses, opportunities and threats (SWOT) analysis can be used to identify relevant economic, ecological and societal (EES) issues for the assessment of sustainable development. This is illustrated by the case of egg production in the Netherlands.

  3. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    DEFF Research Database (Denmark)

    Voight, Benjamin F; Scott, Laura J; Steinthorsdottir, Valgerdur;

    2010-01-01

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals with combi...

  4. Identifying Contingency Requirements using Obstacle Analysis on an Unpiloted Aerial Vehicle

    Science.gov (United States)

    Lutz, Robyn R.; Nelson, Stacy; Patterson-Hine, Ann; Frost, Chad R.; Tal, Doron

    2005-01-01

    This paper describes experience using Obstacle Analysis to identify contingency requirements on an unpiloted aerial vehicle. A contingency is an operational anomaly, and may or may not involve component failure. The challenges to this effort were: (1) rapid evolution of the system while operational, (2) incremental autonomy as capabilities were transferred from ground control to software control, and (3) the eventual safety-criticality of such systems as they begin to fly over populated areas. The results reported here are preliminary but show that Obstacle Analysis helped (1) identify new contingencies that appeared as autonomy increased; (2) identify new alternatives for handling both previously known and new contingencies; and (3) investigate the continued validity of existing software requirements for contingency handling. Since many mobile, intelligent systems are built using a development process that poses the same challenges, the results appear to have applicability to other similar systems.

  5. Gene expression meta-analysis identifies chromosomal regions involved in ovarian cancer survival

    DEFF Research Database (Denmark)

    Thomassen, Mads; Jochumsen, Kirsten M; Mogensen, Ole;

    2009-01-01

    Ovarian cancer cells exhibit complex karyotypic alterations causing deregulation of numerous genes. Some of these genes are probably causal for cancer formation and local growth, whereas others are causal for metastasis and recurrence. By using publicly available data sets, we have investigated the relation of gene expression and chromosomal position to identify chromosomal regions of importance for early recurrence of ovarian cancer. By use of Gene Set Enrichment Analysis, we have ranked chromosomal regions according to their association to survival. Over-representation analysis including 1-4 consecutive cytogenetic bands identified regions with increased expression for chromosome 5q12-14, and a very large region of chromosome 7 with the strongest signal at 7p15-13, among tumors from short-living patients. Reduced gene expression was identified at 4q26-32, 6p12-q15, 9p21-q32, and 11p14-11. We...

  6. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    Science.gov (United States)

    Villavicencio, A. L. C. H.; Mancini-Filho, J.; Delincée, H.

    1998-06-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macaçar, were irradiated using a 60Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans as compared to non-irradiated beans. Secondly, DNA fragmentation was studied using microgel electrophoresis. Irradiated cells produced typical comets with DNA fragments migrating towards the anode; DNA of non-irradiated cells exhibited a limited migration. Thirdly, electron spin resonance for detection of cellulose radicals was tested, since these free radicals were expected to be quite stable in solid and dry foods; however, a small signal could be detected only in beans irradiated with 10 kGy. Fourthly, thermoluminescence, a method to analyze mineral debris adhering to food, turned out to be a good choice for detecting irradiation effects in beans, even after 6 months of storage. The results indicate that three of the four techniques proposed can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy).

  7. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    Energy Technology Data Exchange (ETDEWEB)

    Villavicencio, A.L.C.H.; Mancini-Filho, J.; Delincee, H

    1998-06-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macacar, were irradiated using a {sup 60}Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans as compared to non-irradiated beans. Secondly, DNA fragmentation was studied using microgel electrophoresis. Irradiated cells produced typical comets with DNA fragments migrating towards the anode; DNA of non-irradiated cells exhibited a limited migration. Thirdly, electron spin resonance for detection of cellulose radicals was tested, since these free radicals were expected to be quite stable in solid and dry foods; however, a small signal could be detected only in beans irradiated with 10 kGy. Fourthly, thermoluminescence, a method to analyze mineral debris adhering to food, turned out to be a good choice for detecting irradiation effects in beans, even after 6 months of storage. The results indicate that three of the four techniques proposed can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy).

  8. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
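    One of the nonlinear measures mentioned, the finite maximal Lyapunov exponent, can be sketched with a simplified Rosenstein-style estimator; the embedding parameters and test signal below are illustrative choices, not a validated gait-analysis configuration.

```python
# Simplified Rosenstein-style estimate of the maximal Lyapunov exponent:
# delay-embed the signal, find each point's nearest neighbor outside a
# temporal exclusion window, and fit the slope of mean log divergence.
import numpy as np

def max_lyapunov(x, dim=4, tau=8, window=30, horizon=40):
    n = len(x) - (dim - 1) * tau               # number of embedded points
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    m = n - horizon                            # points trackable forward in time
    div = np.zeros(horizon)
    count = 0
    for i in range(m):
        d = np.linalg.norm(emb[:m] - emb[i], axis=1)
        d[max(0, i - window):i + window + 1] = np.inf  # exclude temporal neighbors
        j = int(np.argmin(d))
        sep = np.linalg.norm(emb[i:i + horizon] - emb[j:j + horizon], axis=1)
        sep[sep == 0.0] = 1e-12                # avoid log(0)
        div += np.log(sep)
        count += 1
    # Slope of mean log-divergence vs. time is the exponent estimate (1/sample)
    return np.polyfit(np.arange(horizon), div / count, 1)[0]

x = np.sin(0.05 * np.arange(2000))
x += 0.01 * np.random.default_rng(3).standard_normal(x.size)
print(max_lyapunov(x))                         # near zero for a periodic signal
```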

  9. Metabolites production improvement by identifying minimal genomes and essential genes using flux balance analysis.

    Science.gov (United States)

    Salleh, Abdul Hakim Mohamed; Mohamad, Mohd Saberi; Deris, Safaai; Illias, Rosli Md

    2015-01-01

    With advances in metabolic engineering technologies, the genome of a host organism can be reconstructed to achieve desired phenotypes. However, due to the complexity and size of a genome-scale metabolic network, significant components tend to remain invisible. We propose an approach to improve metabolite production that consists of two steps: first, finding the essential genes and identifying the minimal genome through a single-gene-deletion process using Flux Balance Analysis (FBA), and second, identifying the significant pathway for metabolite production using gene expression data. A genome-scale model of Saccharomyces cerevisiae for production of vanillin and acetate is used to test this approach. The results show that this approach reliably finds essential genes, reduces genome size and identifies production pathways that can further optimise the production yield. The identified genes and pathways are extendable to other applications, especially in strain optimisation. PMID:26489144
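    A minimal FBA sketch with a single-gene (here, single-reaction) deletion on an invented three-reaction network; genome-scale work would use a dedicated toolbox, but the underlying linear program has the same structure.

```python
# Toy FBA: maximize biomass flux subject to steady state S v = 0 and flux
# bounds; a "gene deletion" clamps one reaction's flux to zero. The network
# (R1: uptake of A; R2: A -> B; R3: biomass from B) is invented.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1.0, -1.0, 0.0],     # metabolite A balance
              [0.0, 1.0, -1.0]])    # metabolite B balance
c = np.array([0.0, 0.0, -1.0])      # maximize v3 (linprog minimizes, so negate)
bounds = [(0, 10), (0, 10), (0, 10)]

wild_type = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("wild-type biomass flux:", wild_type.x[2])

# Delete the gene for R2: flux forced to zero, growth collapses, so R2 is
# classified as essential and must be retained in any minimal genome.
ko_bounds = [(0, 10), (0, 0), (0, 10)]
knockout = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=ko_bounds)
print("knockout biomass flux:", knockout.x[2])
```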

  10. Gene expression meta-analysis identifies metastatic pathways and transcription factors in breast cancer

    International Nuclear Information System (INIS)

    Metastasis is believed to progress in several steps including different pathways, but the determination and understanding of these mechanisms is still fragmentary. Microarray analysis of gene expression patterns in breast tumors has been used to predict outcome in recent studies. Besides classification of outcome, these global expression patterns may reflect biological mechanisms involved in metastasis of breast cancer. Our purpose has been to investigate pathways and transcription factors involved in metastasis by use of gene expression data sets. We have analyzed 8 publicly available gene expression data sets. A global approach, 'gene set enrichment analysis', as well as an approach focusing on a subset of significantly differently regulated genes, GenMAPP, has been applied to rank pathway gene sets according to differential regulation in metastasizing tumors compared to non-metastasizing tumors. Meta-analysis has been used to determine over-representation of pathways and transcription factor targets, concordantly deregulated in metastasizing breast tumors, in several data sets. The major findings are up-regulation of cell cycle pathways and a metabolic shift towards glucose metabolism reflected in several pathways in metastasizing tumors. Growth factor pathways seem to play dual roles; EGF and PDGF pathways are decreased, while VEGF and sex-hormone pathways are increased in tumors that metastasize. Furthermore, migration, proteasome, immune system, angiogenesis, DNA repair and several signal transduction pathways are associated with metastasis. Finally, several transcription factors, e.g. E2F, NFY, and YY1, are identified as being involved in metastasis. By pathway meta-analysis many biological mechanisms beyond major characteristics such as proliferation are identified. Transcription factor analysis identifies a number of key factors that support central pathways. Several previously proposed treatment targets are identified, and several new pathways that may...

  11. A cross-species genetic analysis identifies candidate genes for mouse anxiety and human bipolar disorder.

    Science.gov (United States)

    Ashbrook, David G; Williams, Robert W; Lu, Lu; Hager, Reinmar

    2015-01-01

    Bipolar disorder (BD) is a significant neuropsychiatric disorder with a lifetime prevalence of ~1%. To identify genetic variants underlying BD, genome-wide association studies (GWAS) have been carried out. While many variants of small effect associated with BD have been identified, few have yet been confirmed, partly because of the low power of GWAS due to multiple comparisons being made. Complementary mapping studies using murine models have identified genetic variants for behavioral traits linked to BD, often with high power, but these identified regions often contain too many genes for clear identification of candidate genes. In the current study we have aligned human BD GWAS results and mouse linkage studies to help define and evaluate candidate genes linked to BD, seeking to use the power of the mouse mapping with the precision of GWAS. We use quantitative trait mapping for open field test and elevated zero maze data in the largest mammalian model system, the BXD recombinant inbred mouse population, to identify genomic regions associated with these BD-like phenotypes. We then investigate these regions in whole genome data from the Psychiatric Genomics Consortium's bipolar disorder GWAS to identify candidate genes associated with BD. Finally we establish the biological relevance and pathways of these genes in a comprehensive systems genetics analysis. We identify four genes associated with both mouse anxiety and human BD. While TNR is a novel candidate for BD, we can confirm previously suggested associations with CMYA5, MCTP1, and RXRG. A cross-species, systems genetics analysis shows that MCTP1, RXRG, and TNR coexpress with genes linked to psychiatric disorders and identifies the striatum as a potential site of action. CMYA5, MCTP1, RXRG, and TNR are associated with mouse anxiety and human BD. We hypothesize that MCTP1, RXRG, and TNR influence intercellular signaling in the striatum. PMID:26190982

  12. A cross-species genetic analysis identifies candidate genes for mouse anxiety and human bipolar disorder

    Directory of Open Access Journals (Sweden)

    David G Ashbrook

    2015-07-01

    Full Text Available Bipolar disorder (BD) is a significant neuropsychiatric disorder with a lifetime prevalence of ~1%. To identify genetic variants underlying BD, genome-wide association studies (GWAS) have been carried out. While many variants of small effect associated with BD have been identified, few have yet been confirmed, partly because of the low power of GWAS due to multiple comparisons being made. Complementary mapping studies using murine models have identified genetic variants for behavioral traits linked to BD, often with high power, but these identified regions often contain too many genes for clear identification of candidate genes. In the current study we have aligned human BD GWAS results and mouse linkage studies to help define and evaluate candidate genes linked to BD, seeking to use the power of the mouse mapping with the precision of GWAS. We use quantitative trait mapping for open field test and elevated zero maze data in the largest mammalian model system, the BXD recombinant inbred mouse population, to identify genomic regions associated with these BD-like phenotypes. We then investigate these regions in whole genome data from the Psychiatric Genomics Consortium's bipolar disorder GWAS to identify candidate genes associated with BD. Finally we establish the biological relevance and pathways of these genes in a comprehensive systems genetics analysis. We identify four genes associated with both mouse anxiety and human BD. While TNR is a novel candidate for BD, we can confirm previously suggested associations with CMYA5, MCTP1 and RXRG. A cross-species, systems genetics analysis shows that MCTP1, RXRG and TNR coexpress with genes linked to psychiatric disorders and identifies the striatum as a potential site of action. CMYA5, MCTP1, RXRG and TNR are associated with mouse anxiety and human BD. We hypothesize that MCTP1, RXRG and TNR influence intercellular signaling in the striatum.

  13. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Directory of Open Access Journals (Sweden)

    Mahmoud I. Al-Kadi

    2013-05-01

    Full Text Available Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provide an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  14. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    Science.gov (United States)

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provide an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.
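    One standard step in this kind of processing chain, spectral feature extraction after noise removal, can be sketched with Welch's method; the sampling rate, band edges, and synthetic signal below are illustrative, not a published depth-of-anesthesia algorithm.

```python
# EEG band-power sketch: estimate the power spectral density with Welch's
# method and integrate it over a frequency band. The signal is synthetic.
import numpy as np
from scipy.signal import welch

fs = 250.0                                     # sampling rate, Hz
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(4)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # ~alpha

freqs, psd = welch(eeg, fs=fs, nperseg=1024)

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])    # integrate PSD over the band

total = band_power(freqs, psd, 0.5, 40.0)
alpha = band_power(freqs, psd, 8.0, 13.0)
print("relative alpha power:", alpha / total)  # one candidate depth feature
```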

  15. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high-temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high-temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture, even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate constant and thermodynamic data do not allow us to a priori predict products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  16. Image analysis technique applied to lock-exchange gravity currents

    Science.gov (United States)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
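    The per-pixel calibration described here can be sketched as a linear fit of greyscale value against known dye concentration at every pixel, inverted on experiment frames; the array sizes and synthetic "images" below are placeholders, not the experimental data.

```python
# Per-pixel calibration sketch: fit grey = a * concentration + b at each
# pixel from K uniform calibration images, then invert on an experiment frame.
import numpy as np

K, H, W = 5, 120, 400
concentrations = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # arbitrary units
rng = np.random.default_rng(5)
gain = rng.uniform(-80, -40, size=(H, W))                 # darker when dyed
offset = rng.uniform(180, 220, size=(H, W))
calib = offset + gain * concentrations[:, None, None]     # synthetic stack

# Closed-form least-squares line per pixel.
cbar = concentrations.mean()
a = ((concentrations[:, None, None] - cbar) * (calib - calib.mean(0))).sum(0) \
    / ((concentrations - cbar) ** 2).sum()
b = calib.mean(0) - a * cbar

frame = offset + gain * 0.4                               # "experiment" image
density_field = (frame - b) / a                           # invert per pixel
print(density_field.mean())                               # ~0.4 everywhere
```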

  17. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0nbb-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  18. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  19. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
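    The core compressed-sensing step can be sketched with orthogonal matching pursuit over a dictionary of sinusoids at candidate periods; this omits the Gaussian-process noise handling of the published method, and the epochs, periods, and amplitudes below are synthetic.

```python
# Sparse recovery sketch: express an irregularly sampled radial-velocity
# series in a cosine/sine dictionary and recover a few dominant periods.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 200, size=120))          # irregular epochs (days)
rv = 3.0 * np.sin(2 * np.pi * t / 12.3) \
     + 1.5 * np.sin(2 * np.pi * t / 41.0) \
     + 0.3 * rng.standard_normal(t.size)            # two "planets" plus noise

periods = np.linspace(2, 100, 2000)
freqs = 2 * np.pi / periods
D = np.hstack([np.cos(np.outer(t, freqs)), np.sin(np.outer(t, freqs))])
D /= np.linalg.norm(D, axis=0)                      # unit-norm dictionary atoms

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=4).fit(D, rv)
amps = np.hypot(omp.coef_[:2000], omp.coef_[2000:]) # combine cos/sin pairs
print(periods[np.argsort(amps)[-2:]])               # ideally near 12.3 and 41
```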

  20. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  1. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
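    Pulse shape analysis for point-contact detectors is commonly built on an A/E-style parameter, the maximum current amplitude divided by the event energy; the sketch below computes it on synthetic single-site and multi-site charge pulses and is not the collaboration's actual pipeline.

```python
# A/E pulse-shape sketch: current pulse = derivative of the charge waveform;
# energy = pulse height. Multi-site (gamma-like) events have lower A/E.
import numpy as np

def a_over_e(charge_waveform):
    current = np.gradient(charge_waveform)     # charge pulse -> current pulse
    energy = charge_waveform.max()             # uncalibrated energy proxy
    return current.max() / energy

t = np.linspace(0, 10, 1000)
single_site = 1.0 / (1.0 + np.exp(-(t - 5.0) / 0.1))      # one fast rise
multi_site = 0.5 / (1.0 + np.exp(-(t - 4.0) / 0.1)) \
           + 0.5 / (1.0 + np.exp(-(t - 6.0) / 0.1))        # two separated rises

# The single-site pulse shows roughly twice the A/E of the multi-site pulse,
# which is the basis for rejecting gamma background with an A/E cut.
print(a_over_e(single_site), a_over_e(multi_site))
```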

  2. Expert rowers’ motion analysis for synthesis and technique digitalization

    Directory of Open Access Journals (Sweden)

    Filippeschi Alessandro

    2011-12-01

    Full Text Available Four expert rowers' gestures were gathered on the SPRINT rowing platform with the aid of an optical motion tracking system. Data were analyzed in order to obtain a digital representation of the features involved in rowing. Moreover, these data provide a dataset for developing digital models for rowing motion synthesis. Rowers were modeled as kinematic chains, and data were processed to obtain the position and orientation of upper-body limbs. This representation was combined with SPRINT data in order to evaluate features found in the literature, to find new ones, and to build models for the generation of rowing motion. The analysis shows the effectiveness of the motion reconstruction and two examples of technique features: stroke timing and upper-limb orientation during the finish phase.

  3. Identifying Critical Factors of Sale Failure on Commercial Property Types, Shop Houses by Using Multi Attribute Variable Technique

    OpenAIRE

    N.I. Mohamad; N. M. Tawil; I.M. Usman; M. M. Tahir

    2014-01-01

    The focus of this research is to identify the critical factors behind the sale failure of shop houses in Bandar Baru Nilai and, further, the critical factors behind the sale failure of commercial property types (shop houses) in new townships, as reports by the Valuation and Property Services Department (JPPH) showed 5,931 units of shop houses in Malaysia are currently completed but remain unsold, with Johor recording the highest number of unsold units, followed by Negeri Sembilan. Bandar Baru Nilai (a dis...

  4. Evaluation of geophysical techniques for identifying fractures in program wells in Deaf Smith County, Texas: Revision 1, Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, R.P.; Siminitz, P.C.

    1987-08-01

    Quantitative information about the presence and orientation of fractures is essential for the understanding of the geomechanical and geohydrological behavior of rocks. This report evaluates various borehole geophysical techniques for characterizing fractures in three Civilian Radioactive Waste Management (CRWM) Program test wells in the Palo Duro Basin in Deaf Smith County, Texas. Emphasis has been placed on the Schlumberger Fracture Identification Log (FIL) which detects vertical fractures and provides data for calculation of orientation. Depths of FIL anomalies were compared to available core. It was found that the application of FIL results to characterize fracture frequency or orientation is inappropriate at this time. The uncertainties associated with the FIL information render the information unreliable. No geophysical logging tool appears to unequivocally determine the location and orientation of fractures in a borehole. Geologic mapping of the exploratory shafts will ultimately provide the best data on fracture frequency and orientation at the proposed repository site. 22 refs., 6 figs., 3 tabs.

  5. A portable system for identifying urinary tract infection in primary care using a PC-based chromatic technique

    International Nuclear Information System (INIS)

    An approach is described for monitoring urine samples using a portable system based on chromatic techniques and for predicting urinary tract infection (UTI) from the results. The system uses a webcam–computer combination with the screen of a computer visual display unit as a tuneable illumination source. It is shown that the system can operate in a robust manner under ambient lighting conditions, with potential for use as a point-of-care test in primary care. The present approach combines information on urine liquid concentration and turbidity. Its performance in an exploratory study is compared with microbiological culture of 200 urine samples, of which 79 had bacterial growth >10^5 colony-forming units per millilitre (cfu ml^-1), indicative of UTI. It is shown that a sensitivity and a negative predictive value of 0.92 could both be achieved. (paper)
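    The chromatic reduction of a webcam image to hue, lightness, and saturation summary values can be sketched as follows; the synthetic "image" and the thresholds are invented and do not reflect the published calibration.

```python
# Chromatic-monitoring sketch: summarize the sample region of a webcam image
# as H, L, S values, then apply illustrative concentration/turbidity rules.
import colorsys
import numpy as np

rng = np.random.default_rng(7)
image = np.clip(rng.normal([0.75, 0.65, 0.25], 0.05, size=(100, 100, 3)), 0, 1)

r, g, b = image.reshape(-1, 3).mean(axis=0)    # mean RGB over sample region
h, l, s = colorsys.rgb_to_hls(r, g, b)         # chromatic H, L, S parameters

turbid = l < 0.5 and s < 0.4                   # invented turbidity rule
concentrated = h < 0.12                        # deep yellow -> concentrated
print(f"H={h:.2f} L={l:.2f} S={s:.2f}, flag={turbid or concentrated}")
```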

  6. Evaluation of geophysical techniques for identifying fractures in program wells in Deaf Smith County, Texas: Revision 1, Topical report

    International Nuclear Information System (INIS)

    Quantitative information about the presence and orientation of fractures is essential for the understanding of the geomechanical and geohydrological behavior of rocks. This report evaluates various borehole geophysical techniques for characterizing fractures in three Civilian Radioactive Waste Management (CRWM) Program test wells in the Palo Duro Basin in Deaf Smith County, Texas. Emphasis has been placed on the Schlumberger Fracture Identification Log (FIL) which detects vertical fractures and provides data for calculation of orientation. Depths of FIL anomalies were compared to available core. It was found that the application of FIL results to characterize fracture frequency or orientation is inappropriate at this time. The uncertainties associated with the FIL information render the information unreliable. No geophysical logging tool appears to unequivocally determine the location and orientation of fractures in a borehole. Geologic mapping of the exploratory shafts will ultimately provide the best data on fracture frequency and orientation at the proposed repository site. 22 refs., 6 figs., 3 tabs

  7. Identifying Critical Factors of Sale Failure on Commercial Property Types, Shop Houses by Using Multi Attribute Variable Technique

    Directory of Open Access Journals (Sweden)

    N.I. Mohamad

    2014-04-01

    Full Text Available The focus of this research is to identify the critical factors of shop-house sale failure in Bandar Baru Nilai and, beyond that, the critical factors of sale failure for commercial property types (shop houses) in new townships. A report by the Valuation and Property Services Department (JPPH) showed that 5,931 shop-house units in Malaysia are completed but remain unsold, with Johor recording the highest number of unsold units, followed by Negeri Sembilan. Bandar Baru Nilai (a district of Negeri Sembilan) was chosen as the research sample for unsold shop-house units because, despite its strategic location near KLIA, the Sepang International Circuit, and educational institutions, and despite being surrounded by housing schemes, it still has a number of unsold units. Research data were obtained from a literature review and survey questionnaires distributed among developers, the local authority, purchasers/tenants, and local residents. The Relative Importance Index (RII) method was applied to identify the critical factors of shop-house sale failure. Generally, the factors of sale failure are the economy, demography, politics, location and access, public and basic facilities, financial loans, the physical product, the current stock of shop houses upon completion, the future potential of subsale and rental, the developer’s background, promotion and marketing, speculation, and timing.
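    The Relative Importance Index used above reduces to a one-line formula, RII = ΣW / (A × N), where W are the Likert weights given by respondents, A is the highest possible weight, and N is the number of respondents. A minimal sketch follows; the factor names and response data are hypothetical, not taken from the study.

```python
# Relative Importance Index (RII) sketch for ranking failure factors.
# Assumes a 1-5 Likert scale; all data below are invented for illustration.

def rii(responses, max_weight=5):
    """RII = sum(W) / (A * N): W = respondent weights, A = highest weight,
    N = number of respondents. Values lie in (0, 1]; higher = more critical."""
    return sum(responses) / (max_weight * len(responses))

factors = {
    "location and access":     [5, 4, 5, 3, 4],
    "financial loan":          [3, 4, 2, 3, 3],
    "promotion and marketing": [2, 3, 3, 2, 4],
}

for name, resp in sorted(factors.items(), key=lambda kv: rii(kv[1]), reverse=True):
    print(f"{name}: RII = {rii(resp):.3f}")
```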

  8. Cluster analysis of spontaneous preterm birth phenotypes identifies potential associations among preterm birth mechanisms

    Science.gov (United States)

    Esplin, M Sean; Manuck, Tracy A.; Varner, Michael W.; Christensen, Bryce; Biggio, Joseph; Bukowski, Radek; Parry, Samuel; Zhang, Heping; Huang, Hao; Andrews, William; Saade, George; Sadovsky, Yoel; Reddy, Uma M.; Ilekis, John

    2015-01-01

    Objective We sought to employ an innovative tool based on common biological pathways to identify specific phenotypes among women with spontaneous preterm birth (SPTB), in order to enhance investigators' ability to identify and highlight common mechanisms and underlying genetic factors responsible for SPTB. Study Design A secondary analysis of a prospective case-control multicenter study of SPTB. All cases delivered a singleton preterm at ≤34.0 weeks' gestation. Each woman was assessed for the presence of underlying SPTB etiologies. A hierarchical cluster analysis was used to identify groups of women with homogeneous phenotypic profiles. One of the phenotypic clusters was selected for candidate gene association analysis using VEGAS software. Results 1028 women with SPTB were assigned phenotypes. Hierarchical clustering of the phenotypes revealed five major clusters. Cluster 1 (N=445) was characterized by maternal stress, cluster 2 (N=294) by premature membrane rupture, cluster 3 (N=120) by familial factors, and cluster 4 (N=63) by maternal comorbidities. Cluster 5 (N=106) was multifactorial, characterized by infection (INF), decidual hemorrhage (DH), and placental dysfunction (PD). These three phenotypes were highly correlated by chi-square analysis [PD and DH (p<2.2e-6); PD and INF (p=6.2e-10); INF and DH (p=0.0036)]. Gene-based testing identified the INS (insulin) gene as significantly associated with cluster 3 of SPTB. Conclusion We identified five major clusters of SPTB based on a phenotype tool and hierarchical clustering. There was significant correlation between several of the phenotypes. The INS gene was associated with familial factors underlying SPTB. PMID:26070700
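    As a rough illustration of the clustering step described above, the sketch below runs a hierarchical cluster analysis over binary phenotype profiles and cuts the tree into five clusters. The phenotype matrix, distance metric, and linkage choice are assumptions for illustration, not the study's actual configuration.

```python
# Hierarchical clustering of binary phenotype profiles (synthetic data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
# rows = women, columns = presence/absence of 10 hypothetical SPTB phenotypes
profiles = rng.integers(0, 2, size=(100, 10))

dist = pdist(profiles, metric="jaccard")   # Jaccard suits presence/absence data
tree = linkage(dist, method="average")

labels = fcluster(tree, t=5, criterion="maxclust")  # cut into five clusters
print("cluster sizes:", np.bincount(labels)[1:])
```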

  9. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Shahid Ali

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques and then examines the model-order-reduction approach and its significance. Traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. Bond graphs are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics, and industrial automation. This paper describes a technique for generating a genetic design from the tree-structured transfer function obtained from a bond graph. The work combines bond graphs for model representation with genetic programming for exploring the design space: the tree-structured transfer function results from replacing typical bond-graph elements with their impedance equivalents, specifying impedance laws for the bond-graph multiports. The tree structure thus obtained from the bond graph is then used to generate the genetic tree. Application studies identify key issues and requirements for advancing this approach toward an effective and efficient design tool for synthesizing electrical-system designs. In the first phase, the system is modeled using the Bond Graph technique; its system response and transfer function are analyzed with the conventional and Bond Graph methods, and an approach to model order reduction is then developed. The suggested algorithm and other modern model-order-reduction techniques are applied, with different approaches, to an 11th-order high-pass filter [1]. The model-order-reduction technique developed in this paper has the smallest reduction errors, and the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and Bond Graph methods are compared and
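    For readers unfamiliar with model order reduction, the sketch below reduces a random stable 11th-order state-space model by balanced truncation, a standard alternative technique (not the bond-graph/genetic-programming method of the paper); the placeholder system merely stands in for the paper's 11th-order high-pass filter.

```python
# Balanced-truncation model order reduction sketch (square-root algorithm).
import numpy as np
from scipy import linalg

rng = np.random.default_rng(1)
A = rng.standard_normal((11, 11)) - 12 * np.eye(11)   # shifted to be stable
B = rng.standard_normal((11, 1))
C = rng.standard_normal((1, 11))

# Gramians from the continuous Lyapunov equations A Wc + Wc A' = -B B', etc.
Wc = linalg.solve_continuous_lyapunov(A, -B @ B.T)
Wo = linalg.solve_continuous_lyapunov(A.T, -C.T @ C)

R = linalg.cholesky(Wc, lower=True)        # Wc = R R'
U, s, _ = linalg.svd(R.T @ Wo @ R)
T = R @ U @ np.diag(s ** -0.25)            # balancing transformation
Ti = np.linalg.inv(T)

k = 4                                      # keep the k largest Hankel modes
Ar, Br, Cr = (Ti @ A @ T)[:k, :k], (Ti @ B)[:k], (C @ T)[:, :k]
print("Hankel singular values:", np.sqrt(s).round(4))
```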

  10. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we provide defense-in-depth and diversity (D-in-D&D) analysis guidelines

  11. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we provide defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  12. Probabilistic approach to identify sensitive parameter distributions in multimedia pathway analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kamboj, S.; Gnanapragasam, E.; LePoire, D.; Biwer, B. M.; Cheng, J.; Arnish, J.; Yu, C.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Environmental Assessment; NRC

    2002-01-01

    Sensitive parameter distributions were identified with the use of probabilistic analysis in the RESRAD computer code. RESRAD is a multimedia pathway analysis code designed to evaluate radiological exposures resulting from radiological contamination in soil. The dose distribution was obtained by using a set of default parameter distributions/values. Most of the variation in the output dose distribution could be attributed to uncertainty in a small set of input parameters that could be considered sensitive parameter distributions. The identification of the sensitive parameters is a first step in the prioritization of future research and information gathering. When site-specific parameter distributions/values are available for an actual site, the same process should be used with these site-specific data. Regression analysis used to identify sensitive parameters indicated that the dominant pathways depended on the radionuclide and source configurations. However, two parameter distributions were sensitive for many radionuclides: the external shielding factor when external exposure was the dominant pathway, and the plant transfer factor when plant ingestion was the dominant pathway. No single correlation or regression coefficient can be used alone to identify sensitive parameters in all cases. The coefficients are useful guides, but they have to be used in conjunction with other aids, such as scatter plots, and should undergo further analysis.
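    One common way to rank parameter sensitivity from such Monte Carlo output is via standardized regression coefficients, sketched below on a made-up dose model; the parameter names echo the abstract, but the model and numbers are invented.

```python
# Standardized regression coefficients (SRCs) as a sensitivity ranking.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
shielding = rng.uniform(0.2, 1.0, n)     # external shielding factor (assumed)
transfer  = rng.lognormal(-1.0, 0.5, n)  # plant transfer factor (assumed)
density   = rng.normal(1.5, 0.1, n)      # a deliberately insensitive input

# toy dose model standing in for a RESRAD run
dose = 3.0 * shielding + 8.0 * transfer + 0.05 * density + rng.normal(0, 0.2, n)

X = np.column_stack([shielding, transfer, density])
Xs = (X - X.mean(0)) / X.std(0)          # standardize inputs and output
ys = (dose - dose.mean()) / dose.std()
beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, b in zip(["shielding", "transfer", "density"], beta):
    print(f"{name}: SRC = {b:+.3f}")     # larger |SRC| => more sensitive
```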

  13. Progress in identifying a human ionizing-radiation repair gene using DNA-mediated gene transfer techniques

    International Nuclear Information System (INIS)

    The authors employed DNA-mediated gene transfer techniques to introduce human DNA into a DNA double-strand break (DSB) repair-deficient Chinese hamster ovary (CHO) cell mutant (xrs-6), which is hypersensitive to both X-rays (D0 = 0.39 Gy) and the antibiotic bleomycin (D0 = 0.01 μg/ml). High-molecular-weight DNA isolated from cultured human skin fibroblasts was partially digested with the restriction enzyme Sau 3A to average sizes of 20 or 40 kb, ligated with plasmid pSV2-gpt DNA, and transfected into xrs-6 cells. Colonies that developed under a bleomycin and MAX (mycophenolic acid/adenine/xanthine) double-selection procedure were isolated and further tested for X-ray sensitivity and DSB rejoining capacity. To date, a total of six X-ray- or bleomycin-resistant transformants have been isolated. All express rejoining capacity for X-ray-induced DSB similar to the rate observed for DSB repair in CHO wild-type cells. DNA isolated from these primary transformants contains various copy numbers of pSV2-gpt DNA and also contains human DNA sequences, as determined by Southern blot hybridization. Recently, a secondary transformant has been isolated using DNA from one of the primary transformants. Cellular and molecular characterization of this transformant is in progress. DNA from a genuine secondary transformant will be used in the construction of a DNA library to isolate human genomic DNA encoding this radiation repair gene

  14. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented on iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.

  15. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  16. Sensitivity-analysis techniques: self-teaching curriculum

    Energy Technology Data Exchange (ETDEWEB)

    Iman, R.L.; Conover, W.J.

    1982-06-01

    This self-teaching curriculum on sensitivity analysis techniques consists of three parts: (1) use of the Latin Hypercube Sampling Program (Iman, Davenport and Ziegler, Latin Hypercube Sampling (Program User's Guide), SAND79-1473, January 1980); (2) use of the Stepwise Regression Program (Iman, et al., Stepwise Regression with PRESS and Rank Regression (Program User's Guide), SAND79-1472, January 1980); and (3) application of the procedures to sensitivity and uncertainty analyses of the groundwater transport model MWFT/DVM (Campbell, Iman and Reeves, Risk Methodology for Geologic Disposal of Radioactive Waste - Transport Model Sensitivity Analysis, SAND80-0644, NUREG/CR-1377, June 1980; Campbell, Longsine, and Reeves, The Distributed Velocity Method of Solving the Convective-Dispersion Equation, SAND80-0717, NUREG/CR-1376, July 1980). This curriculum is one in a series developed by Sandia National Laboratories for transfer of the capability to use the technology developed under the NRC-funded High Level Waste Methodology Development Program.
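    In the spirit of part (1) of the curriculum, the sketch below draws a Latin hypercube sample and maps it onto two target marginals; SciPy's qmc module stands in for the original Fortran program, and the distributions are arbitrary examples.

```python
# Latin hypercube sampling sketch: stratified uniforms mapped to target marginals.
import numpy as np
from scipy.stats import qmc, norm

sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=100)                   # stratified samples on the unit square

x0 = 10.0 * u[:, 0]                         # -> uniform(0, 10)
x1 = norm.ppf(u[:, 1], loc=5.0, scale=1.0)  # -> normal(mean 5, sd 1)

print(x0[:5].round(2), x1[:5].round(2))
```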

  17. Combined Analysis of SNP Array Data Identifies Novel CNV Candidates and Pathways in Ependymoma and Mesothelioma

    Directory of Open Access Journals (Sweden)

    Gabriel Wajnberg

    2015-01-01

    Full Text Available Copy number variation is a class of structural genomic modifications that includes the gain and loss of a specific genomic region, which may include an entire gene. Many studies have used low-resolution techniques to identify regions that are frequently lost or amplified in cancer. Usually, researchers choose proprietary or non-open-source software to detect these regions because the graphical interface tends to be easier to use. In this study, we combined two different open-source packages into an innovative strategy to identify novel copy number variations and pathways associated with cancer. We used published mesothelioma and ependymoma datasets to assess our tool. We detected previously described and novel copy number variations that are associated with cancer chemotherapy resistance. We also identified altered pathways associated with these diseases, such as cell adhesion in patients with mesothelioma and negative regulation of glutamatergic synaptic transmission in ependymoma patients. In conclusion, we present a novel strategy using open-source software to identify copy number variations and altered pathways associated with cancer.

  18. Enhanced Analysis Techniques for an Imaging Neutron and Gamma Ray Spectrometer

    Science.gov (United States)

    Madden, Amanda C.

    The presence of gamma rays and neutrons is a strong indicator of the presence of Special Nuclear Material (SNM). The imaging Neutron and gamma ray SPECTrometer (NSPECT), developed by the University of New Hampshire and Michigan Aerospace Corporation, detects the fast neutrons and prompt gamma rays from fissile material, and the gamma rays from radioactive material. The instrument operates as a double-scatter device, requiring a neutron or a gamma ray to interact twice in the instrument. While this detection requirement decreases the efficiency of the instrument, it offers superior background rejection and the ability to measure the energy and momentum of the incident particle. These measurements create energy spectra and images of the emitting source for source identification and localization. The dual-species instrument provides better detection than either species alone. In realistic detection scenarios, few particles are detected from a potential threat due to source shielding, detection at a distance, high background, and weak sources. This contributes to a small signal-to-noise ratio, and threat detection becomes difficult. To address these difficulties, several enhanced data analysis tools were developed. A Receiver Operating Characteristic (ROC) curve helps set instrumental alarm thresholds as well as identify the presence of a source. Analysis of a dual-species ROC curve provides superior detection capabilities. Bayesian analysis helps to detect and identify the presence of a source through model comparisons, and helps create a background-corrected count spectrum for enhanced spectroscopy. Development of an instrument response using simulations and numerical analyses will help perform spectral and image deconvolution. This thesis will outline the principles of operation of the NSPECT instrument using the double-scatter technology, traditional analysis techniques, and enhanced analysis techniques as applied to data from the NSPECT instrument, and an
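    The ROC idea mentioned above is easy to sketch: sweep an alarm threshold over count data and trace detection probability against false-alarm probability. The Poisson count rates below are invented stand-ins for background and source measurements.

```python
# ROC curve sketch for setting an alarm threshold on detector counts.
import numpy as np

rng = np.random.default_rng(3)
background = rng.poisson(lam=20, size=500)   # counts with no source present
source     = rng.poisson(lam=28, size=500)   # counts with a weak source

thresholds = np.arange(60)
tpr = np.array([(source >= t).mean() for t in thresholds])      # detection prob.
fpr = np.array([(background >= t).mean() for t in thresholds])  # false-alarm prob.

auc = -np.trapz(tpr, fpr)                    # fpr decreases as threshold rises
best = int(np.argmax(tpr - fpr))             # Youden's J as a threshold choice
print(f"AUC ~ {auc:.3f}; alarm threshold ~ {thresholds[best]} counts")
```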

  19. Wavelength resolved neutron transmission analysis to identify single crystal particles in historical metallurgy

    Science.gov (United States)

    Barzagli, E.; Grazzi, F.; Salvemini, F.; Scherillo, A.; Sato, H.; Shinohara, T.; Kamiyama, T.; Kiyanagi, Y.; Tremsin, A.; Zoppi, Marco

    2014-07-01

    The phase composition and the microstructure of four ferrous Japanese arrows of the Edo period (17th-19th century) has been determined through two complementary neutron techniques: Position-sensitive wavelength-resolved neutron transmission analysis (PS-WRNTA) and time-of-flight neutron diffraction (ToF-ND). Standard ToF-ND technique has been applied by using the INES diffractometer at the ISIS pulsed neutron source in the UK, while the innovative PS-WRNTA one has been performed at the J-PARC neutron source on the BL-10 NOBORU beam line using the high spatial high time resolution neutron imaging detector. With ToF-ND we were able to reach information about the quantitative distribution of the metal and non-metal phases, the texture level, the strain level and the domain size of each of the samples, which are important parameters to gain knowledge about the technological level of the Japanese weapon. Starting from this base of data, the more complex PS-WRNTA has been applied to the same samples. This experimental technique exploits the presence of the so-called Bragg edges, in the time-of-flight spectrum of neutrons transmitted through crystalline materials, to map the microstructural properties of samples. The two techniques are non-invasive and can be easily applied to archaeometry for an accurate microstructure mapping of metal and ceramic artifacts.

  20. Meta-Analysis of Placental Transcriptome Data Identifies a Novel Molecular Pathway Related to Preeclampsia.

    Science.gov (United States)

    van Uitert, Miranda; Moerland, Perry D; Enquobahrie, Daniel A; Laivuori, Hannele; van der Post, Joris A M; Ris-Stalpers, Carrie; Afink, Gijs B

    2015-01-01

    Studies using the placental transcriptome to identify key molecules relevant for preeclampsia are hampered by a relatively small sample size. In addition, they use a variety of bioinformatics and statistical methods, making comparison of findings challenging. To generate a more robust preeclampsia gene expression signature, we performed a meta-analysis on the original data of 11 placenta RNA microarray experiments, representing 139 normotensive and 116 preeclamptic pregnancies. Microarray data were pre-processed and analyzed using standardized bioinformatics and statistical procedures and the effect sizes were combined using an inverse-variance random-effects model. Interactions between genes in the resulting gene expression signature were identified by pathway analysis (Ingenuity Pathway Analysis, Gene Set Enrichment Analysis, Graphite) and protein-protein associations (STRING). This approach has resulted in a comprehensive list of differentially expressed genes that led to a 388-gene meta-signature of preeclamptic placenta. Pathway analysis highlights the involvement of the previously identified hypoxia/HIF1A pathway in the establishment of the preeclamptic gene expression profile, while analysis of protein interaction networks indicates CREBBP/EP300 as a novel element central to the preeclamptic placental transcriptome. In addition, there is an apparent high incidence of preeclampsia in women carrying a child with a mutation in CREBBP/EP300 (Rubinstein-Taybi Syndrome). The 388-gene preeclampsia meta-signature offers a vital starting point for further studies into the relevance of these genes (in particular CREBBP/EP300) and their concomitant pathways as biomarkers or functional molecules in preeclampsia. This will result in a better understanding of the molecular basis of this disease and opens up the opportunity to develop rational therapies targeting the placental dysfunction causal to preeclampsia. PMID:26171964
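    The inverse-variance random-effects combination used in the meta-analysis can be sketched in a few lines with the DerSimonian-Laird estimator; the per-study effect sizes and variances below are invented for illustration.

```python
# Inverse-variance random-effects pooling (DerSimonian-Laird) sketch.
import numpy as np

effects   = np.array([0.80, 0.55, 1.10, 0.20, 0.70])  # per-study effect sizes
variances = np.array([0.04, 0.09, 0.06, 0.10, 0.05])  # their sampling variances

w = 1.0 / variances                                   # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)

# between-study variance tau^2 from Cochran's Q
q = np.sum(w * (effects - fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

w_re = 1.0 / (variances + tau2)                       # random-effects weights
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect = {pooled:.3f} +/- {1.96 * se:.3f} (95% CI)")
```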

  1. A method for identifying compromised clients based on DNS traffic analysis

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup; D’Alconzo, Alessandro;

    2016-01-01

    DNS is widely abused by Internet criminals in order to provide reliable communication within malicious network infrastructure as well as flexible and resilient hosting of malicious content. This paper presents a novel detection method for identifying potentially compromised clients based on DNS traffic analysis. The proposed method identifies suspicious agile DNS mappings, i.e., mappings characterized by fast-changing domain names and/or IP addresses, often used by malicious services. The approach discovers clients that have queried domains contained within identified suspicious domain-to-IP mappings, thus assisting in pinpointing potentially compromised clients within the network. The proposed approach targets compromised clients in large-scale operational networks. We have evaluated the proposed approach using an extensive set of DNS traffic traces from different operational networks.
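    A toy version of the agile-mapping idea is sketched below: domains that resolve to many distinct IPs within an observation window are flagged, and the clients that queried them are marked for inspection. The record format and the agility threshold are assumptions, not the paper's actual feature set.

```python
# Toy detector for agile domain-to-IP mappings in passive DNS records.
from collections import defaultdict

# (timestamp, client, domain, resolved_ip) tuples from a hypothetical trace
records = [
    (1, "10.0.0.5", "cdn.example.com", "198.51.100.1"),
    (2, "10.0.0.5", "bad-domain.biz", "203.0.113.7"),
    (3, "10.0.0.8", "bad-domain.biz", "203.0.113.99"),
    (4, "10.0.0.8", "bad-domain.biz", "203.0.113.150"),
]

ips_per_domain, clients_per_domain = defaultdict(set), defaultdict(set)
for _, client, domain, ip in records:
    ips_per_domain[domain].add(ip)
    clients_per_domain[domain].add(client)

AGILITY_THRESHOLD = 3   # distinct IPs per window (assumed value)
suspicious = {d for d, ips in ips_per_domain.items() if len(ips) >= AGILITY_THRESHOLD}
clients = set().union(*(clients_per_domain[d] for d in suspicious)) if suspicious else set()
print("suspicious domains:", suspicious, "| clients to inspect:", clients)
```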

  2. Genome-wide interaction-based association analysis identified multiple new susceptibility Loci for common diseases.

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2011-03-01

    Full Text Available Genome-wide interaction-based association (GWIBA analysis has the potential to identify novel susceptibility loci. These interaction effects could be missed with the prevailing approaches in genome-wide association studies (GWAS. However, no convincing loci have been discovered exclusively from GWIBA methods, and the intensive computation involved is a major barrier for application. Here, we developed a fast, multi-thread/parallel program named "pair-wise interaction-based association mapping" (PIAM for exhaustive two-locus searches. With this program, we performed a complete GWIBA analysis on seven diseases with stringent control for false positives, and we validated the results for three of these diseases. We identified one pair-wise interaction between a previously identified locus, C1orf106, and one new locus, TEC, that was specific for Crohn's disease, with a Bonferroni corrected P < 0.05 (P = 0.039. This interaction was replicated with a pair of proxy linked loci (P = 0.013 on an independent dataset. Five other interactions had corrected P < 0.5. We identified the allelic effect of a locus close to SLC7A13 for coronary artery disease. This was replicated with a linked locus on an independent dataset (P = 1.09 × 10⁻⁷. Through a local validation analysis that evaluated association signals, rather than locus-based associations, we found that several other regions showed association/interaction signals with nominal P < 0.05. In conclusion, this study demonstrated that the GWIBA approach was successful for identifying novel loci, and the results provide new insights into the genetic architecture of common diseases. In addition, our PIAM program was capable of handling very large GWAS datasets that are likely to be produced in the future.

  3. Patent Network Analysis and Quadratic Assignment Procedures to Identify the Convergence of Robot Technologies

    Science.gov (United States)

    Lee, Woo Jin; Lee, Won Kyung

    2016-01-01

    Because of the remarkable developments in robotics in recent years, technological convergence has been active in this area. We focused on finding patterns of convergence within robot technology using network analysis of patents in both the USPTO and KIPO. To identify the variables that affect convergence, we used quadratic assignment procedures (QAP). From our analysis, we observed the patent network ecology related to convergence and found technologies that have great potential to converge with other robotics technologies. The results of our study are expected to contribute to setting up convergence-based R&D policies for robotics, which can lead to new innovation. PMID:27764196

  4. A dynamic mechanical analysis technique for porous media

    Science.gov (United States)

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in

  5. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: A technique that helps management to reduce costs and improve quality is ‘lean supply chain management’, which focuses on the elimination of all wastes in every stage of the supply chain and is derived from ‘agile production’. This research aims to assess and rank the suppliers in an auto industry, based upon the concept of ‘production leanness’. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We have examined the literature about leanness, and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA technique.

    AFRIKAANS SUMMARY: Lean supply chain management is a technique that enables management to reduce costs and improve quality. It focuses on the reduction of waste at every stage of the supply chain and is derived from agile production. This research aims to assess suppliers in an auto industry according to the concept of production leanness. The research focuses on suppliers of a company called Touse-Omron Naein. A literature study on leanness led to the classification of criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the PCA technique.
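    The ranking step can be sketched as scoring each supplier on the first principal component of the standardized criteria matrix; the supplier scores below are fabricated, and a real application would use the 76 leanness factors from the questionnaire.

```python
# PCA-based supplier ranking sketch on a fabricated criteria matrix.
import numpy as np

# rows = suppliers, columns = leanness criteria (e.g., waste, quality, delivery)
X = np.array([[4.1, 3.8, 4.5],
              [2.9, 3.1, 2.7],
              [3.6, 4.0, 3.9],
              [2.2, 2.5, 3.0]])

Xs = (X - X.mean(0)) / X.std(0)           # standardize each criterion
vals, vecs = np.linalg.eigh(np.cov(Xs, rowvar=False))
pc1 = vecs[:, np.argmax(vals)]            # first principal component loadings
pc1 *= np.sign(pc1.sum())                 # orient so a higher score = leaner
scores = Xs @ pc1

for rank, i in enumerate(np.argsort(scores)[::-1], 1):
    print(f"rank {rank}: supplier {i + 1} (score {scores[i]:+.2f})")
```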

  6. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    A simple approach is described for incorporating the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., changing the distribution type for input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used to convert cost inputs into the statistical parameters of assumed probabilistic distributions. The Monte Carlo simulation, using a Latin hypercube sampling technique and subsequent sensitivity analyses, proved useful for examining the uncertainty propagation of fuel cycle costs and could provide information to decision makers more efficiently than a deterministic method. Changing the distribution types of the input parameters showed that the values calculated by the deterministic method sat around the 40th-50th percentile of the output distribution function calculated by probabilistic simulation; assuming lognormal distributions of the inputs, however, the deterministic values sat around the 85th percentile. The sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty
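    The three-estimate idea is easy to sketch: turn each component's (low, mode, high) cost estimates into a distribution, sample, and see where the deterministic point value lands in the output distribution. The components and numbers below are invented, and triangular distributions are assumed.

```python
# Monte Carlo fuel-cycle-cost sketch using three-point (low, mode, high) inputs.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
components = {                       # (low, mode, high) unit costs, all invented
    "uranium purchase": (30.0, 50.0, 90.0),
    "enrichment":       (80.0, 100.0, 130.0),
    "disposal":         (200.0, 300.0, 500.0),
}

total = np.zeros(n)
for low, mode, high in components.values():
    total += rng.triangular(low, mode, high, size=n)

det = sum(mode for _, mode, _ in components.values())   # deterministic estimate
pct = (total < det).mean() * 100
print(f"deterministic value sits near the {pct:.0f}th percentile of the MC output")
```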

  7. Identifying barriers to patient acceptance of active surveillance: content analysis of online patient communications.

    Directory of Open Access Journals (Sweden)

    Mark V Mishra

    Full Text Available OBJECTIVES: Qualitative research aimed at identifying patient acceptance of active surveillance (AS) has been identified as a public health research priority. The primary objective of this study was to determine whether analysis of a large sample of anonymous internet conversations (ICs) could be utilized to identify unmet public needs regarding AS. METHODS: English-language ICs regarding prostate cancer (PC) treatment with AS from 2002 to 2012 were identified using a novel internet search methodology. Web spiders were developed to mine, aggregate, and analyze content from the world-wide-web for ICs centered on AS. Collection of ICs was not restricted to any specific geographic region of origin. Natural language processing (NLP) was used to evaluate content and perform a sentiment analysis. Conversations were scored as positive, negative, or neutral. A sentiment index (SI) was subsequently calculated according to the following formula to compare temporal trends in public sentiment towards AS: SI = [(#positive ICs/#total ICs) - (#negative ICs/#total ICs)] x 100. RESULTS: A total of 464 ICs were identified. Sentiment increased from -13 to +2 over the study period. The increase in sentiment has been driven by increased patient emphasis on quality-of-life factors and endorsement of AS by national medical organizations. Unmet needs identified in these ICs include: a gap in quantitative data regarding long-term outcomes with AS vs. conventional treatments, a desire for treatment information from an unbiased specialist, and an absence of public role models managed with AS. CONCLUSIONS: This study demonstrates the potential utility of online patient communications to provide insight into patient preferences and decision-making. Based on our findings, we recommend that multidisciplinary clinics consider including an unbiased specialist to present treatment options and that future decision tools for AS include quantitative data regarding outcomes after AS.
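    The sentiment index above is a one-line computation; a minimal version, with invented conversation labels, is:

```python
# Sentiment index SI = (positive share - negative share) * 100.
def sentiment_index(labels):
    n = len(labels)
    pos = labels.count("positive") / n
    neg = labels.count("negative") / n
    return (pos - neg) * 100

print(sentiment_index(["positive", "neutral", "negative", "positive"]))  # 25.0
```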

  8. Emergent team roles in organizational meetings: Identifying communication patterns via cluster analysis.

    OpenAIRE

    Lehmann-Willenbrock, N.K.; Beck, S.J.; Kauffeld, S.

    2016-01-01

    Previous team role taxonomies have largely relied on self-report data, focused on functional roles, and described individual predispositions or personality traits. Instead, this study takes a communicative approach and proposes that team roles are produced, shaped, and sustained in communicative behaviors. To identify team roles communicatively, 59 regular organizational meetings were videotaped and analyzed. Cluster analysis revealed five emergent roles: the solution seeker, the problem anal...

  9. Electroretinogram analysis of relative spectral sensitivity in genetically identified dichromatic macaques

    OpenAIRE

    Hanazawa, Akitoshi; Mikami, Akichika; Angelika, Puti Sulistyo; Takenaka, Osamu; Goto, Shunji; Onishi, Akishi; Koike, Satoshi; Yamamori, Tetsuo; Kato, Keichiro; Kondo, Aya; Suryobroto, Bambang; Farajallah, Achmad; Komatsu, Hidehiko

    2001-01-01

    The retinas of macaque monkeys usually contain three types of photopigment, providing them with trichromatic color vision homologous to that of humans. However, we recently used molecular genetic analysis to identify several macaques with a dichromatic genotype. The affected X chromosome of these animals contains a hybrid gene of long-wavelength-sensitive (L) and middle-wavelength-sensitive (M) photopigments instead of separate genes encoding L and M photopigments. The product of the hybrid g...

  10. Network analysis identifies protein clusters of functional importance in juvenile idiopathic arthritis

    OpenAIRE

    Stevens, Adam; Meyer, Stefan; Hanson, Daniel; Clayton, Peter; Donn, Rachelle

    2014-01-01

    Introduction Our objective was to utilise network analysis to identify protein clusters of greatest potential functional relevance in the pathogenesis of oligoarticular and rheumatoid factor negative (RF-ve) polyarticular juvenile idiopathic arthritis (JIA). Methods JIA genetic association data were used to build an interactome network model in BioGRID 3.2.99. The top 10% of this protein:protein JIA Interactome was used to generate a minimal essential network (MEN). Reactome FI Cytoscape 2.83...

  11. Identifying Gender-Preferred Communication Styles within Online Cancer Communities: A Retrospective, Longitudinal Analysis

    OpenAIRE

    Durant, Kathleen T.; McCray, Alexa T.; Charles Safran

    2012-01-01

    BACKGROUND: The goal of this research is to determine if different gender-preferred social styles can be observed within the user interactions at an online cancer community. To achieve this goal, we identify and measure variables that pertain to each gender-specific social style. METHODS AND FINDINGS: We perform social network and statistical analysis on the communication flow of 8,388 members at six different cancer forums over eight years. Kruskal-Wallis tests were conducted to measure the ...

  12. Identifying patterns in treatment response profiles in acute bipolar mania: a cluster analysis approach

    OpenAIRE

    Houston John P; Lipkovich Ilya A; Ahl Jonna

    2008-01-01

    Abstract Background Patients with acute mania respond differentially to treatment and, in many cases, fail to obtain or sustain symptom remission. The objective of this exploratory analysis was to characterize response in bipolar disorder by identifying groups of patients with similar manic symptom response profiles. Methods Patients (n = 222) were selected from a randomized, double-blind study of treatment with olanzapine or divalproex in bipolar I disorder, manic or mixed episode, with or w...

  13. Hot spot analysis applied to identify ecosystem services potential in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Depellegrin, Daniel; Misiune, Ieva

    2016-04-01

    Hot spot analysis is very useful for identifying areas with similar characteristics. This is important for sustainable use of the territory, since we can identify areas that need to be protected or restored, which is a great advantage for land use planning and management: we can allocate resources, reduce economic costs, and intervene in the landscape more effectively. Ecosystem services (ES) differ according to land use. Since the landscape is very heterogeneous, it is of major importance to understand their spatial pattern and to locate the areas that provide better ES and those that provide fewer services. The objective of this work is to use hot spot analysis to identify areas with the most valuable ES in Lithuania. CORINE land cover (CLC) of 2006 was used as the main spatial information. This classification uses a grid of 100 m resolution, from which a total of 31 land use types were extracted. ES ranking was carried out based on expert knowledge: experts were asked to evaluate the ES potential of each CLC class from 0 (no potential) to 5 (very high potential). Hot spots were evaluated using the Getis-Ord test, a cluster analysis tool available in the ArcGIS toolbox, which identifies areas with significantly low and significantly high values at a p level of 0.05. In this work we used hot spot analysis to assess the distribution of provisioning, regulating, cultural, and total (sum of the previous 3) ES. The Z value calculated from Getis-Ord was used in the statistical analysis to assess the clusters of provisioning, regulating, cultural, and total ES. ES with a high Z value have a high number of cluster areas with high ES potential. The results showed that the Z-score was significantly different among services (Kruskal-Wallis ANOVA = 834.607, p < 0.05). Areas that showed significant high and low regulating and cultural ES clusters are similar. The spatial distribution of these clusters is very high, which may be attributed to
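    A bare-bones Getis-Ord Gi* computation on a raster of ES potential scores is sketched below, using binary rook-neighbour weights (including the cell itself) and edge padding as simplifications of the CORINE grid setup; the score grid is random.

```python
# Getis-Ord Gi* hot spot sketch on a synthetic ES-potential raster.
import numpy as np

rng = np.random.default_rng(5)
z = rng.integers(0, 6, size=(20, 20)).astype(float)  # ES potential 0-5 per cell

pad = np.pad(z, 1, mode="edge")   # edge padding approximates border neighbours
# sum over each cell and its 4 rook neighbours (binary weights incl. self)
local = (pad[1:-1, 1:-1] + pad[:-2, 1:-1] + pad[2:, 1:-1]
         + pad[1:-1, :-2] + pad[1:-1, 2:])

W = 5.0                           # sum of weights (= sum of squared weights here)
n, mean, s = z.size, z.mean(), z.std()
gi = (local - mean * W) / (s * np.sqrt((n * W - W ** 2) / (n - 1)))

print((gi > 1.96).sum(), "hot cells,", (gi < -1.96).sum(), "cold cells (p ~ 0.05)")
```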

  14. Robust and discriminating method for face recognition based on correlation technique and independent component analysis model.

    Science.gov (United States)

    Alfalou, A; Brosseau, C

    2011-03-01

    We demonstrate a novel technique for face recognition. Our approach relies on the performances of a strongly discriminating optical correlation method along with the robustness of the independent component analysis (ICA) model. Simulations were performed to illustrate how this algorithm can identify a face with images from the Pointing Head Pose Image Database. While maintaining algorithmic simplicity, this approach based on ICA representation significantly increases the true recognition rate compared to that obtained using our previously developed all-numerical ICA identity recognition method and another method based on optical correlation and a standard composite filter. PMID:21368935

  15. Driving forces of change in environmental indicators an analysis based on divisia index decomposition techniques

    CERN Document Server

    González, Paula Fernández; Presno, Mª José

    2014-01-01

    This book addresses several index decomposition analysis methods to assess progress made by EU countries in the last decade in relation to energy and climate change concerns. Several applications of these techniques are carried out in order to decompose changes in both energy and environmental aggregates. In addition to this, a new methodology based on classical spline approximations is introduced, which provides useful mathematical and statistical properties. Once a suitable set of determinant factors has been identified, these decomposition methods allow the researcher to quantify the respec
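    As a concrete instance of index decomposition, the sketch below applies the additive log-mean Divisia (LMDI-I) method to split a change in CO2 emissions into activity, intensity, and carbon-factor effects; the two-sector, two-year dataset is entirely made up. By construction, the three effects sum exactly to the total change.

```python
# Additive LMDI-I decomposition sketch: C = sum over sectors of Q * I * F.
import numpy as np

def logmean(a, b):
    return (a - b) / (np.log(a) - np.log(b)) if a != b else float(a)

# per-sector (activity Q, energy intensity I, carbon factor F), both years invented
year0 = [(100.0, 0.50, 2.0), (80.0, 0.30, 1.5)]
year1 = [(110.0, 0.45, 1.9), (95.0, 0.28, 1.6)]

effects = {"activity": 0.0, "intensity": 0.0, "carbon": 0.0}
for (q0, i0, f0), (q1, i1, f1) in zip(year0, year1):
    L = logmean(q1 * i1 * f1, q0 * i0 * f0)   # log mean of sector emissions
    effects["activity"]  += L * np.log(q1 / q0)
    effects["intensity"] += L * np.log(i1 / i0)
    effects["carbon"]    += L * np.log(f1 / f0)

dC = sum(q * i * f for q, i, f in year1) - sum(q * i * f for q, i, f in year0)
print(effects, "| sum of effects:", round(sum(effects.values()), 6), "=", round(dC, 6))
```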

  16. Combination of meta-analysis and graph clustering to identify prognostic markers of ESCC

    Directory of Open Access Journals (Sweden)

    Hongyun Gao

    2012-01-01

    Full Text Available Esophageal squamous cell carcinoma (ESCC) is one of the most malignant gastrointestinal cancers and occurs at a high frequency rate in China and other Asian countries. Recently, several molecular markers were identified for predicting ESCC. Notwithstanding, additional prognostic markers, with a clear understanding of their underlying roles, are still required. Using bioinformatics, a graph-clustering method based on DPClus was used to detect co-expressed modules. The aim was to identify a set of discriminating genes that could be used for predicting ESCC through graph-clustering and GO-term analysis. The results showed that CXCL12, CYP2C9, TGM3, MAL, S100A9, EMP-1 and SPRR3 were highly associated with ESCC development. In our study, all their predicted roles were in line with previous reports, supporting the assumption that a combination of meta-analysis, graph-clustering and GO-term analysis is effective both for identifying differentially expressed genes and for reflecting on their functions in ESCC.

  17. Combination of meta-analysis and graph clustering to identify prognostic markers of ESCC.

    Science.gov (United States)

    Gao, Hongyun; Wang, Lishan; Cui, Shitao; Wang, Mingsong

    2012-04-01

    Esophageal squamous cell carcinoma (ESCC) is one of the most malignant gastrointestinal cancers and occurs at a high frequency rate in China and other Asian countries. Recently, several molecular markers were identified for predicting ESCC. Notwithstanding, additional prognostic markers, with a clear understanding of their underlying roles, are still required. Using bioinformatics, a graph-clustering method based on DPClus was used to detect co-expressed modules. The aim was to identify a set of discriminating genes that could be used for predicting ESCC through graph-clustering and GO-term analysis. The results showed that CXCL12, CYP2C9, TGM3, MAL, S100A9, EMP-1 and SPRR3 were highly associated with ESCC development. In our study, all their predicted roles were in line with previous reports, supporting the assumption that a combination of meta-analysis, graph-clustering and GO-term analysis is effective both for identifying differentially expressed genes and for reflecting on their functions in ESCC.

  18. Effective Boolean dynamics analysis to identify functionally important genes in large-scale signaling networks.

    Science.gov (United States)

    Trinh, Hung-Cuong; Kwon, Yung-Keun

    2015-11-01

    Efficiently identifying functionally important genes in order to understand the minimal requirements of normal cellular development is challenging. To this end, a variety of structural measures have been proposed and their effectiveness has been investigated in the recent literature; however, few studies have shown the effectiveness of dynamics-based measures. This led us to investigate a dynamic measure for identifying functionally important genes, the effectiveness of which was verified through application to two large-scale human signaling networks. We specifically consider Boolean sensitivity-based dynamics against an update-rule perturbation (BSU) as a dynamic measure. Through investigations on two large-scale human signaling networks, we found that genes with relatively high BSU values show slower evolutionary rates and higher proportions of essential genes and drug targets than other genes. Gene-ontology analysis showed clear differences between the former and latter groups of genes. Furthermore, we compare the identification accuracies of essential genes and drug targets via BSU and five well-known structural measures. Although BSU did not always show the best performance, it effectively identified the putative set of genes, which is significantly different from the results obtained via the structural measures. Most interestingly, BSU showed the highest synergy effect in identifying the functionally important genes in conjunction with other measures. Our results imply that Boolean-sensitive dynamics can be used as a measure to effectively identify functionally important genes in signaling networks.
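    The flavor of an update-rule perturbation measure can be shown on a toy Boolean network: negate one gene's rule for a single update, let both trajectories evolve, and count how often they still differ. This is only inspired by BSU, not its exact definition, and the three-gene network is invented.

```python
# Toy Boolean-network sensitivity to a one-step update-rule perturbation.
import itertools

rules = {                       # hypothetical 3-gene regulatory rules
    0: lambda s: s[1] and not s[2],
    1: lambda s: s[0] or s[2],
    2: lambda s: not s[0],
}

def step(state, flipped=None):
    return tuple((not rules[i](state)) if i == flipped else bool(rules[i](state))
                 for i in rules)

def sensitivity(gene, k=3):
    states = list(itertools.product([False, True], repeat=3))
    diverged = 0
    for s in states:
        a, b = step(s), step(s, flipped=gene)   # perturb only the first update
        for _ in range(k):                      # then evolve both normally
            a, b = step(a), step(b)
        diverged += a != b
    return diverged / len(states)

for g in rules:
    print(f"gene {g}: perturbation sensitivity = {sensitivity(g):.2f}")
```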

  19. Association of two techniques of frontal sinus radiographic analysis for human identification

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira da SILVA

    2009-09-01

    Full Text Available Introduction: The analysis of images for human identification purposes is a routine activity in departments of forensic medicine, especially when it is necessary to identify burned bodies, skeletal remains, or corpses in an advanced stage of decomposition. Case report: The feasibility and reliability of analyzing the morphoradiographic image of the frontal sinus is shown, displayed in a posteroanterior (PA) skull radiograph produced during life and compared with another produced post-mortem. Conclusion: The results obtained in the radiographic comparison, through the association of two different techniques of frontal sinus analysis, allowed a positive correlation of the identity of the missing person with the body in an advanced stage of decomposition.

  20. Characteristics of identifying linear dynamic models from impulse response data using Prony analysis

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, D.J.

    1992-12-01

    The purpose of the study was to investigate the characteristics of fitting linear dynamic models to the impulse response of oscillatory dynamic systems using Prony analysis. Many dynamic systems exhibit oscillatory responses with multiple modes of oscillation. Although the underlying dynamics of such systems are often nonlinear, it is frequently possible and very useful to represent the system operating about some set point with a linear model. Derivation of such linear models can be done using two basic approaches: model the system using theoretical derivations and some linearization method such as a Taylor series expansion; or use a curve-fitting technique to optimally fit a linear model to specified system response data. Prony analysis belongs to the second class of system modeling because it is a method of fitting a linear model to the impulse response of a dynamic system. Its parallel formulation inherently makes it well suited for fitting models to oscillatory system data. Such oscillatory dynamic effects occur in large synchronous-generator-based power systems in the form of electromechanical oscillations. To study and characterize these oscillatory dynamics, BPA has developed computer codes to analyze system data using Prony analysis. The objective of this study was to develop a highly detailed understanding of the properties of using Prony analysis to fit models to systems with characteristics often encountered in power systems. This understanding was then extended to develop general "rules of thumb" for using Prony analysis. The general characteristics were investigated by performing fits to data from known linear models under controlled conditions. The conditions studied include various mathematical solution techniques; different parent system configurations; and a large variety of underlying noise characteristics.

  1. Characteristics of identifying linear dynamic models from impulse response data using Prony analysis

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, D.J.

    1992-12-01

    The purpose of the study was to investigate the characteristics of fitting linear dynamic models to the impulse response of oscillatory dynamic systems using Prony analysis. Many dynamic systems exhibit oscillatory responses with multiple modes of oscillation. Although the underlying dynamics of such systems are often nonlinear, it is frequently possible and very useful to represent the system operating about some set point with a linear model. Derivation of such linear models can be done using two basic approaches: model the system using theoretical derivations and some linearization method such as a Taylor series expansion; or use a curve-fitting technique to optimally fit a linear model to specified system response data. Prony analysis belongs to the second class of system modeling because it is a method of fitting a linear model to the impulse response of a dynamic system. Its parallel formulation inherently makes it well suited for fitting models to oscillatory system data. Such oscillatory dynamic effects occur in large synchronous-generator-based power systems in the form of electromechanical oscillations. To study and characterize these oscillatory dynamics, BPA has developed computer codes to analyze system data using Prony analysis. The objective of this study was to develop a highly detailed understanding of the properties of using Prony analysis to fit models to systems with characteristics often encountered in power systems. This understanding was then extended to develop general "rules of thumb" for using Prony analysis. The general characteristics were investigated by performing fits to data from known linear models under controlled conditions. The conditions studied include various mathematical solution techniques; different parent system configurations; and a large variety of underlying noise characteristics.
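    A compact Prony fit has three steps: estimate a linear-prediction polynomial from the sampled impulse response, take its roots as the discrete mode poles, and solve a least-squares problem for the mode amplitudes. The sketch below recovers two known damped sinusoids from a synthetic signal; it is a textbook-style illustration, not BPA's production code.

```python
# Prony analysis sketch: fit damped sinusoids to a sampled impulse response.
import numpy as np

dt, n, order = 0.05, 200, 4
t = np.arange(n) * dt
y = (1.0 * np.exp(-0.3 * t) * np.cos(2 * np.pi * 1.2 * t)
     + 0.5 * np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.4 * t + 0.7))

# 1) linear prediction: y[k] = c1*y[k-1] + ... + cp*y[k-p]
H = np.column_stack([y[order - 1 - i : n - 1 - i] for i in range(order)])
c, *_ = np.linalg.lstsq(H, y[order:], rcond=None)

# 2) discrete poles are roots of z^p - c1*z^(p-1) - ... - cp
z = np.roots(np.r_[1.0, -c])
s = np.log(z) / dt                    # continuous-time eigenvalues
print("damping, frequency (Hz):")
print(np.column_stack([s.real, np.abs(s.imag) / (2 * np.pi)]).round(3))

# 3) complex mode amplitudes by least squares on the Vandermonde matrix
V = z[None, :] ** np.arange(n)[:, None]
amps, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
```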

  2. COMBINED GEOPHYSICAL INVESTIGATION TECHNIQUES TO IDENTIFY BURIED WASTE IN AN UNCONTROLLED LANDFILL AT THE PADUCAH GASEOUS DIFFUSION PLANT, KENTUCKY

    International Nuclear Information System (INIS)

    The ground-penetrating radar (GPR) survey used a 200 megahertz (MHz) antenna to provide the maximum depth penetration and subsurface detail, yielding usable signals to a depth of about 6 to 10 feet in this environment, and allowed discrimination of objects that were deeper, which was particularly useful in the southern area of the site where shallow metallic debris (primarily roof flashing) complicated interpretation of the EM and magnetic data. Several geophysical anomalies were defined on the contour plots that indicated the presence of buried metal. During the first phase of the project, nine anomalies or anomalous areas were detected. The sizes, shapes, and magnitudes of the anomalies varied considerably, but given the anticipated size of the primary target of the investigation, only the most prominent anomalies were considered as potential caches of 30 to 60 buried drums. After completion of a second-phase investigation, only two of the anomalies were of sufficient magnitude, not identifiable with existing known metallic objects such as monitoring wells, and in positions that corresponded to the locations of alleged dumping activities; these were recommended for further, intrusive investigation. Other important findings, based on the variable-frequency EM method and its combination with total-field magnetic and GPR data, included the confirmation of the position of the old NSDD, the ability to differentiate between ferrous and non-ferrous anomalies, and the detection of what may be plumes emanating from the landfill cell

  3. Efficient behavior of photosynthetic organelles via Pareto optimality, identifiability, and sensitivity analysis.

    Science.gov (United States)

    Carapezza, Giovanni; Umeton, Renato; Costanza, Jole; Angione, Claudio; Stracquadanio, Giovanni; Papini, Alessio; Lió, Pietro; Nicosia, Giuseppe

    2013-05-17

    In this work, we develop methodologies for analyzing and cross-comparing metabolic models. We investigate three important metabolic networks to discuss the complexity of the biological organization of organisms, modeling, and system properties. In particular, we analyze these metabolic networks because of their biotechnological and basic-science importance: the photosynthetic carbon metabolism in a general leaf, the Rhodobacter spheroides bacterium, and the Chlamydomonas reinhardtii alga. We adopt single- and multi-objective optimization algorithms to maximize the CO2 uptake rate and the production of metabolites of industrial interest or for ecological purposes. We focus both on the level of genes (e.g., finding genetic manipulations to increase the production of one or more metabolites) and on finding enzyme concentrations for improving CO2 consumption. We find that R. spheroides is able to absorb up to 57.452 mmol CO2 h⁻¹ gDW⁻¹, while C. reinhardtii reaches a maximum of 6.7331. We report that the Pareto front analysis proves extremely useful for comparing different organisms, as well as providing the possibility to investigate them within the same framework. Using sensitivity and robustness analysis, our framework identifies the most sensitive and fragile components of the biological systems we take into account, allowing us to compare their models. We adopt identifiability analysis to detect functional relations among enzymes; we observe that RuBisCO, GAPDH, and FBPase belong to the same functional group, as suggested also by the sensitivity analysis.

  4. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Although the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the 12th century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged, respectively, to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique using a Jobin Yvon Horiba XGT-5000 instrument, offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  5. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Although the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the 12th century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged, respectively, to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique using a Jobin Yvon Horiba XGT-5000 instrument, offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  6. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag+, Ba2+, and Cd2+ in the concentration range from 10 ng/g to 1 μg/g; for Cu2+ and Pb2+ from 10 ng/g to 5 μg/g; and for Hg2+ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag+, Ba2+, Cd2+, Cu2+, and Pb2+ share common binding sites, with binding efficiencies varying in the sequence Pb2+ > Cu2+ > Ag+ > Cd2+ > Ba2+. The binding of Hg2+ involved a different binding site, with an increase in binding efficiency in the presence of Ag+. (orig.)
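    The linear concentration responses reported above lend themselves to an ordinary calibration-curve treatment. As a hedged sketch (counts and concentrations below are invented, not the paper's data), one can fit the PIXE X-ray yield against known concentrations by least squares and invert the line for an unknown sample:

      import numpy as np

      # Hypothetical calibration data for one metal ion (e.g., Cd2+):
      # known concentrations (ng/g) and measured PIXE X-ray yields (counts).
      conc   = np.array([10, 50, 100, 500, 1000])     # ng/g
      counts = np.array([52, 248, 495, 2510, 4985])   # invented yields

      slope, intercept = np.polyfit(conc, counts, 1)  # counts = slope*conc + intercept

      def concentration(measured_counts):
          """Invert the calibration line to estimate concentration (ng/g)."""
          return (measured_counts - intercept) / slope

      print(f"{concentration(1200.0):.0f} ng/g")      # estimate for an unknown sample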

  7. Damage Detection and Analysis in CFRPs Using Acoustic Emission Technique

    Science.gov (United States)

    Whitlow, Travis Laron

    Real-time monitoring of damage is an important aspect of life management of critical structures. Acoustic emission (AE) techniques allow for measurement and assessment of damage in real time. Acoustic emission parameters such as signal amplitude and duration were monitored during the loading sequences. Criteria that can indicate the onset of critical damage to the structure were developed. Tracking the damage as it happens gives a better analysis of the failure evolution, allowing a more accurate determination of structural life. The main challenge is distinguishing between legitimate damage signals and "false positives" which are unrelated to damage growth; such false positives can be related to electrical noise, friction, or mechanical vibrations. This research focuses on monitoring signals of damage growth in carbon fiber reinforced polymers (CFRPs) and separating the relevant signals from the false ones. In this dissertation, acoustic emission signals from CFRP specimens were experimentally recorded and analyzed. The objectives of this work are: (1) perform static and fatigue loading of CFRP composite specimens and measure the associated AE signals, (2) accurately determine the AE parameters (energy, frequency, duration, etc.) of signals generated during failure of such specimens, (3) use fiber optic sensors to monitor the strain distribution of the damage zone and relate these changes in strain measurements to AE data.
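    A common first pass at separating damage-related AE hits from electrical or frictional noise is parametric gating on amplitude, duration, and frequency. The thresholds below are hypothetical placeholders for illustration, not values from this dissertation:

      # Sketch: parametric gating of acoustic-emission hits. Field names and
      # thresholds are hypothetical and must be tuned for a real setup.
      def is_damage_candidate(hit,
                              min_amplitude_db=45.0,    # reject low-level noise
                              min_duration_us=30.0,     # reject EMI spikes
                              max_duration_us=10000.0,  # reject rubbing/friction
                              band_khz=(100.0, 500.0)): # plausible damage band
          lo, hi = band_khz
          return (hit["amplitude_db"] >= min_amplitude_db
                  and min_duration_us <= hit["duration_us"] <= max_duration_us
                  and lo <= hit["peak_freq_khz"] <= hi)

      hits = [
          {"amplitude_db": 62, "duration_us": 180, "peak_freq_khz": 260},  # plausible damage
          {"amplitude_db": 38, "duration_us": 15,  "peak_freq_khz": 900},  # likely EMI spike
      ]
      print([is_damage_candidate(h) for h in hits])   # [True, False]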

  8. Comparative Analysis of Different LIDAR System Calibration Techniques

    Science.gov (United States)

    Miller, M.; Habib, A.

    2016-06-01

    With light detection and ranging (LiDAR) now being a crucial tool for engineering products and on-the-fly spatial analysis, it is necessary for the user community to have standardized calibration methods. The three methods in this study were developed and proven by the Digital Photogrammetry Research Group (DPRG) for airborne LiDAR systems and are as follows: Simplified, Quasi-Rigorous, and Rigorous. In lieu of using expensive control surfaces for calibration, these methods compare overlapping LiDAR strips to estimate the systematic errors. These systematic errors are quantified by these methods and include the lever-arm biases, boresight biases, range bias and scan-angle scale bias. These three methods comprehensively represent all of the possible flight configurations and data availability. This paper tests the limits of the method with the most assumptions, the Simplified calibration, by using data that violates the assumptions its math model is based on, and compares the results to the Quasi-Rigorous and Rigorous techniques. The overarching goal is to provide a LiDAR system calibration that does not require raw measurements and can be carried out with minimal control and flight lines to reduce costs. This testing is unique because the terrain used for calibration does not contain gable roofs; all other LiDAR system calibration testing and development has been done with terrain containing features of high geometric integrity, such as gable roofs.

  9. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective means to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin) sponsored by NRC and FA (Fragility Analysis) sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  10. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
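    The contrast between the two sampling schemes can be sketched in a few lines. In conventional source-sampling every starter is drawn from the pooled fission bank, so a weakly coupled unit may receive few or no starters in a generation; stratification fixes the per-unit count. The units and counts below are invented, not the paper's configurations:

      import random

      # Toy contrast for a loosely coupled four-unit array; not the paper's model.
      random.seed(1)
      units = ["A", "B", "C", "D"]
      N = 40                                  # source particles per generation

      # Conventional: all starters drawn from the pooled bank; per-unit
      # counts fluctuate and a unit can be starved entirely.
      conventional = [random.choice(units) for _ in range(N)]
      print({u: conventional.count(u) for u in units})

      # Stratified: an equal number of starters is forced into every unit
      # (a real code would then weight starters by each unit's share of the
      # fission source so total source strength is preserved).
      print({u: N // len(units) for u in units})      # always 10 per unit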

  11. An evaluation of wind turbine blade cross section analysis techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber and resin composite material and, typically, one or more shear webs. Large turbine blades being developed today are beyond the point of effective trial-and-error design of the past, and design for reliability is extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in outputs from each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade are compared.
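    The second approach above, backing beam properties out of static deflections, can be illustrated with the textbook cantilever relation delta = P*L^3/(3*EI): apply a known tip load to the 3-D finite element model, read the tip deflection, and solve for an effective bending stiffness. A hedged sketch with invented numbers, not the report's procedure:

      # Sketch: recover an effective bending stiffness EI from a static
      # tip-load deflection via delta = P*L**3 / (3*EI). Values are invented.
      P     = 1000.0    # applied tip load, N
      L     = 9.0       # cantilever span, m
      delta = 0.012     # tip deflection read from the 3-D FE model, m

      EI = P * L**3 / (3.0 * delta)            # effective stiffness, N*m^2
      print(f"effective EI = {EI:.3e} N*m^2")  # 2.025e+07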

  12. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective means to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin) sponsored by NRC and FA (Fragility Analysis) sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  13. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thicknessdirection. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  14. Gene expression signature analysis identifies vorinostat as a candidate therapy for gastric cancer.

    Directory of Open Access Journals (Sweden)

    Sofie Claerhout

    Full Text Available BACKGROUND: Gastric cancer continues to be one of the deadliest cancers in the world; identification of new drugs targeting this type of cancer is therefore of significant importance. The purpose of this study was to identify and validate a therapeutic agent which might improve the outcomes for gastric cancer patients in the future. METHODOLOGY/PRINCIPAL FINDINGS: Using microarray technology, we generated a gene expression profile of human gastric cancer-specific genes from human gastric cancer tissue samples. We used this profile in the Broad Institute's Connectivity Map analysis to identify candidate therapeutic compounds for gastric cancer. We found the histone deacetylase inhibitor vorinostat to be the lead compound and thus a potential therapeutic drug for gastric cancer. Vorinostat induced both apoptosis and autophagy in gastric cancer cell lines. Pharmacological and genetic inhibition of autophagy, however, increased the therapeutic efficacy of vorinostat, indicating that a combination of vorinostat with autophagy inhibitors may be more beneficial therapeutically. Moreover, gene expression analysis of gastric cancer identified a collection of genes (ITGB5, TYMS, MYB, APOC1, CBX5, PLA2G2A, and KIF20A) whose expression was elevated in gastric tumor tissue and downregulated more than 2-fold by vorinostat treatment in gastric cancer cell lines. In contrast, SCGB2A1, TCN1, CFD, APLP1, and NQO1 manifested a reversed pattern. CONCLUSIONS/SIGNIFICANCE: We showed that analysis of gene expression signatures may represent an emerging approach to discover therapeutic agents for gastric cancer, such as vorinostat. The observation of altered gene expression after vorinostat treatment may provide the clue to identify the molecular mechanism of vorinostat and those patients likely to benefit from vorinostat treatment.

  15. Protein functional links in Trypanosoma brucei, identified by gene fusion analysis

    Directory of Open Access Journals (Sweden)

    Trimpalis Philip

    2011-07-01

    Full Text Available Abstract Background Domain or gene fusion analysis is a bioinformatics method for detecting gene fusions in one organism by comparing its genome to that of other organisms. The occurrence of gene fusions suggests that the two original genes that participated in the fusion are functionally linked, i.e. their gene products interact either as part of a multi-subunit protein complex, or in a metabolic pathway. Gene fusion analysis has been used to identify protein functional links in prokaryotes as well as in eukaryotic model organisms, such as yeast and Drosophila. Results In this study we have extended this approach to include a number of recently sequenced protists, four of which are pathogenic, to identify fusion linked proteins in Trypanosoma brucei, the causative agent of African sleeping sickness. We have also examined the evolution of the gene fusion events identified, to determine whether they can be attributed to fusion or fission, by looking at the conservation of the fused genes and of the individual component genes across the major eukaryotic and prokaryotic lineages. We find relatively limited occurrence of gene fusions/fissions within the protist lineages examined. Our results point to two trypanosome-specific gene fissions, which have recently been experimentally confirmed, one fusion involving proteins involved in the same metabolic pathway, as well as two novel putative functional links between fusion-linked protein pairs. Conclusions This is the first study of protein functional links in T. brucei identified by gene fusion analysis. We have used strict thresholds and only discuss results which are highly likely to be genuine and which either have already been or can be experimentally verified. We discuss the possible impact of the identification of these novel putative protein-protein interactions, to the development of new trypanosome therapeutic drugs.

  16. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. The assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether or not the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test sets. The observed activity of carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, sustain a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
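    The cluster check described above can be reproduced in outline with scikit-learn: cluster the pooled compounds with K-means and verify that every cluster contains members of both the training and test sets. The feature matrix below is random stand-in data, not the carboquinone descriptors:

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      X = rng.normal(size=(30, 5))        # stand-in: activity + 4 descriptors
      is_train = rng.random(30) < 0.7     # stand-in train/test split

      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

      # A proper random split places train and test compounds in every cluster.
      for k in range(3):
          in_k = labels == k
          print(k, "train:", int((in_k & is_train).sum()),
                   "test:", int((in_k & ~is_train).sum()))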

  17. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    Science.gov (United States)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply as well as the time required for the crew to abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan, and of how that demand changes as a result of changes to the system architecture, enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and the probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly complex systems by gradually expanding the system boundary.
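    The binomial technique mentioned above can be sketched directly: if each installed unit fails independently with a fixed probability over the mission and s spares are carried, the probability of losing the function is the probability of more than s failures. The component count and failure probability below are illustrative only, not CRS data:

      from math import comb

      def p_loss_of_function(n_units, p_fail, spares):
          """P(more failures than spares) for independent, identical units."""
          p_covered = sum(comb(n_units, k) * p_fail**k * (1 - p_fail)**(n_units - k)
                          for k in range(spares + 1))
          return 1.0 - p_covered

      # Illustrative sweep: 4 installed units, 5% mission failure probability each.
      for s in range(4):
          print(s, "spares ->", f"{p_loss_of_function(4, 0.05, s):.4%}")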

  18. Shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity.

    Directory of Open Access Journals (Sweden)

    J R Managbanag

    Full Text Available BACKGROUND: Identification of genes that modulate longevity is a major focus of aging-related research and an area of intense public interest. In addition to facilitating an improved understanding of the basic mechanisms of aging, such genes represent potential targets for therapeutic intervention in multiple age-associated diseases, including cancer, heart disease, diabetes, and neurodegenerative disorders. To date, however, targeted efforts at identifying longevity-associated genes have been limited by a lack of predictive power, and useful algorithms for candidate gene-identification have also been lacking. METHODOLOGY/PRINCIPAL FINDINGS: We have utilized a shortest-path network analysis to identify novel genes that modulate longevity in Saccharomyces cerevisiae. Based on a set of previously reported genes associated with increased life span, we applied a shortest-path network algorithm to a pre-existing protein-protein interaction dataset in order to construct a shortest-path longevity network. To validate this network, the replicative aging potential of 88 single-gene deletion strains corresponding to predicted components of the shortest-path longevity network was determined. Here we report that the single-gene deletion strains identified by our shortest-path longevity analysis are significantly enriched for mutations conferring either increased or decreased replicative life span, relative to a randomly selected set of 564 single-gene deletion strains or to the current data set available for the entire haploid deletion collection. Further, we report the identification of previously unknown longevity genes, several of which function in a conserved longevity pathway believed to mediate life span extension in response to dietary restriction. CONCLUSIONS/SIGNIFICANCE: This work demonstrates that shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity and represents the first application of
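    In outline, the shortest-path construction works as in the following sketch (using networkx; the interaction edges and gene names are toy stand-ins, not the yeast dataset): compute all shortest paths between pairs of seed longevity genes through the protein-protein interaction graph and pool the intermediate nodes as new candidates.

      import networkx as nx
      from itertools import combinations

      # Toy protein-protein interaction network; edges are invented.
      G = nx.Graph([("SIR2", "FOB1"), ("FOB1", "NET1"), ("NET1", "CDC14"),
                    ("SIR2", "RPD3"), ("RPD3", "SIN3"), ("SIN3", "CDC14")])

      seeds = ["SIR2", "CDC14"]        # stand-ins for known longevity genes

      # Union of all shortest paths between seed pairs = shortest-path network.
      candidates = set()
      for a, b in combinations(seeds, 2):
          for path in nx.all_shortest_paths(G, a, b):
              candidates.update(path)
      candidates -= set(seeds)         # keep only the new, intermediate genes
      print(sorted(candidates))        # ['FOB1', 'NET1', 'RPD3', 'SIN3']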

  19. Space-Time Analysis to Identify Areas at Risk of Mortality from Cardiovascular Disease

    Directory of Open Access Journals (Sweden)

    Poliany C. O. Rodrigues

    2015-01-01

    Full Text Available This study aimed at identifying areas that were at risk of mortality due to cardiovascular disease in residents aged 45 years or older of the cities of Cuiabá and Várzea Grande between 2009 and 2011. We conducted an ecological study of mortality rates related to cardiovascular disease. Mortality rates were calculated for each census tract by the Local Empirical Bayes estimator. High- and low-risk clusters were identified by retrospective space-time scans for each year using the Poisson probability model. We defined the year and month as the temporal analysis unit and the census tracts as the spatial analysis units adjusted by age and sex. The Mann-Whitney U test was used to compare the socioeconomic and environmental variables by risk classification. High-risk clusters showed higher income ratios than low-risk clusters, as did temperature range and atmospheric particulate matter. Low-risk clusters showed higher humidity than high-risk clusters. The Eastern region of Várzea Grande and the central region of Cuiabá were identified as areas at risk of mortality due to cardiovascular disease in individuals aged 45 years or older. High mortality risk was associated with socioeconomic and environmental factors. More high-risk clusters were observed at the end of the dry season.
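    The core of a Poisson space-time scan of this kind is a log-likelihood ratio comparing observed and expected deaths inside a candidate space-time cylinder against the remainder; the cylinder with the highest ratio is the most likely cluster, with significance assessed by Monte Carlo replication. A minimal sketch in the style of Kulldorff's statistic, with invented counts:

      from math import log

      def poisson_llr(c_in, e_in, c_tot, e_tot):
          """Log-likelihood ratio for one candidate cluster (nonzero counts
          assumed; high-risk direction only)."""
          c_out, e_out = c_tot - c_in, e_tot - e_in
          if c_in / e_in <= c_out / e_out:
              return 0.0
          return c_in * log(c_in / e_in) + c_out * log(c_out / e_out)

      # Invented example: 40 CVD deaths observed vs 22 expected inside the
      # cylinder, out of 300 observed and 300 expected overall.
      print(f"LLR = {poisson_llr(40, 22, 300, 300):.2f}")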

  20. Design Analysis Rules to Identify Proper Noun from Bengali Sentence for Universal Networking Language

    Directory of Open Access Journals (Sweden)

    Md. Syeful Islam

    2014-08-01

    Full Text Available Nowadays hundreds of millions of people of almost all levels of education and attitudes, from different countries, communicate with each other for different purposes and perform their jobs over the internet or other communication media using various languages. Not all people know all languages; therefore it is very difficult to communicate or work across various languages. In this situation computer scientists have introduced various inter-language translation programs (machine translation). UNL is one such inter-language translation program. One of the major problems of UNL is identifying a name in a sentence, which is relatively simple in English, because such entities start with a capital letter. In Bangla there is no concept of small or capital letters, so it is difficult to determine whether a word is a proper noun or not. Here we have proposed analysis rules to identify proper nouns in a sentence and established a post-converter which translates the name entity from Bangla to UNL. The goal is to make possible Bangla sentence conversion to UNL and vice versa. Theoretical analysis shows that our proposed system is able to identify proper nouns in Bangla sentences and produce the corresponding Universal Words for UNL.
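    For illustration only, a rule-based spotter for a language without capitalization can be sketched as below; the word lists and the honorific context rule are hypothetical placeholders on romanized tokens, not the paper's actual Bangla rules.

      # Toy sketch of rule-based proper-noun spotting without capitalization
      # cues. Lists and rules are hypothetical placeholders.
      HONORIFICS = {"mr", "mrs", "dr"}           # a name tends to follow these
      KNOWN_NAMES = {"rahim", "karim", "dhaka"}  # small gazetteer stand-in

      def proper_nouns(tokens):
          found = set()
          for i, tok in enumerate(tokens):
              if tok in KNOWN_NAMES:
                  found.add(tok)
              elif i > 0 and tokens[i - 1] in HONORIFICS:
                  found.add(tok)                 # context rule: follows honorific
          return found

      print(proper_nouns("dr rahim lives in dhaka".split()))  # {'rahim', 'dhaka'}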

  1. Proteomic analysis of cerebrospinal fluid in California sea lions (Zalophus californianus) with domoic acid toxicosis identifies proteins associated with neurodegeneration.

    Science.gov (United States)

    Neely, Benjamin A; Soper, Jennifer L; Gulland, Frances M D; Bell, P Darwin; Kindy, Mark; Arthur, John M; Janech, Michael G

    2015-12-01

    Proteomic studies involving marine mammals are rare, largely due to the lack of fully sequenced genomes. This has hampered the application of these techniques toward biomarker discovery efforts for monitoring of health and disease in these animals. We conducted a pilot label-free LC-MS/MS study to profile and compare the cerebrospinal fluid from California sea lions with domoic acid toxicosis (DAT) and without DAT. Across 11 samples, a total of 206 proteins were identified (FDR-controlled). Seven proteins differed in sea lions with DAT: complement C3, complement factor B, dickkopf-3, malate dehydrogenase 1, neuron cell adhesion molecule 1, gelsolin, and neuronal cell adhesion molecule. Immunoblot analysis found reelin to be depressed in the cerebrospinal fluid from California sea lions with DAT. Mice administered domoic acid also had lower hippocampal reelin protein levels, suggesting that domoic acid depresses reelin similarly to kainic acid. In summary, proteomic analysis of cerebrospinal fluid in marine mammals is a useful tool to characterize the underlying molecular pathology of neurodegenerative disease. All MS data have been deposited in the ProteomeXchange with identifier PXD002105 (http://proteomecentral.proteomexchange.org/dataset/PXD002105).

  2. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton populations was developed. For samples mixed in the laboratory from the 43 algal species (the algae of one division accounting for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the division level were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for samples mixed from 32 red tide algal species (the dominant species accounting for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the genus level were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory-mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the division and genus levels, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algal fluorescence auto-analyzer for
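    A hedged outline of such a wavelet-based discrimination, using the PyWavelets library on random stand-in spectra rather than the measured EEMS database: decompose each emission spectrum, keep the low-frequency approximation coefficients as the signature, and assign an unknown sample to the nearest reference signature.

      import numpy as np
      import pywt

      rng = np.random.default_rng(0)

      def signature(spectrum, wavelet="db4", level=3):
          """Low-frequency wavelet approximation used as a spectral signature."""
          return pywt.wavedec(spectrum, wavelet, level=level)[0]

      # Stand-in reference database: three divisions x simulated 151-point spectra.
      reference = {name: signature(rng.normal(size=151) + i)
                   for i, name in enumerate(["Bacillariophyta", "Dinophyta", "Chlorophyta"])}

      def classify(spectrum):
          s = signature(spectrum)
          return min(reference, key=lambda n: np.linalg.norm(reference[n] - s))

      print(classify(rng.normal(size=151) + 2))  # expected: 'Chlorophyta' (+2 offset)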

  3. Comparative Analysis of Vehicle Make and Model Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Faiza Ayub Syed

    2014-03-01

    Full Text Available Vehicle Make and Model Recognition (VMMR) has emerged as a significant element of vision-based systems because of its applications in access control systems, traffic control and monitoring systems, security systems and surveillance systems, etc. So far a number of techniques have been developed for vehicle recognition. Each technique follows a different methodology and classification approach. The evaluation results highlight the recognition technique with the highest accuracy level. In this paper we describe the working of various vehicle make and model recognition techniques and compare these techniques on the basis of methodology, principles, classification approach, classifier and level of recognition. After comparing these factors we conclude that Locally Normalized Harris Corner Strengths (LHNS) performs best compared to other techniques. LHNS uses Bayes and K-NN classification approaches for vehicle classification. It extracts information from the frontal view of vehicles for vehicle make and model recognition.

  4. Analysis of Jugular Foramen Exposure in the Fallopian Bridge Technique

    OpenAIRE

    Satar, Bulent; Yazar, Fatih; Aykut CEYHAN; Arslan, Hasan Huseyin; Aydin, Sedat

    2009-01-01

    Objective: To analyze the exposure of the jugular foramen afforded by the fallopian bridge technique. Method: The jugular foramen exposure was obtained using the jugular foramen approach combined with the fallopian bridge technique. We applied this technique using 10 temporal bone specimens at a tertiary referral center. The exposure was assessed by means of depth of the dissection field and two separate dissection spaces that were created anteriorly and posteriorly to the facial nerve. Anter...

  5. A meta-analysis of 120 246 individuals identifies 18 new loci for fibrinogen concentration.

    Science.gov (United States)

    de Vries, Paul S; Chasman, Daniel I; Sabater-Lleal, Maria; Chen, Ming-Huei; Huffman, Jennifer E; Steri, Maristella; Tang, Weihong; Teumer, Alexander; Marioni, Riccardo E; Grossmann, Vera; Hottenga, Jouke J; Trompet, Stella; Müller-Nurasyid, Martina; Zhao, Jing Hua; Brody, Jennifer A; Kleber, Marcus E; Guo, Xiuqing; Wang, Jie Jin; Auer, Paul L; Attia, John R; Yanek, Lisa R; Ahluwalia, Tarunveer S; Lahti, Jari; Venturini, Cristina; Tanaka, Toshiko; Bielak, Lawrence F; Joshi, Peter K; Rocanin-Arjo, Ares; Kolcic, Ivana; Navarro, Pau; Rose, Lynda M; Oldmeadow, Christopher; Riess, Helene; Mazur, Johanna; Basu, Saonli; Goel, Anuj; Yang, Qiong; Ghanbari, Mohsen; Willemsen, Gonneke; Rumley, Ann; Fiorillo, Edoardo; de Craen, Anton J M; Grotevendt, Anne; Scott, Robert; Taylor, Kent D; Delgado, Graciela E; Yao, Jie; Kifley, Annette; Kooperberg, Charles; Qayyum, Rehan; Lopez, Lorna M; Berentzen, Tina L; Räikkönen, Katri; Mangino, Massimo; Bandinelli, Stefania; Peyser, Patricia A; Wild, Sarah; Trégouët, David-Alexandre; Wright, Alan F; Marten, Jonathan; Zemunik, Tatijana; Morrison, Alanna C; Sennblad, Bengt; Tofler, Geoffrey; de Maat, Moniek P M; de Geus, Eco J C; Lowe, Gordon D; Zoledziewska, Magdalena; Sattar, Naveed; Binder, Harald; Völker, Uwe; Waldenberger, Melanie; Khaw, Kay-Tee; Mcknight, Barbara; Huang, Jie; Jenny, Nancy S; Holliday, Elizabeth G; Qi, Lihong; Mcevoy, Mark G; Becker, Diane M; Starr, John M; Sarin, Antti-Pekka; Hysi, Pirro G; Hernandez, Dena G; Jhun, Min A; Campbell, Harry; Hamsten, Anders; Rivadeneira, Fernando; Mcardle, Wendy L; Slagboom, P Eline; Zeller, Tanja; Koenig, Wolfgang; Psaty, Bruce M; Haritunians, Talin; Liu, Jingmin; Palotie, Aarno; Uitterlinden, André G; Stott, David J; Hofman, Albert; Franco, Oscar H; Polasek, Ozren; Rudan, Igor; Morange, Pierre-Emmanuel; Wilson, James F; Kardia, Sharon L R; Ferrucci, Luigi; Spector, Tim D; Eriksson, Johan G; Hansen, Torben; Deary, Ian J; Becker, Lewis C; Scott, Rodney J; Mitchell, Paul; März, Winfried; Wareham, Nick J; Peters, Annette; Greinacher, Andreas; Wild, Philipp S; Jukema, J Wouter; Boomsma, Dorret I; Hayward, Caroline; Cucca, Francesco; Tracy, Russell; Watkins, Hugh; Reiner, Alex P; Folsom, Aaron R; Ridker, Paul M; O'Donnell, Christopher J; Smith, Nicholas L; Strachan, David P; Dehghan, Abbas

    2016-01-15

    Genome-wide association studies have previously identified 23 genetic loci associated with circulating fibrinogen concentration. These studies used HapMap imputation and did not examine the X-chromosome. 1000 Genomes imputation provides better coverage of uncommon variants and includes indels. We conducted a genome-wide association analysis of 34 studies imputed to the 1000 Genomes Project reference panel, including ∼120 000 participants of European ancestry (95 806 participants with data on the X-chromosome). Approximately 10.7 million single-nucleotide polymorphisms and 1.2 million indels were examined. We identified 41 genome-wide significant fibrinogen loci, of which 18 were newly identified. There were no genome-wide significant signals on the X-chromosome. The lead variants of five significant loci were indels. We further identified six additional independent signals, including three rare variants, at two previously characterized loci: FGB and IRF1. Together the 41 loci explain 3% of the variance in plasma fibrinogen concentration.

  6. The systematic functional analysis of plasmodium protein kinases identifies essential regulators of mosquito transmission

    KAUST Repository

    Tewari, Rita

    2010-10-21

    Although eukaryotic protein kinases (ePKs) contribute to many cellular processes, only three Plasmodium falciparum ePKs have thus far been identified as essential for parasite asexual blood stage development. To identify pathways essential for parasite transmission between their mammalian host and mosquito vector, we undertook a systematic functional analysis of ePKs in the genetically tractable rodent parasite Plasmodium berghei. Modeling domain signatures of conventional ePKs identified 66 putative Plasmodium ePKs. Kinomes are highly conserved between Plasmodium species. Using reverse genetics, we show that 23 ePKs are redundant for asexual erythrocytic parasite development in mice. Phenotyping mutants at four life cycle stages in Anopheles stephensi mosquitoes revealed functional clusters of kinases required for sexual development and sporogony. Roles for a putative SR protein kinase (SRPK) in microgamete formation, a conserved regulator of clathrin uncoating (GAK) in ookinete formation, and a likely regulator of energy metabolism (SNF1/KIN) in sporozoite development were identified.

  7. Integrating Stakeholder Preferences and GIS-Based Multicriteria Analysis to Identify Forest Landscape Restoration Priorities

    Directory of Open Access Journals (Sweden)

    David Uribe

    2014-02-01

    Full Text Available A pressing question that arises during the planning of an ecological restoration process is: where to restore first? Answering this question is a complex task; it requires a multidimensional approach to consider economic constraints and the preferences of stakeholders. The problem being of a spatial nature, it may be explored effectively through Multicriteria Decision Analysis (MCDA) performed in a Geographical Information System (GIS) environment. The proposed approach is based on the definition and weighting of multiple criteria for evaluating land suitability. An MCDA-based methodology was used to identify priority areas for Forest Landscape Restoration in the Upper Mixtec region, Oaxaca (Mexico), one of the most degraded areas of Latin America. Socioeconomic and environmental criteria were selected and evaluated. The opinions of four different stakeholder groups were considered: general public, academic, non-governmental organizations (NGOs) and governmental officers. The preferences of these groups were spatially modeled to identify their priorities. The final result was a map that identifies the most preferable sites for restoration, where resources and efforts should be concentrated. MCDA proved to be a very useful tool in collective planning, when alternative sites have to be identified and prioritized to guide the restoration work.

  8. Unscented Kalman filter with parameter identifiability analysis for the estimation of multiple parameters in kinetic models

    Directory of Open Access Journals (Sweden)

    Baker Syed

    2011-01-01

    Full Text Available Abstract In systems biology, experimentally measured parameters are not always available, necessitating the use of computationally based parameter estimation. In order to rely on estimated parameters, it is critical to first determine which parameters can be estimated for a given model and measurement set. This is done with parameter identifiability analysis. A kinetic model of sucrose accumulation in sugar cane culm tissue developed by Rohwer et al. was taken as a test-case model. What differentiates this approach is the integration of an orthogonal-based local identifiability method into the unscented Kalman filter (UKF), rather than using the more common observability-based method, which has inherent limitations. It also introduces a variable step size based on the system uncertainty of the UKF during the sensitivity calculation. This method identified 10 out of 12 parameters as identifiable. These ten parameters were estimated using the UKF, which was run 97 times. Throughout the repetitions the UKF proved to be more consistent than the estimation algorithms used for comparison.
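    The orthogonal-based local identifiability step can be sketched independently of the UKF: assemble a sensitivity matrix of model outputs with respect to parameters, then apply QR factorization with column pivoting; parameters whose pivoted columns carry negligible R-diagonal magnitude are practically non-identifiable. The matrix below is random stand-in data, not the sucrose model:

      import numpy as np
      from scipy.linalg import qr

      rng = np.random.default_rng(0)

      # Stand-in sensitivity matrix S (n_timepoints x n_parameters); column j
      # holds d(output)/d(theta_j). Column 5 is made nearly redundant.
      S = rng.normal(size=(50, 6))
      S[:, 5] = 2.0 * S[:, 2] + 1e-9 * rng.normal(size=50)

      Q, R, piv = qr(S, pivoting=True)
      diag = np.abs(np.diag(R))
      identifiable = diag > 1e-6 * diag[0]     # heuristic relative threshold

      print("parameters in decreasing identifiability:", piv)
      print("identifiable flags (same order):", identifiable)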

  9. A meta-analysis of 120 246 individuals identifies 18 new loci for fibrinogen concentration.

    Science.gov (United States)

    de Vries, Paul S; Chasman, Daniel I; Sabater-Lleal, Maria; Chen, Ming-Huei; Huffman, Jennifer E; Steri, Maristella; Tang, Weihong; Teumer, Alexander; Marioni, Riccardo E; Grossmann, Vera; Hottenga, Jouke J; Trompet, Stella; Müller-Nurasyid, Martina; Zhao, Jing Hua; Brody, Jennifer A; Kleber, Marcus E; Guo, Xiuqing; Wang, Jie Jin; Auer, Paul L; Attia, John R; Yanek, Lisa R; Ahluwalia, Tarunveer S; Lahti, Jari; Venturini, Cristina; Tanaka, Toshiko; Bielak, Lawrence F; Joshi, Peter K; Rocanin-Arjo, Ares; Kolcic, Ivana; Navarro, Pau; Rose, Lynda M; Oldmeadow, Christopher; Riess, Helene; Mazur, Johanna; Basu, Saonli; Goel, Anuj; Yang, Qiong; Ghanbari, Mohsen; Willemsen, Gonneke; Rumley, Ann; Fiorillo, Edoardo; de Craen, Anton J M; Grotevendt, Anne; Scott, Robert; Taylor, Kent D; Delgado, Graciela E; Yao, Jie; Kifley, Annette; Kooperberg, Charles; Qayyum, Rehan; Lopez, Lorna M; Berentzen, Tina L; Räikkönen, Katri; Mangino, Massimo; Bandinelli, Stefania; Peyser, Patricia A; Wild, Sarah; Trégouët, David-Alexandre; Wright, Alan F; Marten, Jonathan; Zemunik, Tatijana; Morrison, Alanna C; Sennblad, Bengt; Tofler, Geoffrey; de Maat, Moniek P M; de Geus, Eco J C; Lowe, Gordon D; Zoledziewska, Magdalena; Sattar, Naveed; Binder, Harald; Völker, Uwe; Waldenberger, Melanie; Khaw, Kay-Tee; Mcknight, Barbara; Huang, Jie; Jenny, Nancy S; Holliday, Elizabeth G; Qi, Lihong; Mcevoy, Mark G; Becker, Diane M; Starr, John M; Sarin, Antti-Pekka; Hysi, Pirro G; Hernandez, Dena G; Jhun, Min A; Campbell, Harry; Hamsten, Anders; Rivadeneira, Fernando; Mcardle, Wendy L; Slagboom, P Eline; Zeller, Tanja; Koenig, Wolfgang; Psaty, Bruce M; Haritunians, Talin; Liu, Jingmin; Palotie, Aarno; Uitterlinden, André G; Stott, David J; Hofman, Albert; Franco, Oscar H; Polasek, Ozren; Rudan, Igor; Morange, Pierre-Emmanuel; Wilson, James F; Kardia, Sharon L R; Ferrucci, Luigi; Spector, Tim D; Eriksson, Johan G; Hansen, Torben; Deary, Ian J; Becker, Lewis C; Scott, Rodney J; Mitchell, Paul; März, Winfried; Wareham, Nick J; Peters, Annette; Greinacher, Andreas; Wild, Philipp S; Jukema, J Wouter; Boomsma, Dorret I; Hayward, Caroline; Cucca, Francesco; Tracy, Russell; Watkins, Hugh; Reiner, Alex P; Folsom, Aaron R; Ridker, Paul M; O'Donnell, Christopher J; Smith, Nicholas L; Strachan, David P; Dehghan, Abbas

    2016-01-15

    Genome-wide association studies have previously identified 23 genetic loci associated with circulating fibrinogen concentration. These studies used HapMap imputation and did not examine the X-chromosome. 1000 Genomes imputation provides better coverage of uncommon variants and includes indels. We conducted a genome-wide association analysis of 34 studies imputed to the 1000 Genomes Project reference panel, including ∼120 000 participants of European ancestry (95 806 participants with data on the X-chromosome). Approximately 10.7 million single-nucleotide polymorphisms and 1.2 million indels were examined. We identified 41 genome-wide significant fibrinogen loci, of which 18 were newly identified. There were no genome-wide significant signals on the X-chromosome. The lead variants of five significant loci were indels. We further identified six additional independent signals, including three rare variants, at two previously characterized loci: FGB and IRF1. Together the 41 loci explain 3% of the variance in plasma fibrinogen concentration. PMID:26561523

  10. Biomechanical analysis of cross-country skiing techniques.

    Science.gov (United States)

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  11. Identifying E-Business Model: A Value Chain-Based Analysis

    Institute of Scientific and Technical Information of China (English)

    ZENG Qingfeng; HUANG Lihua

    2004-01-01

    E-business will change the ways that all companies do business, and most traditional businesses will evolve from their current business model to a combination of place and space via an e-business model. Choosing the proper e-business model has become an important strategic concern for a company to succeed. The main objective of this paper is to investigate an analysis framework for identifying e-business models based on the e-business process, moving from the value chain to the value net perspective. This paper provides a theoretical framework for identifying e-business models, and arrives at 11 e-business models. The strategic intent of each e-business model is discussed at the end of this paper. An enterprise e-business model design and implementation can be specified by the combination of one or more of the 11 e-business models.

  12. Proteomic analysis of cell lines to identify the irinotecan resistance proteins

    Indian Academy of Sciences (India)

    Xing-Chen Peng; Feng-Ming Gong; Meng Wei; Xi Chen; Ye Chen; Ke Cheng; Feng Gao; Feng Xu; Feng Bi; Ji-Yan Liu

    2010-12-01

    Chemotherapeutic drug resistance is a frequent cause of treatment failure in colon cancer patients. Several mechanisms have been implicated in drug resistance; however, they are not sufficient to exhaustively account for the emergence of this resistance. In this study, two-dimensional gel electrophoresis (2-DE) and PDQuest software analysis were applied to compare the differential expression of irinotecan-resistance-associated proteins in human colon adenocarcinoma LoVo cells and irinotecan-resistant LoVo cells (LoVo/irinotecan). The differential protein spots were excised and analysed by ESI-Q-TOF mass spectrometry (MS). Fifteen proteins were identified, including eight proteins with decreased expression and seven proteins with increased expression. The identified known proteins included those that function in diverse biological processes such as cellular transcription, cell apoptosis, electron transport/redox regulation, cell proliferation/differentiation and retinol metabolism pathways. Identification of such proteins could allow improved understanding of the mechanisms leading to the acquisition of chemoresistance.

  13. Model of Applying Data Mining Techniques in Identification, Segmentation and Analysis of Customers' Behaviour of Electronic Banking Services

    OpenAIRE

    Mohammad Khanbabaei; Seyedeh Fatemeh Zeinolabedini

    2013-01-01

    Banks need to identify and analyze the behavior of their customers in order to offer electronic services to them. In high-volume customer data sets, data mining techniques can help to extract hidden knowledge for supporting marketing decisions. The main problem is how to use data mining and the RFM analysis model in the identification and analysis of customer behavior in order to segment, classify and select groups of valuable customers. The proposed model in this paper is based on the CRISP-DM sta...
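    In outline, the RFM part of such a model reduces to scoring each customer on recency, frequency, and monetary value and ranking the combined score; the transactions below are invented stand-ins, and simple ranks replace the usual quantile scores.

      import pandas as pd

      # Toy RFM table from invented transactions.
      tx = pd.DataFrame({
          "customer": ["a", "a", "b", "c", "c", "c"],
          "days_ago": [5, 40, 90, 2, 10, 30],      # recency inputs
          "amount":   [120, 60, 35, 300, 80, 45],  # monetary inputs
      })
      rfm = tx.groupby("customer").agg(recency=("days_ago", "min"),
                                       frequency=("customer", "size"),
                                       monetary=("amount", "sum"))
      rfm["score"] = (rfm["recency"].rank(ascending=False)  # recent = better
                      + rfm["frequency"].rank()
                      + rfm["monetary"].rank())
      print(rfm.sort_values("score", ascending=False))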

  14. Finite Element Creep-Fatigue Analysis of a Welded Furnace Roll for Identifying Failure Root Cause

    Science.gov (United States)

    Yang, Y. P.; Mohr, W. C.

    2015-11-01

    Creep-fatigue induced failures are often observed in engineering components operating under high temperature and cyclic loading. Understanding the creep-fatigue damage process and identifying the failure root cause are very important for preventing such failures and improving the lifetime of engineering components. Finite element analyses, including a heat transfer analysis and a creep-fatigue analysis, were conducted to model the cyclic thermal and mechanical process of a furnace roll in a continuous hot-dip coating line. Because the roll typically has a short life, the heat transfer analysis was conducted to predict the temperature history of the roll by modeling heat convection from the hot air inside the furnace. The creep-fatigue analysis was performed by inputting the predicted temperature history and applying mechanical loads. The analysis results showed that the failure resulted from a creep-fatigue mechanism rather than a creep mechanism. The difference in material properties between the filler metal and the base metal is the root cause of the roll failure, inducing higher creep strain and stress at the interface between the weld and the heat-affected zone (HAZ).
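    A standard way to turn such results into a life estimate, though not necessarily the authors' exact procedure, is the linear cycle-fraction/time-fraction interaction rule: sum fatigue damage n/N_f and creep damage t/t_r per operating cycle and find when the combined damage reaches the allowable envelope. All values below are invented placeholders:

      # Sketch: linear creep-fatigue damage summation per thermal cycle,
      # D = 1/N_f + t_hold/t_r, failure when accumulated D reaches D_allow.
      N_f     = 2.0e4   # cycles to failure at the predicted strain range
      t_r     = 5.0e4   # creep rupture time (h) at the predicted stress/temperature
      t_hold  = 6.0     # hold time at temperature per cycle (h)
      D_allow = 1.0     # interaction envelope (design codes often use < 1)

      damage_per_cycle = 1.0 / N_f + t_hold / t_r
      print(f"estimated life: {D_allow / damage_per_cycle:.0f} cycles")  # ~5882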

  15. Gene-network analysis identifies susceptibility genes related to glycobiology in autism.

    Directory of Open Access Journals (Sweden)

    Bert van der Zwaag

    Full Text Available The recent identification of copy-number variation in the human genome has opened up new avenues for the discovery of positional candidate genes underlying complex genetic disorders, especially in the field of psychiatric disease. One major challenge that remains is pinpointing the susceptibility genes in the multitude of disease-associated loci. This challenge may be tackled by reconstruction of functional gene-networks from the genes residing in these loci. We applied this approach to autism spectrum disorder (ASD), and identified the copy-number changes in the DNA of 105 ASD patients and 267 healthy individuals with Illumina HumanHap300 BeadChips. Subsequently, we used a human reconstructed gene-network, Prioritizer, to rank candidate genes in the segmental gains and losses in our autism cohort. This analysis highlighted several candidate genes already known to be mutated in cognitive and neuropsychiatric disorders, including RAI1, BRD1, and LARGE. In addition, the LARGE gene was part of a sub-network of seven genes functioning in glycobiology, present in seven copy-number changes specifically identified in autism patients with limited co-morbidity. Three of these seven copy-number changes were de novo in the patients. In autism patients with a complex phenotype and in healthy controls no such sub-network was identified. An independent systematic analysis of 13 published autism susceptibility loci supports the involvement of genes related to glycobiology, as we also identified the same or similar genes from those loci. Our findings suggest that genomic gains and losses of genes associated with glycobiology are important contributors to the development of ASD.

  16. Systematic enrichment analysis of gene expression profiling studies identifies consensus pathways implicated in colorectal cancer development

    Directory of Open Access Journals (Sweden)

    Jesús Lascorz

    2011-01-01

    Full Text Available Background: A large number of gene expression profiling (GEP) studies on colorectal carcinogenesis have been performed, but no reliable gene signature has been identified so far due to the lack of reproducibility in the reported genes. There is growing evidence that functionally related genes, rather than individual genes, contribute to the etiology of complex traits. We used, as a novel approach, pathway enrichment tools to define functionally related genes that are consistently up- or down-regulated in colorectal carcinogenesis. Materials and Methods: We started the analysis with 242 unique annotated genes that had been reported by any of three recent meta-analyses covering GEP studies on genes differentially expressed in carcinoma vs normal mucosa. Most of these genes (218, 91.9%) had been reported in at least three GEP studies. These 242 genes were submitted to bioinformatic analysis using a total of nine tools to detect enrichment of Gene Ontology (GO) categories or Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways. As a final consistency criterion, the pathway categories had to be enriched by several tools to be taken into consideration. Results: Our pathway-based enrichment analysis identified the categories of ribosomal protein constituents, extracellular matrix receptor interaction, carbonic anhydrase isozymes, and a general category related to inflammation and cellular response as significantly and consistently overrepresented entities. Conclusions: We triaged the genes covered by the published GEP literature on colorectal carcinogenesis and subjected them to multiple enrichment tools in order to identify the consistently enriched gene categories. These turned out to have known functional relationships to cancer development and thus deserve further investigation.
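    The enrichment computation underlying such tools is typically a hypergeometric (one-sided Fisher) test: given that k of the n submitted genes fall in a category containing K of the N annotated genes, how surprising is the overlap? A minimal sketch with invented counts:

      from scipy.stats import hypergeom

      # Invented example: 242 input genes drawn from ~20000 annotated genes;
      # 12 of them fall in a pathway category containing 90 genes.
      N, K, n, k = 20000, 90, 242, 12

      # P(overlap >= k) under random sampling without replacement.
      p = hypergeom.sf(k - 1, N, K, n)
      print(f"enrichment p-value = {p:.2e}")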

  17. A sequence-based approach to identify reference genes for gene expression analysis

    Directory of Open Access Journals (Sweden)

    Chari Raj

    2010-08-01

    Full Text Available Abstract Background An important consideration when analyzing both microarray and quantitative PCR expression data is the selection of appropriate genes as endogenous controls or reference genes. This step is especially critical when identifying genes differentially expressed between datasets. Moreover, reference genes suitable in one context (e.g., lung cancer) may not be suitable in another (e.g., breast cancer). Currently, the main approach to identifying reference genes involves the mining of expression microarray data for highly expressed and relatively constant transcripts across a sample set. A caveat here is the requirement for transcript normalization prior to analysis, and the measurements obtained are relative, not absolute. Alternatively, as sequencing-based technologies provide digital quantitative output, absolute quantification ensues, and reference gene identification becomes more accurate. Methods Serial analysis of gene expression (SAGE) profiles of non-malignant and malignant lung samples were compared using a permutation test to identify the most stably expressed genes across all samples. Subsequently, the specificity of the reference genes was evaluated across multiple tissue types, their constancy of expression was assessed using quantitative RT-PCR (qPCR), and their impact on differential expression analysis of microarray data was evaluated. Results We show that (i) conventional reference genes such as ACTB and GAPDH are highly variable between cancerous and non-cancerous samples, (ii) reference genes identified for lung cancer do not perform well for other cancer types (breast and brain), (iii) reference genes identified through SAGE show low variability using qPCR in a different cohort of samples, and (iv) normalization of a lung cancer gene expression microarray dataset with or without our reference genes yields different results for differential gene expression and subsequent analyses. Specifically, key established pathways in lung

  18. Large-scale gene-centric meta-analysis across 32 studies identifies multiple lipid loci.

    Science.gov (United States)

    Asselbergs, Folkert W; Guo, Yiran; van Iperen, Erik P A; Sivapalaratnam, Suthesh; Tragante, Vinicius; Lanktree, Matthew B; Lange, Leslie A; Almoguera, Berta; Appelman, Yolande E; Barnard, John; Baumert, Jens; Beitelshees, Amber L; Bhangale, Tushar R; Chen, Yii-Der Ida; Gaunt, Tom R; Gong, Yan; Hopewell, Jemma C; Johnson, Toby; Kleber, Marcus E; Langaee, Taimour Y; Li, Mingyao; Li, Yun R; Liu, Kiang; McDonough, Caitrin W; Meijs, Matthijs F L; Middelberg, Rita P S; Musunuru, Kiran; Nelson, Christopher P; O'Connell, Jeffery R; Padmanabhan, Sandosh; Pankow, James S; Pankratz, Nathan; Rafelt, Suzanne; Rajagopalan, Ramakrishnan; Romaine, Simon P R; Schork, Nicholas J; Shaffer, Jonathan; Shen, Haiqing; Smith, Erin N; Tischfield, Sam E; van der Most, Peter J; van Vliet-Ostaptchouk, Jana V; Verweij, Niek; Volcik, Kelly A; Zhang, Li; Bailey, Kent R; Bailey, Kristian M; Bauer, Florianne; Boer, Jolanda M A; Braund, Peter S; Burt, Amber; Burton, Paul R; Buxbaum, Sarah G; Chen, Wei; Cooper-Dehoff, Rhonda M; Cupples, L Adrienne; deJong, Jonas S; Delles, Christian; Duggan, David; Fornage, Myriam; Furlong, Clement E; Glazer, Nicole; Gums, John G; Hastie, Claire; Holmes, Michael V; Illig, Thomas; Kirkland, Susan A; Kivimaki, Mika; Klein, Ronald; Klein, Barbara E; Kooperberg, Charles; Kottke-Marchant, Kandice; Kumari, Meena; LaCroix, Andrea Z; Mallela, Laya; Murugesan, Gurunathan; Ordovas, Jose; Ouwehand, Willem H; Post, Wendy S; Saxena, Richa; Scharnagl, Hubert; Schreiner, Pamela J; Shah, Tina; Shields, Denis C; Shimbo, Daichi; Srinivasan, Sathanur R; Stolk, Ronald P; Swerdlow, Daniel I; Taylor, Herman A; Topol, Eric J; Toskala, Elina; van Pelt, Joost L; van Setten, Jessica; Yusuf, Salim; Whittaker, John C; Zwinderman, A H; Anand, Sonia S; Balmforth, Anthony J; Berenson, Gerald S; Bezzina, Connie R; Boehm, Bernhard O; Boerwinkle, Eric; Casas, Juan P; Caulfield, Mark J; Clarke, Robert; Connell, John M; Cruickshanks, Karen J; Davidson, Karina W; Day, Ian N M; de Bakker, Paul I W; Doevendans, Pieter A; Dominiczak, Anna F; Hall, Alistair S; Hartman, Catharina A; Hengstenberg, Christian; Hillege, Hans L; Hofker, Marten H; Humphries, Steve E; Jarvik, Gail P; Johnson, Julie A; Kaess, Bernhard M; Kathiresan, Sekar; Koenig, Wolfgang; Lawlor, Debbie A; März, Winfried; Melander, Olle; Mitchell, Braxton D; Montgomery, Grant W; Munroe, Patricia B; Murray, Sarah S; Newhouse, Stephen J; Onland-Moret, N Charlotte; Poulter, Neil; Psaty, Bruce; Redline, Susan; Rich, Stephen S; Rotter, Jerome I; Schunkert, Heribert; Sever, Peter; Shuldiner, Alan R; Silverstein, Roy L; Stanton, Alice; Thorand, Barbara; Trip, Mieke D; Tsai, Michael Y; van der Harst, Pim; van der Schoot, Ellen; van der Schouw, Yvonne T; Verschuren, W M Monique; Watkins, Hugh; Wilde, Arthur A M; Wolffenbuttel, Bruce H R; Whitfield, John B; Hovingh, G Kees; Ballantyne, Christie M; Wijmenga, Cisca; Reilly, Muredach P; Martin, Nicholas G; Wilson, James G; Rader, Daniel J; Samani, Nilesh J; Reiner, Alex P; Hegele, Robert A; Kastelein, John J P; Hingorani, Aroon D; Talmud, Philippa J; Hakonarson, Hakon; Elbers, Clara C; Keating, Brendan J; Drenos, Fotios

    2012-11-01

    Genome-wide association studies (GWASs) have identified many SNPs underlying variations in plasma-lipid levels. We explore whether additional loci associated with plasma-lipid phenotypes, such as high-density lipoprotein cholesterol (HDL-C), low-density lipoprotein cholesterol (LDL-C), total cholesterol (TC), and triglycerides (TGs), can be identified by a dense gene-centric approach. Our meta-analysis of 32 studies in 66,240 individuals of European ancestry was based on the custom ∼50,000 SNP genotyping array (the ITMAT-Broad-CARe array) covering ∼2,000 candidate genes. SNP-lipid associations were replicated either in a cohort comprising an additional 24,736 samples or within the Global Lipid Genetic Consortium. We identified four, six, ten, and four unreported SNPs in established lipid genes for HDL-C, LDL-C, TC, and TGs, respectively. We also identified several lipid-related SNPs in previously unreported genes: DGAT2, HCAR2, GPIHBP1, PPARG, and FTO for HDL-C; SOCS3, APOH, SPTY2D1, BRCA2, and VLDLR for LDL-C; SOCS3, UGT1A1, BRCA2, UBE3B, FCGR2A, CHUK, and INSIG2 for TC; and SERPINF2, C4B, GCK, GATA4, INSR, and LPAL2 for TGs. The proportion of explained phenotypic variance in the subset of studies providing individual-level data was 9.9% for HDL-C, 9.5% for LDL-C, 10.3% for TC, and 8.0% for TGs. This large meta-analysis of lipid phenotypes with the use of a dense gene-centric approach identified multiple SNPs not previously described in established lipid genes and several previously unknown loci. The explained phenotypic variance from this approach was comparable to that from a meta-analysis of GWAS data, suggesting that a focused genotyping approach can further increase the understanding of heritability of plasma lipids. PMID:23063622
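
    For orientation, the combination step in this kind of study is typically an inverse-variance-weighted fixed-effects meta-analysis applied per SNP. A minimal sketch follows; the per-study effect sizes and standard errors are invented.

      import math

      def fixed_effects_meta(betas, ses):
          """Inverse-variance-weighted fixed-effects meta-analysis for one SNP."""
          weights = [1.0 / se ** 2 for se in ses]
          beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
          se = math.sqrt(1.0 / sum(weights))
          z = beta / se
          p = math.erfc(abs(z) / math.sqrt(2.0))   # two-sided normal p-value
          return beta, se, p

      # Invented per-study estimates of one SNP's effect on HDL-C.
      print(fixed_effects_meta([0.04, 0.06, 0.03], [0.020, 0.030, 0.015]))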

  19. Analysis of a proposed Compton backscatter imaging technique

    Science.gov (United States)

    Hall, James M.; Jacoby, Barry A.

    1994-03-01

    One-sided imaging techniques are currently being used in nondestructive evaluation of surfaces and shallow subsurface structures. In this work we present both analytical calculations and detailed Monte Carlo simulations aimed at assessing the capability of a proposed Compton backscattering imaging technique designed to detect and characterize voids located several centimeters below the surface of a solid.

  20. A Technique for the Analysis of Auto Exhaust.

    Science.gov (United States)

    Sothern, Ray D.; And Others

    Developed for presentation at the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971, this outline explains a technique for separating the complex mixture of hydrocarbons contained in automotive exhausts. A Golay column and subambient temperature programming technique are…

  1. Cluster analysis for identifying sub-groups and selecting potential discriminatory variables in human encephalitis

    Directory of Open Access Journals (Sweden)

    Crowcroft Natasha S

    2010-12-01

    Full Text Available Abstract Background Encephalitis is an acute clinical syndrome of the central nervous system (CNS), often associated with fatal outcome or permanent damage, including cognitive and behavioural impairment, affective disorders and epileptic seizures. Infection of the central nervous system is considered to be a major cause of encephalitis, and more than 100 different pathogens have been recognized as causative agents. However, a large proportion of cases have unknown disease etiology. Methods We perform hierarchical cluster analysis on a multicenter England encephalitis data set with the aim of identifying sub-groups in human encephalitis. We use the simple matching similarity measure, which is appropriate for binary data sets, and performed variable selection using cluster heatmaps. We also use heatmaps to visually assess underlying patterns in the data, identify the main clinical and laboratory features and identify potential risk factors associated with encephalitis. Results Our results identified fever, personality and behavioural change, headache and lethargy as the main characteristics of encephalitis. Diagnostic variables such as brain scan and measurements from cerebrospinal fluids are also identified as main indicators of encephalitis. Our analysis revealed six major clusters in the England encephalitis data set. However, marked within-cluster heterogeneity is observed in some of the big clusters, indicating possible sub-groups. Overall, the results show that patients are clustered according to symptom and diagnostic variables rather than causal agents. Exposure variables such as recent infection, sick person contact and animal contact have been identified as potential risk factors. Conclusions It is in general assumed and is a common practice to group encephalitis cases according to disease etiology. However, our results indicate that patients are clustered with respect to mainly symptom and diagnostic variables rather than causal agents
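
    For binary symptom and diagnostic flags, the simple matching similarity is the fraction of variables on which two patients agree, and hierarchical clustering runs on the complementary distance. A minimal sketch with SciPy on invented 0/1 data (for binary vectors the Hamming distance equals one minus the simple matching similarity):

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(1)
      X = rng.integers(0, 2, size=(12, 6))    # 12 patients x 6 binary variables

      d = pdist(X, metric="hamming")          # 1 - simple matching similarity
      Z = linkage(d, method="average")        # agglomerative clustering
      print(fcluster(Z, t=3, criterion="maxclust"))   # cut into 3 clusters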

  2. Use of Antibiotic Resistance Analysis To Identify Nonpoint Sources of Fecal Pollution

    OpenAIRE

    Wiggins, B A; Andrews, R. W.; Conway, R. A.; Corr, C. L.; Dobratz, E. J.; Dougherty, D. P.; Eppard, J. R.; Knupp, S. R.; Limjoco, M. C.; Mettenburg, J. M.; Rinehardt, J. M.; Sonsino, J.; Torrijos, R. L.; Zimmerman, M.E.

    1999-01-01

    A study was conducted to determine the reliability and repeatability of antibiotic resistance analysis as a method of identifying the sources of fecal pollution in surface water and groundwater. Four large sets of isolates of fecal streptococci (from 2,635 to 5,990 isolates per set) were obtained from 236 samples of human sewage and septage, cattle and poultry feces, and pristine waters. The patterns of resistance of the isolates to each of four concentrations of up to nine antibiotics were a...

  3. Differentially expressed genes in pancreatic ductal adenocarcinomas identified through serial analysis of gene expression

    DEFF Research Database (Denmark)

    Hustinx, Steven R; Cao, Dengfeng; Maitra, Anirban;

    2004-01-01

    Serial analysis of gene expression (SAGE) is a powerful tool for the discovery of novel tumor markers. The publicly available online SAGE libraries of normal and neoplastic tissues (http://www.ncbi.nlm.nih.gov/SAGE/) have recently been expanded; in addition, a more complete annotation of the human...... of this program. Novel differentially expressed genes in a cancer type can be identified by revisiting updated and expanded SAGE databases. TAGmapper should prove to be a powerful tool for the discovery of novel tumor markers through assignment of uncharacterized SAGE tags....

  4. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    International Nuclear Information System (INIS)

    In this study, the principle of prompt gamma neutron activation analysis has been used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source, Cf-252, with an HPGe detector and a multichannel analyser (MCA). Concrete samples with sizes of 10×10×10 cm3 and 15×15×15 cm3 were analysed. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study demonstrate that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, can be determined by analysing the respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques as a part of reference and validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis to identify the elements present in the concrete sample are discussed
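
    The qualitative identification step amounts to matching observed prompt gamma-ray energies against a library of characteristic lines within the detector resolution. A toy sketch follows; the line energies are illustrative placeholders, not vetted nuclear data.

      # Illustrative prompt gamma-ray lines in keV; not a reference library.
      LIBRARY = {"H": [2223.2], "Si": [3539.0, 4934.0],
                 "Ca": [1942.7, 6419.6], "Fe": [7631.1, 7645.5]}

      def identify(peaks_kev, tol=3.0):
          """Return elements whose library lines match observed peaks."""
          return {el: [p for p in peaks_kev
                       for line in lines if abs(p - line) <= tol]
                  for el, lines in LIBRARY.items()
                  if any(abs(p - line) <= tol
                         for p in peaks_kev for line in lines)}

      print(identify([2224.0, 3540.5, 7630.0]))   # -> H, Si, Fe candidates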

  5. Model-size reduction technique for the analysis of symmetric anisotropic structures

    Science.gov (United States)

    Noor, A. K.; Peters, J. M.

    1985-01-01

    A two-step computational procedure is presented for reducing the size of the analysis model for an anisotropic symmetric structure to that of the corresponding orthotropic structure. The key elements of the procedure are: (1) decomposition of the stiffness matrix into the sum of an orthotropic and a nonorthotropic (anisotropic) part; and (2) successive application of the finite element method and the classical Rayleigh-Ritz technique. The finite element method is first used to generate a few global approximation vectors (or modes). Then the amplitudes of these modes are computed by using the Rayleigh-Ritz technique. The global approximation vectors are selected to be the solution corresponding to a zero nonorthotropic matrix and its various-order derivatives with respect to an anisotropic tracing parameter (identifying the nonorthotropic material coefficients). The size of the analysis model used in generating the global approximation vectors is identical to that of the corresponding orthotropic structure. The effectiveness of the proposed technique is demonstrated by means of numerical examples, and its potential for solving other quasi-symmetric problems is discussed.
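
    The flavor of the two-step reduction can be conveyed on a generic parameterized system K(lam)u = f with K = K0 + lam*K1: solve the lam = 0 problem, generate its parameter derivatives as the global approximation basis, and solve a small Rayleigh-Ritz system. The sketch below uses illustrative random matrices, not a structural model:

      import numpy as np

      def reduced_solve(K0, K1, f, lam, n_modes=3):
          """Rayleigh-Ritz solution of (K0 + lam*K1) u = f with a basis built
          from the lam = 0 solution and its parameter derivatives (up to scale)."""
          basis = [np.linalg.solve(K0, f)]
          for _ in range(1, n_modes):
              basis.append(np.linalg.solve(K0, -K1 @ basis[-1]))
          Q, _ = np.linalg.qr(np.array(basis).T)        # orthonormalize the basis
          Kr = Q.T @ (K0 + lam * K1) @ Q                # small reduced system
          return Q @ np.linalg.solve(Kr, Q.T @ f)

      rng = np.random.default_rng(2)
      A = rng.standard_normal((40, 40))
      K0 = A @ A.T + 40 * np.eye(40)                    # "orthotropic" part (SPD)
      B = rng.standard_normal((40, 40))
      K1 = 0.1 * (B + B.T)                              # "anisotropic" perturbation
      f = rng.standard_normal(40)
      u = reduced_solve(K0, K1, f, lam=1.0)
      print(np.linalg.norm(u - np.linalg.solve(K0 + K1, f)))   # small error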

  6. Noise Reduction Analysis of Radar Rainfall Using Chaotic Dynamics and Filtering Techniques

    Directory of Open Access Journals (Sweden)

    Soojun Kim

    2014-01-01

    Full Text Available The aim of this study is to evaluate filtering techniques that can remove noise from time series. For this, the Logistic series, which is a chaotic series, and radar rainfall series are used to evaluate a low-pass filter (LF) and a Kalman filter (KF). Noise is added to the Logistic series at several noise levels, and the noise-added series is filtered by LF and KF for noise reduction. The evaluation of the LF and KF techniques is performed using the correlation coefficient, the standard error, the attractor, and the BDS statistic from chaos theory. The analysis results for the Logistic series clearly showed that KF is a better tool than LF for removing noise. We also used the radar rainfall series to evaluate the noise reduction capabilities of LF and KF. In this case, it was difficult to distinguish which filtering technique was better for noise reduction when typical statistics such as the correlation coefficient and standard error were used. However, when the attractor and the BDS statistic were used to evaluate LF and KF, we could clearly identify that KF is better than LF.
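
    A minimal version of the experiment (add observation noise to a chaotic logistic series, then compare a moving-average low-pass filter with a scalar Kalman filter) might look as follows; all tuning values are illustrative.

      import numpy as np

      n = 500
      x = np.empty(n); x[0] = 0.4
      for t in range(n - 1):                     # chaotic logistic map
          x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
      rng = np.random.default_rng(3)
      y = x + rng.normal(0.0, 0.05, n)           # noise-added observations

      lf = np.convolve(y, np.ones(5) / 5, mode="same")   # low-pass filter

      q, r = 1e-3, 0.05 ** 2                     # process / measurement variance
      xhat, p, kf = y[0], 1.0, np.empty(n)       # scalar random-walk Kalman filter
      for t in range(n):
          p += q                                 # predict
          g = p / (p + r)                        # Kalman gain
          xhat += g * (y[t] - xhat)              # update with observation
          p *= (1.0 - g)
          kf[t] = xhat

      print("corr LF:", np.corrcoef(x, lf)[0, 1])
      print("corr KF:", np.corrcoef(x, kf)[0, 1])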

  7. Analysis of the Quintilii’s Villa Bronzes by Spectroscopy Techniques

    Directory of Open Access Journals (Sweden)

    Fabio Stranges

    2014-01-01

    Full Text Available The aim of this work is the characterization, with different diagnostic tests, of three fragments of bronze artefacts recovered from the Villa of the Quintilii (located in the south of Rome). In particular, the sample alloys were investigated by different chemical and morphological analyses. Firstly, an analysis of the alloy, implemented through electron spectroscopy, was carried out to determine the bronze morphology and its elemental composition. Subsequently, a surface analysis was performed by molecular spectroscopy to identify the alteration patinas on the surfaces (such as bronze disease). Two diagnostic techniques were used for the alloy analysis: scanning electron microscopy (SEM) coupled with EDX spectroscopy (to study the morphology and alloy composition) and Auger electron spectroscopy (AES) (to identify the oxidation state of each element). Moreover, for the study of the surface patinas, IR and Raman spectroscopies were employed. All studies were performed on the "as received" samples, covered by a thin layer of excavated soil, and on samples processed in an aqueous solution of sulphuric acid (10%) to remove patinas and alterations.

  8. Identifying Chemistry Prospective Teachers' Difficulties Encountered in Practice of The Subject Area Textbook Analysis Course

    Directory of Open Access Journals (Sweden)

    Zeynep Bak Kibar

    2010-12-01

    Full Text Available Prospective teachers should be aware of possible mistakes in textbooks and have knowledge of textbook selection procedures and criteria. This knowledge is imparted to prospective teachers in the Subject Area Textbook Analysis Course. Identifying the difficulties they encounter and the skills they gain is important for implementing this course effectively. To investigate these questions, a case study was conducted with 38 student teachers from the Department of Secondary Science and Mathematics Education Chemistry Teaching Program at the Karadeniz Technical University Fatih Faculty of Education. Results suggest that prospective teachers gained knowledge of research, teaching practice, report writing, and textbook analysis. It was also determined that they had difficulties with group work, literature review, report writing, textbook analysis, and critical analysis.

  9. Towards a typology of business process management professionals: identifying patterns of competences through latent semantic analysis

    Science.gov (United States)

    Müller, Oliver; Schmiedel, Theresa; Gorbacheva, Elena; vom Brocke, Jan

    2016-01-01

    While researchers have analysed the organisational competences that are required for successful Business Process Management (BPM) initiatives, individual BPM competences have not yet been studied in detail. In this study, latent semantic analysis is used to examine a collection of 1507 BPM-related job advertisements in order to develop a typology of BPM professionals. This empirical analysis reveals distinct ideal types and profiles of BPM professionals on several levels of abstraction. A closer look at these ideal types and profiles confirms that BPM is a boundary-spanning field that requires interdisciplinary sets of competence that range from technical competences to business and systems competences. Based on the study's findings, it is posited that individual and organisational alignment with the identified ideal types and profiles is likely to result in high employability and organisational BPM success.
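
    In outline, such an analysis vectorizes the job-advertisement text, applies a truncated SVD (the core of latent semantic analysis), and groups the resulting document vectors. A compact sketch with scikit-learn on invented adverts:

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.cluster import KMeans

      ads = [  # invented placeholder adverts; a real study would use thousands
          "process modelling BPMN workflow automation",
          "stakeholder management change communication",
          "ERP integration systems analysis SQL",
          "lean six sigma process improvement",
          "requirements workshops business analysis",
          "java developer process engine APIs",
      ]

      tfidf = TfidfVectorizer().fit_transform(ads)
      lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(lsa)
      print(labels)     # one candidate "type" per advert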

  10. Whole Genome Analysis of Injectional Anthrax Identifies Two Disease Clusters Spanning More Than 13 Years

    Directory of Open Access Journals (Sweden)

    Paul Keim

    2015-11-01

    Lay Person Interpretation: Injectional anthrax has been plaguing heroin drug users across Europe for more than 10 years. In order to better understand this outbreak, we assessed genomic relationships of all available injectional anthrax strains from four countries spanning a >12 year period. Very few differences were identified using genome-based analysis, but these differentiated the isolates into two distinct clusters. This strongly supports a hypothesis of at least two separate anthrax spore contamination events perhaps during the drug production processes. Identification of two events would not have been possible from standard epidemiological analysis. These comprehensive data will be invaluable for classifying future injectional anthrax isolates and for future geographic attribution.

  11. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N;

    2013-01-01

    fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  12. Independent component analysis of high-resolution imaging data identifies distinct functional domains

    DEFF Research Database (Denmark)

    Reidl, Juergen; Starke, Jens; Omer, David;

    2007-01-01

    . Here we demonstrate that principal component analysis (PCA) followed by spatial independent component analysis (sICA) can be exploited to reduce the dimensionality of data sets recorded in the olfactory bulb and the somatosensory cortex of mice as well as the visual cortex of monkeys, without losing...... the stimulus specific responses. Different neuronal populations are separated based on their stimulus specific time courses of activation. Both, spatial and temporal response characteristics can be objectively obtained, simultaneously. In the olfactory bulb, groups of glomeruli with different response...... latencies can be identified. This is shown for recordings of olfactory receptor neuron input measured with a calcium sensitive axon tracer and for network dynamics measured with the voltage sensitive dye RH 1838. In the somatosensory cortex, barrels responding to the stimulation of single whiskers can
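
    In outline, the pipeline reduces a pixels-by-frames recording with PCA and then unmixes the retained components into spatially independent maps. A generic sketch with scikit-learn on synthetic data (not the authors' pipeline):

      import numpy as np
      from sklearn.decomposition import PCA, FastICA

      rng = np.random.default_rng(4)
      maps = (rng.random((2, 400)) < 0.1).astype(float)  # two sparse domains
      tcs = rng.standard_normal((200, 2))                # their time courses
      movie = tcs @ maps + 0.1 * rng.standard_normal((200, 400))

      pcs = PCA(n_components=5).fit_transform(movie.T)   # pixels x components
      smaps = FastICA(n_components=2, random_state=0).fit_transform(pcs)
      print(smaps.shape)                                 # (pixels, sources)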

  13. Hot spot analysis applied to identify ecosystem services potential in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Depellegrin, Daniel; Misiune, Ieva

    2016-04-01

    Hot spot analysis is very useful for identifying areas with similar characteristics. This is important for a sustainable use of the territory, since we can identify areas that need to be protected or restored. This is a great advantage in terms of land use planning and management, since we can allocate resources, reduce economic costs and carry out better interventions in the landscape. Ecosystem services (ES) differ according to land use. Since the landscape is very heterogeneous, it is of major importance to understand their spatial pattern and where the areas that provide better ES, and those that provide fewer services, are located. The objective of this work is to use hot-spot analysis to identify areas with the most valuable ES in Lithuania. CORINE land-cover (CLC) of 2006 was used as the main spatial information. This classification uses a grid of 100 m resolution, from which a total of 31 land use types were extracted. ES ranking was carried out based on expert knowledge: experts were asked to evaluate the ES potential of each CLC class from 0 (no potential) to 5 (very high potential). Hot spots were evaluated using the Getis-Ord test, a cluster-identification tool available in the ArcGIS toolbox. This tool identifies areas with significantly low values and significantly high values at a p level of 0.05. In this work we used hot spot analysis to assess the distribution of providing, regulating, cultural and total (sum of the previous 3) ES. The Z value calculated from Getis-Ord was used in statistical analysis to assess the clusters of providing, regulating, cultural and total ES. ES with a high Z value have a high number of cluster areas with high ES potential. The results showed that the Z-score was significantly different among services (Kruskal-Wallis ANOVA = 834.607, p<0.001). The Z scores of providing services (0.096±2.239) were significantly higher than the total (0.093±2.045), cultural (0.080±1.979) and regulating (0.076±1.961). These
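
    The Getis-Ord Gi* statistic behind this test compares the locally weighted sum of values around each cell with its expectation under spatial randomness. A from-scratch sketch on a toy strip of cells (illustrative; not the ArcGIS implementation):

      import numpy as np

      def gi_star(values, W):
          """Gi* z-scores; W is a binary neighbourhood matrix including self."""
          n, xbar, s = values.size, values.mean(), values.std()
          wsum = W.sum(axis=1)
          num = W @ values - xbar * wsum
          den = s * np.sqrt((n * (W ** 2).sum(axis=1) - wsum ** 2) / (n - 1))
          return num / den

      vals = np.array([1, 1, 2, 5, 5, 5, 2, 1, 1, 1], dtype=float)
      W = np.eye(10)                        # each cell neighbours itself...
      for i in range(9):                    # ...plus the adjacent cells
          W[i, i + 1] = W[i + 1, i] = 1.0
      print(np.round(gi_star(vals, W), 2))  # |z| > 1.96 ~ hot/cold at p < 0.05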

  14. A COMPARISON OF STEPWISE AND FUZZY MULTIPLE REGRESSION ANALYSIS TECHNIQUES FOR MANAGING SOFTWARE PROJECT RISKS: ANALYSIS PHASE

    Directory of Open Access Journals (Sweden)

    Abdelrafe Elzamly

    2014-01-01

    Full Text Available Risk is not always avoidable, but it is controllable. The aim of this study is to identify whether these techniques are effective in reducing software failure. This motivates the authors to continue the effort to enrich software project risk management with a mining and quantitative approach using a large data set. In this study, two new techniques are introduced, namely stepwise multiple regression analysis and fuzzy multiple regression, to manage software risks. Two evaluation procedures, MMRE and Pred(25), are used to compare the accuracy of the techniques. The model's accuracy improves slightly with stepwise multiple regression rather than fuzzy multiple regression. This study will guide software managers in applying software risk management practices in real-world software development organizations and verifying the effectiveness of the new techniques and approaches on a software project. The study has been conducted on a group of software projects using a survey questionnaire. It is hoped that this will enable software managers to improve their decisions and increase the probability of software project success.
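
    The two evaluation criteria are simple to state: MMRE is the mean magnitude of relative error of the predictions, and Pred(25) is the proportion of predictions that fall within 25% of the actual value. A small sketch with invented numbers:

      def mmre(actual, predicted):
          return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

      def pred25(actual, predicted, level=0.25):
          hits = sum(abs(a - p) / a <= level for a, p in zip(actual, predicted))
          return hits / len(actual)

      actual   = [10.0, 12.0, 8.0, 15.0, 9.0]      # invented risk scores
      stepwise = [ 9.5, 13.0, 8.4, 14.2, 10.1]
      fuzzy    = [ 8.0, 14.5, 9.6, 12.0, 11.0]
      for name, est in [("stepwise", stepwise), ("fuzzy", fuzzy)]:
          print(name, round(mmre(actual, est), 3), round(pred25(actual, est), 2))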

  15. Identifying past fire regimes throughout the Holocene in Ireland using new and established methods of charcoal analysis

    Science.gov (United States)

    Hawthorne, Donna; Mitchell, Fraser J. G.

    2016-04-01

    Globally, in recent years there has been an increase in the scale, intensity and level of destruction caused by wildfires. This can be seen in Ireland, where significant changes in vegetation, land use, agriculture and policy have promoted an increase in fires in the Irish landscape. This study looks at wildfire throughout the Holocene and draws on lacustrine charcoal records from seven study sites spread across Ireland to reconstruct the past fire regimes recorded at each site. This work utilises new and accepted methods of fire history reconstruction to provide a recommended analytical procedure for statistical charcoal analysis. Digital charcoal counting was used and fire regime reconstructions were carried out via the CharAnalysis programme. To verify this record, new techniques are employed: an Ensemble-Member strategy to remove the subjectivity associated with parameter selection, a Signal to Noise Index to determine if the charcoal record is appropriate for peak detection, and a charcoal peak screening procedure to validate the identified fire events based on bootstrapped samples. This analysis represents the first study of its kind in Ireland, examining the past record of fire on a multi-site and paleoecological timescale, and will provide a baseline level of data which can be built on in the future, when the frequency and intensity of fire are predicted to increase.
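
    In simplified form, peak detection of this kind separates a slowly varying charcoal background from short-lived peaks and screens the residuals against a noise threshold. A generic sketch, with a moving-median background and a percentile cut-off standing in for the full CharAnalysis procedure:

      import numpy as np

      def detect_fire_peaks(char_acc, window=11, pctl=95):
          """Flag candidate fire events in an evenly interpolated charcoal series."""
          pad = window // 2
          padded = np.pad(char_acc, pad, mode="edge")
          background = np.array([np.median(padded[i:i + window])
                                 for i in range(char_acc.size)])
          residual = char_acc - background            # peak component
          threshold = np.percentile(residual, pctl)   # noise cut-off
          return np.flatnonzero(residual > threshold)

      rng = np.random.default_rng(9)
      series = rng.gamma(2.0, 1.0, 300)
      series[[50, 140, 260]] += 15.0                  # synthetic fire events
      print(detect_fire_peaks(series))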

  16. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
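
    In outline, the technique embeds the inter-observatory dissimilarities of storm-time H variations into a low-dimensional configuration. A generic sketch with scikit-learn; the storm profiles below are invented random walks.

      import numpy as np
      from sklearn.manifold import MDS

      rng = np.random.default_rng(10)
      profiles = np.cumsum(rng.standard_normal((6, 48)), axis=1)  # 6 stations

      dist = np.linalg.norm(profiles[:, None, :] - profiles[None, :, :], axis=2)
      coords = MDS(n_components=2, dissimilarity="precomputed",
                   random_state=0).fit_transform(dist)
      print(np.round(coords, 2))   # 2-D configuration of the observatories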

  17. Friction force microscopy: a simple technique for identifying graphene on rough substrates and mapping the orientation of graphene grains on copper

    Science.gov (United States)

    Marsden, A. J.; Phillips, M.; Wilson, N. R.

    2013-06-01

    At only a single atom thick, graphene is challenging to distinguish from its substrate using conventional techniques. In this paper we show that friction force microscopy (FFM) is a simple and quick technique for identifying graphene on a range of samples, from growth substrates to rough insulators. We show that FFM is particularly effective for characterizing graphene grown on copper, where it can correlate the graphene growth to the three-dimensional surface topography. Atomic lattice stick-slip friction is readily resolved and enables the crystallographic orientation of the graphene to be mapped nondestructively, reproducibly and at high resolution. We expect FFM to be similarly effective for studying graphene growth on other metal/locally crystalline substrates, including SiC, and for studying the growth of other two-dimensional materials such as molybdenum disulfide and hexagonal boron nitride.

  18. SRAM standby leakage decoupling analysis for different leakage reduction techniques

    Institute of Scientific and Technical Information of China (English)

    Dong Qing; Lin Yinyin

    2013-01-01

    SRAM standby leakage reduction plays a pivotal role in minimizing the power consumption of application processors. Generally, four kinds of techniques are often utilized for SRAM standby leakage reduction: Vdd lowering (VDDL), Vss rising (VSSR), BL floating (BLF) and reverse body bias (RBB). In this paper, we comprehensively analyze and compare the reduction effects of these techniques on different kinds of leakage. It is disclosed that the performance of these techniques depends on the leakage composition of the SRAM cell and on temperature. This has been verified on a 65 nm SRAM test macro.

  19. Surveillance of the nuclear instrumentation by a noise analysis technique

    International Nuclear Information System (INIS)

    The nuclear sensors used in the protection channels of a nuclear reactor have to be tested periodically. A method has been developed to estimate the state of this kind of sensor. The method proposed applies to boron ionization chambers. The principle of this technique is based on the calculation of a specific parameter named a "descriptor", using a simple signal processing technique. A modification of this parameter indicates a degradation of the static and dynamic performances of the sensor. Different applications of the technique in a nuclear power plant are given

  20. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  1. Identifying Talent in Youth Sport: A Novel Methodology Using Higher-Dimensional Analysis.

    Directory of Open Access Journals (Sweden)

    Kevin Till

    Full Text Available Prediction of adult performance from early age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players was collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p<0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, this suggests that SVD analysis may be a useful analysis tool for research and practice within talent identification.

  2. Identifying Talent in Youth Sport: A Novel Methodology Using Higher-Dimensional Analysis.

    Science.gov (United States)

    Till, Kevin; Jones, Ben L; Cobley, Stephen; Morley, David; O'Hara, John; Chapman, Chris; Cooke, Carlton; Beggs, Clive B

    2016-01-01

    Prediction of adult performance from early age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players was collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p<0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, this suggests that SVD analysis may be a useful analysis tool for research and practice within talent identification. PMID:27224653
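
    In outline, the model projects the standardized test battery onto its leading singular vectors and then selects a cut-off on the projection score by ROC analysis. A generic sketch on synthetic data (not the authors' code):

      import numpy as np
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(5)
      X = rng.standard_normal((100, 8))          # 100 players x 8 measures
      y = (rng.random(100) < 0.2).astype(int)    # 1 = future professional
      X[y == 1, :2] += 1.0                       # e.g. sprint/agility advantage

      Xc = X - X.mean(axis=0)                    # centre, then orthogonalize
      U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
      score = Xc @ Vt[0]
      if score[y == 1].mean() < score[y == 0].mean():
          score = -score                         # orient the discriminant axis

      fpr, tpr, thr = roc_curve(y, score)
      best = np.argmax(tpr - fpr)                # Youden's J statistic
      print("sens:", round(tpr[best], 2), "spec:", round(1 - fpr[best], 2))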

  3. Identifying the oil price-macroeconomy relationship: An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate-WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses.

  4. Identifying the oil price-macroeconomy relationship. An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate - WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses. (author)
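
    A compact version of the decomposition-and-correlation step can be written with the PyEMD package, an open-source EMD implementation; the series below are synthetic stand-ins for the oil-price and GDP cycles.

      import numpy as np
      from PyEMD import EMD    # pip install EMD-signal

      rng = np.random.default_rng(6)
      t = np.linspace(0, 40, 160)                 # "quarterly" time axis
      oil = (np.sin(0.8 * t) + 0.5 * np.sin(0.15 * t)
             + 0.2 * rng.standard_normal(t.size))
      gdp = -0.6 * np.sin(0.8 * (t - 0.5)) + 0.1 * rng.standard_normal(t.size)

      imfs = EMD().emd(oil)                       # intrinsic mode functions
      for i, imf in enumerate(imfs):              # locate the medium-run link
          print(f"IMF {i}: corr with GDP = {np.corrcoef(imf, gdp)[0, 1]:+.2f}")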

  5. Genome-wide association scan meta-analysis identifies three Loci influencing adiposity and fat distribution.

    Directory of Open Access Journals (Sweden)

    Cecilia M Lindgren

    2009-06-01

    Full Text Available To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for overall adiposity or height. Follow-up studies in a maximum of 70,689 individuals identified two loci strongly associated with measures of central adiposity; these map near TFAP2B (WC, P = 1.9×10−11) and MSRA (WC, P = 8.9×10−9). A third locus, near LYPLAL1, was associated with WHR in women only (P = 2.6×10−8). The variants near TFAP2B appear to influence central adiposity through an effect on overall obesity/fat-mass, whereas LYPLAL1 displays a strong female-only association with fat distribution. By focusing on anthropometric measures of central obesity and fat distribution, we have identified three loci implicated in the regulation of human adiposity.

  6. Computational EST database analysis identifies a novel member of the neuropoietic cytokine family.

    Science.gov (United States)

    Shi, Y; Wang, W; Yourey, P A; Gohari, S; Zukauskas, D; Zhang, J; Ruben, S; Alderson, R F

    1999-08-19

    A novel member of the neuropoietic cytokine family has been cloned and the protein expressed and characterized. In an effort to identify novel secreted proteins, an algorithm incorporating neural networks was applied to a large EST database. A full-length clone was identified that is 1710 bp in length and has a single open reading frame of 225 amino acids. This new cytokine is most homologous to cardiotrophin-1, having a similarity and an identity of 46 and 29%, respectively, and therefore we have named it cardiotrophin-like cytokine (CLC). Northern hybridization analysis identified a 1.4-kb messenger RNA that is highly expressed in spleen and peripheral leukocytes. Purified recombinant CLC induced the activation of NFkappaB and SRE reporter constructs in the TF-1, U937, and M1 cell lines. Furthermore, the signal transduction pathway for CLC was characterized in the neuroblastoma cell line SK-N-MC and found to involve tyrosine phosphorylation of gp130 and STAT-1. PMID:10448081

  7. Genome-wide association study meta-analysis identifies seven new rheumatoid arthritis risk loci

    Science.gov (United States)

    Stahl, Eli A.; Raychaudhuri, Soumya; Remmers, Elaine F.; Xie, Gang; Eyre, Stephen; Thomson, Brian P.; Li, Yonghong; Kurreeman, Fina A. S.; Zhernakova, Alexandra; Hinks, Anne; Guiducci, Candace; Chen, Robert; Alfredsson, Lars; Amos, Christopher I.; Ardlie, Kristin G.; Barton, Anne; Bowes, John; Brouwer, Elisabeth; Burtt, Noel P.; Catanese, Joseph J.; Coblyn, Jonathan; Coenen, Marieke JH; Costenbader, Karen H.; Criswell, Lindsey A.; Crusius, J. Bart A.; Cui, Jing; de Bakker, Paul I.W.; De Jager, Phillip L.; Ding, Bo; Emery, Paul; Flynn, Edward; Harrison, Pille; Hocking, Lynne J.; Huizinga, Tom W. J.; Kastner, Daniel L.; Ke, Xiayi; Lee, Annette T.; Liu, Xiangdong; Martin, Paul; Morgan, Ann W.; Padyukov, Leonid; Posthumus, Marcel D.; Radstake, Timothy RDJ; Reid, David M.; Seielstad, Mark; Seldin, Michael F.; Shadick, Nancy A.; Steer, Sophia; Tak, Paul P.; Thomson, Wendy; van der Helm-van Mil, Annette H. M.; van der Horst-Bruinsma, Irene E.; van der Schoot, C. Ellen; van Riel, Piet LCM; Weinblatt, Michael E.; Wilson, Anthony G.; Wolbink, Gert Jan; Wordsworth, Paul; Wijmenga, Cisca; Karlson, Elizabeth W.; Toes, Rene E. M.; de Vries, Niek; Begovich, Ann B.; Worthington, Jane; Siminovitch, Katherine A.; Gregersen, Peter K.; Klareskog, Lars; Plenge, Robert M.

    2014-01-01

    To identify novel genetic risk factors for rheumatoid arthritis (RA), we conducted a genome-wide association study (GWAS) meta-analysis of 5,539 autoantibody positive RA cases and 20,169 controls of European descent, followed by replication in an independent set of 6,768 RA cases and 8,806 controls. Of 34 SNPs selected for replication, 7 novel RA risk alleles were identified at genome-wide significance (P<5×10−8) in analysis of all 41,282 samples. The associated SNPs are near genes of known immune function, including IL6ST, SPRED2, RBPJ, CCR6, IRF5, and PXK. We also refined the risk alleles at two established RA risk loci (IL2RA and CCL21) and confirmed the association at AFF3. These new associations bring the total number of confirmed RA risk loci to 31 among individuals of European ancestry. An additional 11 SNPs replicated at P<0.05, many of which are validated autoimmune risk alleles, suggesting that most represent bona fide RA risk alleles. PMID:20453842

  8. Using FAME Analysis to Compare, Differentiate, and Identify Multiple Nematode Species

    Science.gov (United States)

    Sekora, Nicholas S.; Agudelo, Paula; van Santen, Edzard; McInroy, John A.

    2009-01-01

    We have adapted the Sherlock® Microbial Identification system for identification of plant parasitic nematodes based on their fatty acid profiles. Fatty acid profiles of 12 separate plant parasitic nematode species have been determined using this system. Additionally, separate profiles have been developed for Rotylenchulus reniformis and Meloidogyne incognita based on their host plant, four species and three races within the Meloidogyne genus, and three life stages of Heterodera glycines. Statistically, 85% of these profiles can be delimited from one another; the specific comparisons between the cyst and vermiform stages of H. glycines, M. hapla and M. arenaria, and M. arenaria and M. javanica cannot be segregated using canonical analysis. By incorporating each of these fatty acid profiles into the Sherlock® Analysis Software, 20 library entries were created. While there was some similarity among profiles, all entries correctly identified the proper organism to genus, species, race, life stage, and host at greater than 86% accuracy. The remaining 14% were correctly identified to genus, although species and race may not be correct due to the underlying variables of host or life stage. These results are promising and indicate that this library could be used for diagnostics labs to increase response time. PMID:22736811
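
    In outline, the canonical analysis used here is a linear discriminant projection of the fatty acid profiles, with classification accuracy indicating how well the groups can be segregated. A generic sketch with scikit-learn on invented profiles:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(8)
      means = rng.random((3, 6)) * 20          # three hypothetical species
      X = np.vstack([rng.normal(m, 1.5, size=(10, 6)) for m in means])
      y = np.repeat([0, 1, 2], 10)             # 10 specimens per species

      lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
      canonical = lda.transform(X)             # canonical axes for plotting
      print("training accuracy:", lda.score(X, y))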

  9. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    Science.gov (United States)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce costs, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/ risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/ risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology

  10. Proteomic analysis identifies interleukin 11 regulated plasma membrane proteins in human endometrial epithelial cells in vitro

    Directory of Open Access Journals (Sweden)

    Stanton Peter G

    2011-05-01

    Full Text Available Abstract Background During the peri-implantation period, the embryo adheres to an adequately prepared or receptive endometrial surface epithelium. Abnormal embryo adhesion to the endometrium results in embryo implantation failure and infertility. Endometrial epithelial cell plasma membrane proteins critical in regulating adhesion may potentially be infertility biomarkers or targets for treating infertility. Interleukin (IL) 11 regulates human endometrial epithelial cell (hEEC) adhesion. Its production is abnormal in women with infertility. The objective of the study was to identify IL11-regulated plasma membrane proteins in hEEC in vitro using a proteomic approach. Methods Using a 2D-differential in-gel electrophoresis (2D-DIGE) approach combined with LC-MS/MS mass spectrometry, we identified 20 unique plasma membrane proteins differentially regulated by IL11 in ECC-1 cells, a hEEC-derived cell line. Two IL11-regulated proteins with known roles in cell adhesion, annexin A2 (ANXA2) and flotillin-1 (FLOT1), were validated by Western blot and immunocytochemistry in hEEC lines (ECC-1 and an additional cell line, Ishikawa) and primary hEEC. Flotillin-1 was further validated by immunohistochemistry in human endometrium throughout the menstrual cycle (n = 6-8/cycle). Results 2D-DIGE analysis identified 4 spots that were significantly different between control and IL11-treated groups. Of these 4 spots, 20 proteins were identified with LC-MS/MS. Two proteins, ANXA2 and FLOT1, were chosen for further analyses and were found to be significantly up-regulated following IL11 treatment. Western blot analysis showed a 2-fold and a 2.5-fold increase of ANXA2 in the hEEC membrane fraction of ECC-1 and Ishikawa cells respectively. Similarly, a 1.8-fold and a 2.3/2.4-fold increase was also observed for FLOT1 in the hEEC membrane fraction of ECC-1 and Ishikawa cells respectively. In vitro, IL11 induced stronger ANXA2 expression on the cell surface of primary h

  11. The use of environmental monitoring as a technique to identify isotopic enrichment activities; O uso da monitoracao ambiental como tecnica de identificacao de atividades de enriquecimento isotopico

    Energy Technology Data Exchange (ETDEWEB)

    Buchmann, Jose Henrique

    2000-07-01

    The use of environmental monitoring as a technique to identify activities related to the nuclear fuel cycle has been proposed, by international organizations, as an additional measure to the safeguards agreements in force. The elements specific to each kind of nuclear activity, or nuclear signatures, inserted in the ecosystem by several transfer paths, can be intercepted with better or worse ability by different living organisms. Depending on the kind of signature of interest, the identification and quantification of anthropogenic material require the choice of adequate biological indicators and, mainly, the use of sophisticated techniques associated with elaborate sample treatments. This work demonstrates the technical viability of using pine needles as bioindicators of nuclear signatures associated with uranium enrichment activities. Additionally, it proposes the use of a technique now widely diffused in the scientific community, high-resolution inductively coupled plasma mass spectrometry (HR-ICP-MS), to identify the signature corresponding to that kind of activity in the ecosystem. A methodology recently adopted in analytical chemistry, based on metrological concepts of uncertainty estimation, is also described and used to calculate the uncertainties associated with the measurement results obtained. Nitric acid solutions with a concentration of 0.3 mol kg⁻¹, used to wash pine needles sampled near facilities that manipulate enriched uranium and containing only 0.1 µg kg⁻¹ of uranium, exhibit a ²³⁵U:²³⁸U isotopic abundance ratio of 0.0092±0.0002, while solutions originating from samples collected at places located more than 200 km away from activities related to the nuclear fuel cycle exhibit a value of 0.0074±0.0002 for this abundance ratio. Similar results obtained for samples collected at different places confirm the presence of anthropogenic uranium and demonstrate the viability of using
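
    The uncertainty calculation alluded to reduces, for an isotope ratio, to first-order propagation of the uncertainties of the two measured intensities. A minimal sketch; the ion-count values are invented.

      import math

      def ratio_with_uncertainty(a, ua, b, ub):
          """235U/238U ratio with propagated standard uncertainty,
          assuming uncorrelated inputs (first-order GUM-style propagation)."""
          r = a / b
          ur = r * math.sqrt((ua / a) ** 2 + (ub / b) ** 2)
          return r, ur

      # Invented ion intensities for the minor and major isotopes.
      print(ratio_with_uncertainty(9.2e3, 1.5e2, 1.0e6, 8.0e3))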

  12. Using EMMA and MIX analysis to assess mixing ratios and to identify hydrochemical reactions in groundwater.

    Science.gov (United States)

    Tubau, Isabel; Vàzquez-Suñé, Enric; Jurado, Anna; Carrera, Jesús

    2014-02-01

    This study presents a methodology using an end-member mixing analysis (EMMA) and MIX to compute mixing ratios and to identify hydrochemical reactions in groundwater. The methodology consists of (1) identifying the potential sources of recharge, (2) characterising recharge sources and mixed water samples using hydrogeochemistry, (3) selecting chemical species to be used in the analysis and (4) calculating mixing ratios and identifying hydrochemical reactions in groundwater. This approach has been applied in the Besòs River Delta area, where we collected 51 groundwater samples; a long record of the hydrogeochemistry of the Besòs River, created by the Catalan Water Agency, is also available. The EMMA performed on the Besòs River suggests that 3 end-members are required to explain its temporal variability, accounting for the species chloride, sulphate, sodium, bicarbonate, calcium, magnesium, potassium, ammonium, total nitrogen, and electrical conductivity. One river end-member is from the wet periods (W1), and two are from dry periods (D1 and D2). These end-members have been used to compute mixing ratios in groundwater samples because the Besòs River is considered the main recharge source for the aquifer. Overall, dry season end-members dominated over the wet season end-member, in a proportion of 4:1. Moreover, when departures from the mixing line exist, geochemical processes might be identified. Redox processes, carbonate dissolution/precipitation and ion exchange processes may occur in the Besòs Delta aquifer. PMID:24246935
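
    Computing mixing ratios from end-members reduces to a constrained least-squares problem: find non-negative proportions, summing to one, that best reproduce a sample's chemistry. A minimal sketch with SciPy (all concentrations invented):

      import numpy as np
      from scipy.optimize import nnls

      # Columns = end-members W1, D1, D2; rows = Cl, SO4, Na concentrations.
      E = np.array([[120.0,  60.0,  80.0],
                    [250.0, 110.0, 150.0],
                    [310.0, 140.0, 190.0]])
      sample = np.array([80.0, 148.0, 190.0])    # one groundwater sample

      w = 100.0                                  # weight of the sum-to-one row
      A = np.vstack([E, w * np.ones(3)])
      b = np.append(sample, w)
      ratios, _ = nnls(A, b)                     # non-negative least squares
      print(np.round(ratios / ratios.sum(), 3))  # proportions of W1, D1, D2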

  13. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and suitability.

  14. Analysis of neutron-reflectometry data by Monte Carlo technique

    CERN Document Server

    Singh, S

    2002-01-01

    Neutron-reflectometry data is collected in momentum space. The real-space information is extracted by fitting a model for the structure of a thin-film sample. We have attempted a Monte Carlo technique to extract the structure of the thin film. In this technique we change the structural parameters of the thin film by simulated annealing based on the Metropolis algorithm. (orig.)
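
    In generic form, the described loop perturbs the film parameters, accepts or rejects them by the Metropolis rule, and slowly lowers a temperature. The one-parameter "reflectivity model" below is only a stand-in for a real thin-film model:

      import math, random

      def model(q, thickness):
          # Toy stand-in for a reflectivity curve: damped oscillation in q.
          return math.exp(-q) * (1 + math.cos(q * thickness)) / 2

      def chi2(params, data):
          return sum((model(q, *params) - r) ** 2 for q, r in data)

      random.seed(7)
      data = [(q / 10, model(q / 10, 3.0) + random.gauss(0, 0.01))
              for q in range(1, 80)]             # synthetic "measurement"

      params = [1.0]                             # initial thickness guess
      cost, temp = chi2(params, data), 1.0
      for _ in range(5000):
          trial = [params[0] + random.gauss(0, 0.1)]
          c = chi2(trial, data)
          # Metropolis rule: accept improvements, sometimes accept worse fits.
          if c < cost or random.random() < math.exp((cost - c) / temp):
              params, cost = trial, c
          temp *= 0.999                          # annealing schedule
      print("fitted thickness:", round(params[0], 2))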

  15. Data Mining Techniques: A Source for Consumer Behavior Analysis

    OpenAIRE

    Abhijit Raorane; R.V. Kulkarni

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, data mining methods have disadvantages as well as advantages. Therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and consumers' psychological condition at the time of purchase, and to determine how suitable data mining methods apply...

  16. Platelet-Related Variants Identified by Exomechip Meta-analysis in 157,293 Individuals.

    Science.gov (United States)

    Eicher, John D; Chami, Nathalie; Kacprowski, Tim; Nomura, Akihiro; Chen, Ming-Huei; Yanek, Lisa R; Tajuddin, Salman M; Schick, Ursula M; Slater, Andrew J; Pankratz, Nathan; Polfus, Linda; Schurmann, Claudia; Giri, Ayush; Brody, Jennifer A; Lange, Leslie A; Manichaikul, Ani; Hill, W David; Pazoki, Raha; Elliot, Paul; Evangelou, Evangelos; Tzoulaki, Ioanna; Gao, He; Vergnaud, Anne-Claire; Mathias, Rasika A; Becker, Diane M; Becker, Lewis C; Burt, Amber; Crosslin, David R; Lyytikäinen, Leo-Pekka; Nikus, Kjell; Hernesniemi, Jussi; Kähönen, Mika; Raitoharju, Emma; Mononen, Nina; Raitakari, Olli T; Lehtimäki, Terho; Cushman, Mary; Zakai, Neil A; Nickerson, Deborah A; Raffield, Laura M; Quarells, Rakale; Willer, Cristen J; Peloso, Gina M; Abecasis, Goncalo R; Liu, Dajiang J; Deloukas, Panos; Samani, Nilesh J; Schunkert, Heribert; Erdmann, Jeanette; Fornage, Myriam; Richard, Melissa; Tardif, Jean-Claude; Rioux, John D; Dube, Marie-Pierre; de Denus, Simon; Lu, Yingchang; Bottinger, Erwin P; Loos, Ruth J F; Smith, Albert Vernon; Harris, Tamara B; Launer, Lenore J; Gudnason, Vilmundur; Velez Edwards, Digna R; Torstenson, Eric S; Liu, Yongmei; Tracy, Russell P; Rotter, Jerome I; Rich, Stephen S; Highland, Heather M; Boerwinkle, Eric; Li, Jin; Lange, Ethan; Wilson, James G; Mihailov, Evelin; Mägi, Reedik; Hirschhorn, Joel; Metspalu, Andres; Esko, Tõnu; Vacchi-Suzzi, Caterina; Nalls, Mike A; Zonderman, Alan B; Evans, Michele K; Engström, Gunnar; Orho-Melander, Marju; Melander, Olle; O'Donoghue, Michelle L; Waterworth, Dawn M; Wallentin, Lars; White, Harvey D; Floyd, James S; Bartz, Traci M; Rice, Kenneth M; Psaty, Bruce M; Starr, J M; Liewald, David C M; Hayward, Caroline; Deary, Ian J; Greinacher, Andreas; Völker, Uwe; Thiele, Thomas; Völzke, Henry; van Rooij, Frank J A; Uitterlinden, André G; Franco, Oscar H; Dehghan, Abbas; Edwards, Todd L; Ganesh, Santhi K; Kathiresan, Sekar; Faraday, Nauder; Auer, Paul L; Reiner, Alex P; Lettre, Guillaume; Johnson, Andrew D

    2016-07-01

    Platelet production, maintenance, and clearance are tightly controlled processes indicative of platelets' important roles in hemostasis and thrombosis. Platelets are common targets for primary and secondary prevention of several conditions. They are monitored clinically by complete blood counts, specifically with measurements of platelet count (PLT) and mean platelet volume (MPV). Identifying genetic effects on PLT and MPV can provide mechanistic insights into platelet biology and their role in disease. Therefore, we formed the Blood Cell Consortium (BCX) to perform a large-scale meta-analysis of Exomechip association results for PLT and MPV in 157,293 and 57,617 individuals, respectively. Using the low-frequency/rare coding variant-enriched Exomechip genotyping array, we sought to identify genetic variants associated with PLT and MPV. In addition to confirming 47 known PLT and 20 known MPV associations, we identified 32 PLT and 18 MPV associations not previously observed in the literature across the allele frequency spectrum, including rare large effect (FCER1A), low-frequency (IQGAP2, MAP1A, LY75), and common (ZMIZ2, SMG6, PEAR1, ARFGAP3/PACSIN2) variants. Several variants associated with PLT/MPV (PEAR1, MRVI1, PTGES3) were also associated with platelet reactivity. In concurrent BCX analyses, there was overlap of platelet-associated variants with red (MAP1A, TMPRSS6, ZMIZ2) and white (PEAR1, ZMIZ2, LY75) blood cell traits, suggesting common regulatory pathways with shared genetic architecture among these hematopoietic lineages. Our large-scale Exomechip analyses identified previously undocumented associations with platelet traits and further indicate that several complex quantitative hematological, lipid, and cardiovascular traits share genetic factors.

  17. Transcriptomic analysis using olive varieties and breeding progenies identifies candidate genes involved in plant architecture

    Directory of Open Access Journals (Sweden)

    Juan José González Plaza

    2016-03-01

    Plant architecture is a critical trait in fruit crops that can significantly influence yield, pruning, planting density and harvesting. Little is known about how plant architecture is genetically determined in olive, where most of the existing varieties are traditional, with an architecture poorly suited for modern growing and harvesting systems. In the present study, we have carried out microarray analysis of meristematic tissue to compare expression profiles of olive varieties displaying differences in architecture, as well as seedlings from their cross pooled on the basis of shared architecture-related phenotypes. The microarray used, previously developed by our group, has already been applied to identify candidate genes involved in regulating the juvenile to adult transition in the shoot apex of seedlings. Varieties with distinct architecture phenotypes and individuals from segregating progenies displaying opposite architecture features were used to link phenotype to expression. Here, we identify 2,252 differentially expressed genes associated with differences in plant architecture. Microarray results were validated by quantitative RT-PCR carried out on genes with functional annotation likely related to plant architecture. Twelve of these genes were further analyzed in individual seedlings of the corresponding pool. We also examined Arabidopsis mutants in putative orthologs of these targeted candidate genes, finding altered architecture for most of them. This supports a functional conservation between species and the potential biological relevance of the candidate genes identified. This study is the first to identify genes associated with plant architecture in olive, and the results obtained could be of great help in future programs aimed at selecting phenotypes adapted to modern cultivation practices in this species.

  18. Transcriptomic Analysis Using Olive Varieties and Breeding Progenies Identifies Candidate Genes Involved in Plant Architecture.

    Science.gov (United States)

    González-Plaza, Juan J; Ortiz-Martín, Inmaculada; Muñoz-Mérida, Antonio; García-López, Carmen; Sánchez-Sevilla, José F; Luque, Francisco; Trelles, Oswaldo; Bejarano, Eduardo R; De La Rosa, Raúl; Valpuesta, Victoriano; Beuzón, Carmen R

    2016-01-01

    Plant architecture is a critical trait in fruit crops that can significantly influence yield, pruning, planting density and harvesting. Little is known about how plant architecture is genetically determined in olive, where most of the existing varieties are traditional, with an architecture poorly suited for modern growing and harvesting systems. In the present study, we have carried out microarray analysis of meristematic tissue to compare expression profiles of olive varieties displaying differences in architecture, as well as seedlings from their cross pooled on the basis of shared architecture-related phenotypes. The microarray used, previously developed by our group, has already been applied to identify candidate genes involved in regulating the juvenile to adult transition in the shoot apex of seedlings. Varieties with distinct architecture phenotypes and individuals from segregating progenies displaying opposite architecture features were used to link phenotype to expression. Here, we identify 2252 differentially expressed genes (DEGs) associated with differences in plant architecture. Microarray results were validated by quantitative RT-PCR carried out on genes with functional annotation likely related to plant architecture. Twelve of these genes were further analyzed in individual seedlings of the corresponding pool. We also examined Arabidopsis mutants in putative orthologs of these targeted candidate genes, finding altered architecture for most of them. This supports a functional conservation between species and the potential biological relevance of the candidate genes identified. This study is the first to identify genes associated with plant architecture in olive, and the results obtained could be of great help in future programs aimed at selecting phenotypes adapted to modern cultivation practices in this species.
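
    Calling differentially expressed genes between pooled groups generally reduces to a per-gene test with multiple-testing control. A minimal Python sketch under that assumption, using a Welch t-test and Benjamini-Hochberg correction on simulated expression values (the paper does not state that this exact test was used):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Illustrative log2 expression, genes x arrays, for two pools of
    # seedlings with opposite architecture phenotypes (simulated data)
    pool_a = rng.normal(8.0, 1.0, size=(5000, 4))
    pool_b = rng.normal(8.0, 1.0, size=(5000, 4))

    t, p = stats.ttest_ind(pool_a, pool_b, axis=1, equal_var=False)

    # Benjamini-Hochberg adjustment to control the false discovery rate
    n = len(p)
    order = np.argsort(p)
    ranked = p[order] * n / (np.arange(n) + 1)
    q = np.minimum.accumulate(ranked[::-1])[::-1]   # enforce monotonicity
    deg = np.zeros(n, dtype=bool)
    deg[order] = q < 0.05
    print(deg.sum(), "putative differentially expressed genes")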

  19. In-depth analysis of low abundant proteins in bovine colostrum using different fractionation techniques.

    Science.gov (United States)

    Nissen, Asger; Bendixen, Emøke; Ingvartsen, Klaus Lønne; Røntved, Christine Maria

    2012-09-01

    Bovine colostrum is well known for its large content of bioactive components and its importance for neonatal survival. Unfortunately, the colostrum proteome has a wide dynamic range, with a few dominating proteins that hamper the sensitivity and proteome coverage achieved for low-abundance proteins. Moreover, the composition of colostrum is complex, and the proteins are located within different physical fractions that make up the colostrum. To gain a more exhaustive picture of the bovine colostrum proteome and gather information on protein location, we performed an extensive pre-analysis fractionation of colostrum prior to 2D-LC-MS/MS analysis. Physical and chemical properties of the proteins and colostrum were used alone or in combination for the separation of proteins. ELISA was used to quantify and verify the presence of proteins in colostrum. In total, 403 proteins were identified in the nonfractionated colostrum (NF) and seven fractions (F1-F7) using six different fractionation techniques. Fractionation contributed 69 additional proteins in the fluid phase compared with NF. Different fractionation techniques each resulted in the detection of unique subsets of proteins. Whey production by high-speed centrifugation contributed most to the detection of low-abundance proteins. Hence, prefractionation of colostrum prior to 2D-LC-MS/MS analysis expanded our knowledge of the presence and location of low-abundance proteins in bovine colostrum. PMID:22848049

  20. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    Science.gov (United States)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods is made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.
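
    Image-analysis sizing of this kind typically starts from thresholding and connected-component measurement. A minimal Python sketch; the threshold, pixel scale, and bright-defect assumption are illustrative, and real OPC defect metrology also requires comparison against the reference pattern.

    import numpy as np
    from scipy import ndimage

    def size_defects(sem_image, threshold, nm_per_pixel):
        """Size bright defects in an SEM image by thresholding and
        measuring connected components."""
        mask = sem_image > threshold
        labels, n = ndimage.label(mask)          # connected defect regions
        areas_px = ndimage.sum(mask, labels, index=range(1, n + 1))
        # Report an equivalent circular diameter in nanometres per defect
        return 2.0 * np.sqrt(np.atleast_1d(areas_px) / np.pi) * nm_per_pixel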

  1. Co-expression Analysis Identifies CRC and AP1 as Regulators of Arabidopsis Fatty Acid Biosynthesis

    Institute of Scientific and Technical Information of China (English)

    Xinxin Han; Linlin Yin; Hongwei Xue

    2012-01-01

    Fatty acids (FAs) play crucial roles in signal transduction and plant development; however, the regulation of FA metabolism is still poorly understood. To study the relevant regulatory network, fifty-eight FA biosynthesis genes, including de novo synthases, desaturases and elongases, were selected as "guide genes" to construct the co-expression network. Calculation of the correlation between all Arabidopsis thaliana (L.) genes and each guide gene with the Arabidopsis co-expression data mining tools (ACT) identified 797 candidate FA-correlated genes. Gene ontology (GO) analysis of these co-expressed genes showed that they are tightly correlated with photosynthesis and carbohydrate metabolism, and function in many processes. Interestingly, 63 transcription factors (TFs) were identified as candidate FA biosynthesis regulators, and 8 TF families are enriched. Two TF genes, CRC and AP1, both correlating with 8 FA guide genes, were further characterized. Analyses of the ap1 and crc mutants showed altered total FA composition of mature seeds. The contents of palmitoleic acid, stearic acid, arachidic acid and eicosadienoic acid are decreased, whereas that of oleic acid is increased in ap1 and crc seeds, which is consistent with the qRT-PCR analysis revealing suppressed expression of the corresponding guide genes. In addition, yeast one-hybrid analysis and electrophoretic mobility shift assay (EMSA) revealed that CRC can bind to the promoter regions of KCS7 and KCS15, indicating that CRC may directly regulate FA biosynthesis.
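
    A guide-gene co-expression network of this kind rests on correlating every gene's expression profile with each guide gene. A minimal Python sketch on simulated data; the correlation threshold and gene indices are illustrative, and the ACT web tool itself is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(1)
    expr = rng.normal(size=(2000, 60))    # genes x microarrays (simulated)
    guide_idx = [10, 42, 77]              # FA guide genes (illustrative)

    # Pearson correlation of every gene against each guide gene
    z = (expr - expr.mean(1, keepdims=True)) / expr.std(1, keepdims=True)
    corr = z @ z[guide_idx].T / expr.shape[1]   # (n_genes, n_guides)

    # Call a gene FA-correlated if it is strongly co-expressed with at
    # least two guide genes (both cut-offs are illustrative)
    candidates = np.where((np.abs(corr) > 0.6).sum(axis=1) >= 2)[0]
    print(len(candidates), "candidate FA-correlated genes")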

  2. Identifying sleep apnea syndrome using heart rate and breathing effort variation analysis based on ballistocardiography.

    Science.gov (United States)

    Weichao Zhao; Hongbo Ni; Xingshe Zhou; Yalong Song; Tianben Wang

    2015-08-01

    Sleep apnea syndrome (SAS) is regarded as one of the most common sleep-related breathing disorders, which can severely affect sleep quality. Since SAS is usually accompanied by cyclical heart rate variation (HRV), many studies have been conducted on heart rate (HR) to identify it at an earlier stage. While most related work is based mainly on clinical devices or signals (e.g., polysomnography (PSG), electrocardiography (ECG)), in this paper we focus on the ballistocardiographic (BCG) signal, which is obtained in a non-invasive way. Moreover, as the precision and reliability of the BCG signal are not as good as those of PSG or ECG, we propose a fine-grained feature extraction and analysis approach for SAS recognition. Our analysis takes both the basic HRV features and the breathing effort variation into consideration during different sleep stages rather than over the whole night. The breathing effort refers to the mechanical interaction between respiration and the BCG signal when SAS events occur, which is independent of autonomic nervous system (ANS) modulations. Specifically, a novel method named STC-Min is presented to extract the breathing effort variation feature. The basic HRV features depict the ANS modulations on HR, and Sample Entropy and Detrended Fluctuation Analysis are applied for the evaluations. All the extracted features, along with personal factors, are fed into a knowledge-based support vector machine (KSVM) classification model, where the prior knowledge is based on the dataset distribution and domain knowledge. Experimental results on 42 subjects over 3 nights validate the effectiveness of the methods and features in identifying SAS (90.46% precision rate and 88.89% recall rate). PMID:26737303
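
    One of the feature families named above, Sample Entropy, can be computed directly from a beat-interval series. A minimal Python sketch; m = 2 and r = 0.2·SD are common defaults, and the series below is simulated rather than taken from BCG recordings.

    import numpy as np

    def sample_entropy(x, m=2, r=None):
        """Sample Entropy: -ln(A/B), where B counts template pairs of
        length m within tolerance r and A the same for length m + 1."""
        x = np.asarray(x, float)
        r = 0.2 * x.std() if r is None else r

        def matches(length):
            t = np.lib.stride_tricks.sliding_window_view(x, length)
            total = 0
            for i in range(len(t) - 1):
                # Chebyshev distance between template i and later ones
                d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
                total += np.count_nonzero(d <= r)
            return total

        B, A = matches(m), matches(m + 1)
        return np.inf if A == 0 else -np.log(A / B)

    rr = np.random.default_rng(2).normal(0.8, 0.05, 600)  # simulated intervals
    print(f"SampEn = {sample_entropy(rr):.3f}")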

  3. Analysis of activity in open-source communities using social network analysis techniques

    OpenAIRE

    Martínez Torres, María del Rocío

    2014-01-01

    The success of an open-source software project is closely linked to the successful organization and development of the underlying virtual community. In particular, participation is the most important mechanism by which the development of the project is supported. The main objective of this paper is to analyse the online participation in virtual communities using social network analysis techniques in order to obtain the main patterns of behaviour of users within communities. Sev...
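
    Participation patterns of this kind are usually quantified with centrality measures over an interaction graph. A minimal Python sketch with networkx on a toy reply network; the node names and edges are invented.

    import networkx as nx

    # Toy graph: an edge means one member replied to another on the
    # project's mailing list or issue tracker (invented data)
    G = nx.Graph([("ana", "bob"), ("ana", "eve"), ("bob", "eve"),
                  ("eve", "dan"), ("dan", "joe"), ("eve", "joe")])

    degree = nx.degree_centrality(G)            # breadth of interaction
    between = nx.betweenness_centrality(G)      # brokerage between groups

    # Core contributors tend to rank high on both measures
    for node in sorted(G, key=degree.get, reverse=True):
        print(f"{node:4s} degree={degree[node]:.2f} "
              f"betweenness={between[node]:.2f}")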

  4. Genome-wide analysis of over 106 000 individuals identifies 9 neuroticism-associated loci.

    Science.gov (United States)

    Smith, D J; Escott-Price, V; Davies, G; Bailey, M E S; Colodro-Conde, L; Ward, J; Vedernikov, A; Marioni, R; Cullen, B; Lyall, D; Hagenaars, S P; Liewald, D C M; Luciano, M; Gale, C R; Ritchie, S J; Hayward, C; Nicholl, B; Bulik-Sullivan, B; Adams, M; Couvy-Duchesne, B; Graham, N; Mackay, D; Evans, J; Smith, B H; Porteous, D J; Medland, S E; Martin, N G; Holmans, P; McIntosh, A M; Pell, J P; Deary, I J; O'Donovan, M C

    2016-06-01

    Neuroticism is a personality trait of fundamental importance for psychological well-being and public health. It is strongly associated with major depressive disorder (MDD) and several other psychiatric conditions. Although neuroticism is heritable, attempts to identify the alleles involved in previous studies have been limited by relatively small sample sizes. Here we report a combined meta-analysis of genome-wide association study (GWAS) of neuroticism that includes 91 370 participants from the UK Biobank cohort, 6659 participants from the Generation Scotland: Scottish Family Health Study (GS:SFHS) and 8687 participants from a Queensland Institute of Medical Research (QIMR) Berghofer Medical Research Institute cohort. All participants were assessed using the same neuroticism instrument, the Eysenck Personality Questionnaire-Revised Short Form (EPQ-R-S) Neuroticism scale. We found a single-nucleotide polymorphism-based heritability estimate for neuroticism of ∼15% (s.e.=0.7%). Meta-analysis identified nine novel loci associated with neuroticism. The strongest evidence for association was at a locus on chromosome 8 (P=1.5 × 10(-15)) spanning 4 Mb and containing at least 36 genes. Other associated loci included interesting candidate genes on chromosome 1 (GRIK3 (glutamate receptor ionotropic kainate 3)), chromosome 4 (KLHL2 (Kelch-like protein 2)), chromosome 17 (CRHR1 (corticotropin-releasing hormone receptor 1) and MAPT (microtubule-associated protein Tau)) and on chromosome 18 (CELF4 (CUGBP elav-like family member 4)). We found no evidence for genetic differences in the common allelic architecture of neuroticism by sex. By comparing our findings with those of the Psychiatric Genetics Consortia, we identified a strong genetic correlation between neuroticism and MDD and a less strong but significant genetic correlation with schizophrenia, although not with bipolar disorder. Polygenic risk scores derived from the primary UK Biobank sample captured

  5. Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis

    Science.gov (United States)

    Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter

    2013-04-01

    Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises of how to identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses. A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. In this respect, working at
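
    The cluster-based pattern recognition at the core of these five steps can be illustrated with k-means over standardized indicators. A Python sketch on simulated dryland indicators; the variable set, cluster count, and data are illustrative, and the paper's robustness, validation, and ranking steps are not reproduced.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # Simulated indicators per region, e.g. water availability, soil
    # degradation, income diversification, market access (illustrative)
    X = StandardScaler().fit_transform(rng.normal(size=(300, 4)))

    # Inertia for several k values helps choose a defensible partition
    for k in range(2, 8):
        print(k, KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_)

    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
    # Each cluster centre is a candidate vulnerability pattern: a typical
    # combination of indicator values shared by the regions it groups
    print(np.round(km.cluster_centers_, 2))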

  6. Supervised accelerometry analysis can identify prey capture by penguins at sea.

    Science.gov (United States)

    Carroll, Gemma; Slip, David; Jonsen, Ian; Harcourt, Rob

    2014-12-15

    Determining where, when and how much animals eat is fundamental to understanding their ecology. We developed a technique to identify a prey capture signature for little penguins from accelerometry, in order to quantify food intake remotely. We categorised behaviour of captive penguins from HD video and matched this to time-series data from back-mounted accelerometers. We then trained a support vector machine (SVM) to classify the penguins' behaviour at 0.3 s intervals as either 'prey handling' or 'swimming'. We applied this model to accelerometer data collected from foraging wild penguins to identify prey capture events. We compared prey capture and non-prey capture dives to test the model predictions against foraging theory. The SVM had an accuracy of 84.95±0.26% (mean ± s.e.) and a false positive rate of 9.82±0.24% when tested on unseen captive data. For wild data, we defined three independent, consecutive prey handling observations as representing true prey capture, with a false positive rate of 0.09%. Dives with prey captures had longer duration and bottom times, were deeper, had faster ascent rates, and had more 'wiggles' and 'dashes' (proxies for prey encounter used in other studies). The mean (±s.e.) number of prey captures per foraging trip was 446.6±66.28. By recording the behaviour of captive animals on HD video and using a supervised machine learning approach, we show that accelerometry signatures can classify the behaviour of wild animals at unprecedentedly fine scales.
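
    The classification step can be sketched with a radial-basis SVM over windowed accelerometer features, followed by the consecutive-window filter the authors describe. The feature set and data below are simulated placeholders, not the study's recordings.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    X = rng.normal(size=(1000, 7))      # features per 0.3 s window (simulated)
    y = rng.integers(0, 2, size=1000)   # 1 = 'prey handling', 0 = 'swimming'

    svm = SVC(kernel="rbf", C=1.0, gamma="scale")
    print("CV accuracy:", cross_val_score(svm, X, y, cv=5).mean())

    def captures(labels, run=3):
        """Count capture events: require `run` consecutive positive
        windows, mirroring the paper's three-observation rule."""
        count = streak = 0
        for v in labels:
            streak = streak + 1 if v == 1 else 0
            if streak == run:
                count, streak = count + 1, 0
        return count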

  7. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  8. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    Botnet phenomenon in smartphones is evolving with the proliferation in mobile phone technologies after leaving a considerable impact on personal computers. It refers to the network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework that can generate a mobile botnet learning model using the back-propagation method of artificial neural networks. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results show that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, achieving 99.49% accuracy. Further, manual inspection of the botnet dataset revealed interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is provided as a benchmark for future studies.
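
    The final classification stage can be illustrated with a plain logistic regression over dynamic-analysis features. A Python sketch on simulated feature vectors; the feature semantics are assumptions, and the paper's ANN-based model-building stage is not reproduced.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(5)
    # Simulated per-app features, e.g. network call counts, SMS sends,
    # premium-number contacts, permission flags (all illustrative)
    X = rng.normal(size=(600, 12))
    w = rng.normal(size=12)
    y = (X @ w + rng.normal(0, 0.5, 600) > 0).astype(int)  # 1 = botnet-like

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))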

  9. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    Botnet phenomenon in smartphones is evolving with the proliferation in mobile phone technologies after leaving a considerable impact on personal computers. It refers to the network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework that can generate a mobile botnet learning model using the back-propagation method of artificial neural networks. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results show that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, achieving 99.49% accuracy. Further, manual inspection of the botnet dataset revealed interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is provided as a benchmark for future studies.

  11. Appropriate chicken sample size for identifying the composition of broiler intestinal microbiota affected by dietary antibiotics, using the polymerase chain reaction-denaturing gradient gel electrophoresis technique.

    Science.gov (United States)

    Zhou, H; Gong, J; Brisbin, J T; Yu, H; Sanei, B; Sabour, P; Sharif, S

    2007-12-01

    The bacterial microbiota in the broiler gastrointestinal tract are crucial for chicken health and growth. Their composition can vary among individual birds. To evaluate the composition of chicken microbiota in response to environmental disruption accurately, 4 different pools made up of 2, 5, 10, and 15 individuals were used to determine how many individuals in each pool were required to assess the degree of variation when using the PCR-denaturing gradient gel electrophoresis (DGGE) profiling technique. The correlation coefficients among 3 replicates within each pool group indicated that the optimal sample size for comparing PCR-DGGE bacterial profiles and downstream applications (such as identifying treatment effects) was 5 birds per pool for cecal microbiota. Subsequently, digesta from 5 birds was pooled to investigate the effects on the microbiota composition of the 2 most commonly used dietary antibiotics (virginiamycin and bacitracin methylene disalicylate) at 2 different doses by using PCR-DGGE, DNA sequencing, and quantitative PCR techniques. Thirteen DGGE DNA bands were identified, representing bacterial groups that had been affected by the antibiotics. Nine of them were validated. The effect of dietary antibiotics on the microbiota composition appeared to be dose and age dependent. These findings provide a working model for elucidating the mechanisms of antibiotic effects on the chicken intestinal microbiota and for developing alternatives to dietary antibiotics. PMID:18029800

  12. COMPARISON AND ANALYSIS OF VARIOUS HISTOGRAM EQUALIZATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Madki, M. R.

    2012-04-01

    The intensity histogram gives information which can be used for contrast enhancement. Histogram equalization may flatten the histogram onto fewer grey levels than are available, which can deteriorate the image. This problem can be overcome by various techniques. This paper gives a comparative analysis of the Bi-Histogram Equalization, Recursive Mean-Separate Histogram Equalization, Multipeak Histogram Equalization and Brightness Preserving Dynamic Histogram Equalization techniques by applying them to a few standard test images. The bi-histogram method applies independent equalization over two separate sub-images. The recursive method uses several sub-images. Multipeak Histogram Equalization detects the peaks in the histogram, and the sub-images are formed based on the number detected. Brightness Preserving Dynamic Histogram Equalization improves contrast while maintaining the brightness of the image. We compare the results through the metrics of absolute mean brightness error (AMBE) and peak signal-to-noise ratio (PSNR).
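
    Global histogram equalization and the mean-split idea behind the bi-histogram variant are compact enough to sketch directly. A minimal NumPy version; edge cases such as constant images are ignored.

    import numpy as np

    def equalize(img):
        """Global histogram equalization of an 8-bit grayscale image."""
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = hist.cumsum()
        cdf_min = cdf[cdf > 0][0]
        # Map each grey level through the normalized cumulative histogram
        lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
        return lut.astype(np.uint8)[img]

    def bi_histogram_equalize(img):
        """Equalize the sub-images below and above the mean separately,
        which preserves mean brightness (the bi-histogram idea)."""
        mean = int(img.mean())
        out = img.copy()
        for mask, lo, hi in ((img <= mean, 0, mean), (img > mean, mean + 1, 255)):
            vals = img[mask]
            if vals.size == 0:
                continue
            cdf = np.bincount(vals - lo, minlength=hi - lo + 1).cumsum()
            lut = np.round(cdf / cdf[-1] * (hi - lo) + lo)
            out[mask] = lut.astype(np.uint8)[vals - lo]
        return out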

  13. Experimental Analysis of Small Scale PCB Manufacturing Techniques for Fablabs

    Directory of Open Access Journals (Sweden)

    Yannick Verbelen

    2013-04-01

    In this paper we present a complete modular PCB manufacturing process at fablab scale that is compliant with current PCB manufacturing standards. This includes, but is not limited to, a minimum track width of 8 mil, a minimum clearance of 6 mil, plated and non-plated holes, a solder resist, surface finish and component overlay. We modularize industrial manufacturing processes and discuss advantages and disadvantages of production techniques for every phase. We then proceed to discuss the relevance and added value of every phase in the manufacturing process and their usefulness in a fablab context. Production techniques are evaluated regarding complexity, overhead, safety, required time, and environmental concerns. To ensure practical feasibility of the presented techniques, the manufacturing process was benchmarked in FablabXL; the paper aims to be a practical reference for implementing or extending PCB manufacturing activities in fablabs.

  14. Application of Spectral Techniques in the Analysis of Multi-sensor AOD Data

    Science.gov (United States)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2013-12-01

    With the abundance of satellite observations of aerosol properties, the reliability of satellite data in representing global and regional aerosol variability, especially those related to major aerosol types and physical processes, becomes an essential question. Different data sets do not always agree due to differences in their measurement and retrieval strategies. Ground measurements such as AERONET are frequently used to validate satellite data. However, the sparse sampling of station measurements introduces representation errors when being compared with satellite data. Our study aims at assessing the performance of different satellite data sets in representing aerosol variability using spectral decomposition techniques. These techniques reduce data dimension and allow the examination of both spatial and temporal variability focusing on specific aerosol source regions and events. Specifically, Combined Principal Component Analysis (CPCA) is performed on Aqua MODIS, MISR, SeaWiFS and OMI AOD products to compare their common modes of variability both qualitatively and quantitatively. Moreover, we introduce Maximum Covariance Analysis (MCA) as an effective way to compare correlated spatial and temporal patterns between satellite measurements and AERONET data. This method well accommodates measurements at isolated locations for which CPCA is not applicable. Finally, we use a novel approach by combining the two techniques, i.e., MCA analysis of the combined satellite field and AERONET. By relating all available measurements, this analysis verifies and confirms observed aerosol physical processes, better assesses each data set against AERONET, and identifies regions with larger discrepancy. Our results suggest that all four data sets agree reasonably well with AERONET in capturing globally dominant aerosol variability, including dust, biomass burning and urban industrial aerosol regimes. MISR appears to agree best with AERONET despite its narrow swath and low sampling
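
    Maximum Covariance Analysis reduces to a singular value decomposition of the cross-covariance between two anomaly fields. A minimal Python sketch on simulated satellite and station anomalies; the dimensions and data are illustrative.

    import numpy as np

    rng = np.random.default_rng(6)
    sat = rng.normal(size=(120, 500))   # months x grid cells (simulated AOD)
    aer = rng.normal(size=(120, 30))    # months x AERONET stations (simulated)
    sat -= sat.mean(0)                  # work with anomalies
    aer -= aer.mean(0)

    # SVD of the cross-covariance yields paired patterns that maximize
    # covariance between the two fields
    C = sat.T @ aer / (len(sat) - 1)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)

    sat_ts = sat @ U[:, 0]              # expansion coefficients of mode 1
    aer_ts = aer @ Vt[0]
    frac = s[0]**2 / np.sum(s**2)       # squared covariance fraction
    print(f"mode 1: SCF={frac:.1%}, r={np.corrcoef(sat_ts, aer_ts)[0, 1]:.2f}")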

  15. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, the data mining method has disadvantages as well as advantages. Therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to examine how suitable data mining methods can be applied to improve on conventional methods. Moreover, in an experiment, association rules are employed to mine rules for trusted customers using sales data from the supermarket industry.
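
    Association rule mining over baskets reduces to counting itemset support and rule confidence. A toy Python sketch; the transactions and thresholds are invented, and production analyses typically use an Apriori or FP-growth implementation.

    from itertools import combinations
    from collections import Counter

    baskets = [{"milk", "bread", "eggs"}, {"milk", "bread"},
               {"bread", "eggs"}, {"milk", "eggs"},
               {"milk", "bread", "eggs"}, {"bread"}]

    def rules(baskets, min_support=0.3, min_confidence=0.6):
        n = len(baskets)
        items, pairs = Counter(), Counter()
        for b in baskets:
            items.update(b)
            pairs.update(combinations(sorted(b), 2))
        for (a, c), cnt in pairs.items():
            support = cnt / n                  # fraction containing both
            for lhs, rhs in ((a, c), (c, a)):
                confidence = cnt / items[lhs]  # P(rhs | lhs)
                if support >= min_support and confidence >= min_confidence:
                    yield lhs, rhs, support, confidence

    for lhs, rhs, sup, conf in rules(baskets):
        print(f"{lhs} -> {rhs}  support={sup:.2f} confidence={conf:.2f}")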

  16. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. Using winter data from Shanghai, we show how to use the technique to analyze fire risk.
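
    The normal information diffusion idea, spreading each crisp observation over a discrete universe so that a small sample yields a smooth frequency estimate, can be sketched in a few lines. The loss values, monitoring points, and diffusion coefficient below are illustrative.

    import numpy as np

    def diffuse(observations, points, h):
        """Spread each observation over the monitoring points with a
        normal kernel; each sample carries a total weight of one."""
        obs = np.asarray(observations, float)[:, None]
        u = np.asarray(points, float)[None, :]
        q = np.exp(-(obs - u) ** 2 / (2.0 * h ** 2))
        q /= q.sum(axis=1, keepdims=True)
        return q.sum(axis=0) / len(observations)   # estimated frequencies

    losses = [2.1, 3.4, 1.8, 5.2, 2.9]        # yearly fire losses (toy data)
    u = np.linspace(0.0, 8.0, 33)             # monitoring points
    p = diffuse(losses, u, h=0.8)             # h: diffusion coefficient

    exceed = p[::-1].cumsum()[::-1]           # exceedance probabilities
    print(f"P(loss >= 4) ~ {exceed[u >= 4][0]:.2f}")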

  17. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987, in order to assess the present technical status of nuclear borehole logging techniques and to identify well-established applications and development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  18. DATA MINING TECHNIQUES: A SOURCE FOR CONSUMER BEHAVIOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    Abhijit Raorane

    2011-09-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, the data mining method has disadvantages as well as advantages. Therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to examine how suitable data mining methods can be applied to improve on conventional methods. Moreover, in an experiment, association rules are employed to mine rules for trusted customers using sales data from the supermarket industry.

  19. [An analysis of key points for root canal therapy technique].

    Science.gov (United States)

    Fan, M W

    2016-08-01

    The success rate of root canal therapy (RCT) has improved continuously along with advances in RCT techniques over the past several decades. If the standard procedures of modern RCT techniques are strictly followed, the success rate of RCT may exceed 90%. The success of RCT is mainly affected by such factors as a clear understanding of root canal anatomy, proper mechanical and chemical preparation, and complete filling of the root canal system. If these factors are given sufficient attention, success is readily achieved. Even if the primary RCT fails, retreatment can be conducted to save the diseased tooth. PMID:27511032

  20. Pathway Analysis Incorporating Protein-Protein Interaction Networks Identified Candidate Pathways for the Seven Common Diseases.

    Science.gov (United States)

    Lin, Peng-Lin; Yu, Ya-Wen; Chung, Ren-Hua

    2016-01-01

    Pathway analysis has become popular as a secondary analysis strategy for genome-wide association studies (GWAS). Most of the current pathway analysis methods aggregate signals from the main effects of single nucleotide polymorphisms (SNPs) in genes within a pathway without considering the effects of gene-gene interactions. However, gene-gene interactions can also have critical effects on complex diseases. Protein-protein interaction (PPI) networks have been used to define gene pairs for the gene-gene interaction tests. Incorporating the PPI information to define gene pairs for interaction tests within pathways can increase the power for pathway-based association tests. We propose a pathway association test, which aggregates the interaction signals in PPI networks within a pathway, for GWAS with case-control samples. Gene size is properly considered in the test so that genes do not contribute more to the test statistic simply due to their size. Simulation studies were performed to verify that the method is a valid test and can have more power than other pathway association tests in the presence of gene-gene interactions within a pathway under different scenarios. We applied the test to the Wellcome Trust Case Control Consortium GWAS datasets for seven common diseases. The most significant pathway is the chaperones modulate interferon signaling pathway for Crohn's disease (p-value = 0.0003). The pathway modulates interferon gamma, which induces the JAK/STAT pathway that is involved in Crohn's disease. Several other pathways that have functional implications for the seven diseases were also identified. The proposed test based on gene-gene interaction signals in PPI networks can be used as a complementary tool to the current existing pathway analysis methods focusing on main effects of genes. An efficient software implementing the method is freely available at http://puppi.sourceforge.net. PMID:27622767
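
    The aggregation idea, scoring a pathway over its PPI-linked gene pairs while correcting for size, might be sketched as follows. This is an illustrative scoring scheme, not the authors' exact statistic; in practice significance would come from permuting case-control labels.

    import numpy as np

    def pathway_interaction_score(pair_stats, pathway_genes, ppi_edges):
        """Average gene-gene interaction statistics over PPI-linked pairs
        inside a pathway. `pair_stats` maps a sorted gene pair to a
        gene-level interaction test statistic computed elsewhere."""
        pathway_genes = set(pathway_genes)
        pairs = [tuple(sorted(e)) for e in ppi_edges
                 if e[0] in pathway_genes and e[1] in pathway_genes]
        stats = [pair_stats[p] for p in pairs if p in pair_stats]
        # Averaging (not summing) keeps large pathways from scoring high
        # merely because they contain more testable pairs
        return (np.mean(stats) if stats else 0.0), len(stats)

    score, n_pairs = pathway_interaction_score(
        {("A", "B"): 7.1, ("B", "C"): 2.4},
        pathway_genes={"A", "B", "C"},
        ppi_edges=[("A", "B"), ("B", "C"), ("C", "D")])
    print(score, n_pairs)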

  1. Shared Genetic Etiology between Type 2 Diabetes and Alzheimer's Disease Identified by Bioinformatics Analysis.

    Science.gov (United States)

    Gao, Lei; Cui, Zhen; Shen, Liang; Ji, Hong-Fang

    2015-01-01

    Type 2 diabetes (T2D) and Alzheimer's disease (AD) are two major health issues, and increasing evidence in recent years supports the close connection between these two diseases. The present study aimed to explore the shared genetic etiology underlying T2D and AD based on the available genome wide association studies (GWAS) data collected through August 2014. We performed bioinformatics analyses based on GWAS data of T2D and AD on single nucleotide polymorphisms (SNPs), gene, and pathway levels, respectively. Six SNPs (rs111789331, rs12721046, rs12721051, rs4420638, rs56131196, and rs66626994) were identified for the first time to be shared genetic factors between T2D and AD. Further functional enrichment analysis found lipid metabolism related pathways to be common between these two disorders. The findings may have important implications for future mechanistic and interventional studies for T2D and AD. PMID:26639962

  2. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2013-01-01

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm. PMID:23686143
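
    The general idea of checking reported timings for tampering can be caricatured in a few lines. This is a deliberately simplified illustration, not the paper's ∆T Vector algorithm; the tolerance and the consistency rule are invented.

    import statistics

    def flag_time_tampering(reports, rtts, tolerance=0.2):
        """reports: {node: reported processing times, one per route
        discovery}; rtts: round-trip times measured by the source for
        the same discoveries. Flags nodes whose reported changes do not
        track genuine RTT changes."""
        rtt_deltas = [b - a for a, b in zip(rtts, rtts[1:])]
        suspicious = []
        for node, times in reports.items():
            deltas = [b - a for a, b in zip(times, times[1:])]
            divergence = statistics.mean(
                abs(d - r) for d, r in zip(deltas, rtt_deltas))
            if divergence > tolerance * statistics.mean(rtts):
                suspicious.append(node)
        return suspicious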

  3. Identifying Time Measurement Tampering in the Traversal Time and Hop Count Analysis (TTHCA Wormhole Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Jonny Karlsson

    2013-05-01

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  4. Identification and Analysis of Scene Mining Methods Based on Scene-Extracted Features

    CERN Document Server

    Jabari, Ashraf Sadat

    2012-01-01

    Scene mining is a subset of image mining in which scenes are classified into a distinct set of classes based on analysis of their content. In other words, in scene mining a label is given to the visual content of a scene, for example, mountain or beach. Scene mining is used in applications such as medicine, film, information retrieval, computer vision, and traffic scene recognition. A review of published methods shows that there are various approaches to scene mining. The breadth of scene mining applications and the variety of scenes make comparing methods difficult. Scene mining can be studied by identifying its components and presenting a framework for analyzing and evaluating methods. In this paper, the components of scene mining are first introduced; then a framework based on the extracted features of scenes is presented to classify scene mining methods. Finally, these methods are analyzed and evaluated via the proposed framework.

  5. Using Principal Component Analysis to Identify Priority Neighbourhoods for Health Services Delivery by Ranking Socioeconomic Status

    Science.gov (United States)

    Friesen, Christine Elizabeth; Seliske, Patrick; Papadopoulos, Andrew

    2016-01-01

    Objectives. Socioeconomic status (SES) is a comprehensive indicator of health status and is useful in area-level health research and informing public health resource allocation. Principal component analysis (PCA) is a useful tool for developing SES indices to identify area-level disparities in SES within communities. While SES research in Canada has relied on census data, the voluntary nature of the 2011 National Household Survey challenges the validity of its data, especially income variables. This study sought to determine the appropriateness of replacing census income information with tax filer data in neighbourhood SES index development. Methods. Census and taxfiler data for Guelph, Ontario were retrieved for the years 2005, 2006, and 2011. Data were extracted for eleven income and non-income SES variables. PCA was employed to identify significant principal components from each dataset and weights of each contributing variable. Variable-specific factor scores were applied to standardized census and taxfiler data values to produce SES scores. Results. The substitution of taxfiler income variables for census income variables yielded SES score distributions and neighbourhood SES classifications that were similar to SES scores calculated using entirely census variables. Combining taxfiler income variables with census non-income variables also produced clearer SES level distinctions. Internal validation procedures indicated that utilizing multiple principal components produced clearer SES level distinctions than using only the first principal component. Conclusion. Identifying socioeconomic disparities between neighbourhoods is an important step in assessing the level of disadvantage of communities. The ability to replace census income information with taxfiler data to develop SES indices expands the versatility of public health research and planning in Canada, as more data sources can be explored. The apparent usefulness of PCA also contributes to the improvement
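
    The index construction described, PCA on standardized indicators followed by variance-weighted component scores, can be sketched as follows. The indicator count, weighting rule, and data are illustrative, and the paper's exact scoring may differ.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    # Simulated neighbourhood indicators, e.g. median income, unemployment
    # rate, % low education, % lone-parent households (illustrative)
    Z = StandardScaler().fit_transform(rng.normal(size=(50, 11)))

    pca = PCA().fit(Z)
    scores = pca.transform(Z)

    # Keep the components covering ~70% of variance and weight each by
    # its explained-variance share (one common choice, not the only one)
    k = int(np.searchsorted(np.cumsum(pca.explained_variance_ratio_), 0.7)) + 1
    w = pca.explained_variance_ratio_[:k]
    ses = scores[:, :k] @ (w / w.sum())

    # Rank neighbourhoods into quintiles of socioeconomic status
    quintile = np.digitize(ses, np.quantile(ses, [0.2, 0.4, 0.6, 0.8]))
    print(quintile)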

  6. Identifying the potential loss of monitoring wells using an uncertainty analysis.

    Science.gov (United States)

    Freedman, Vicky L; Waichler, Scott R; Cole, Charles R; Vermeul, Vince R; Bergeron, Marcel P

    2005-01-01

    From the mid-1940s through the 1980s, large volumes of waste water were discharged at the Hanford Site in southeastern Washington State, causing a large-scale rise (>20 m) in the water table. When waste water discharges ceased in 1988, ground water mounds began to dissipate. This caused a large number of wells to go dry and has made it difficult to monitor contaminant plume migration. To identify monitoring wells that will need replacement, a methodology has been developed using a first-order uncertainty analysis with UCODE, a nonlinear parameter estimation code. Using a three-dimensional, finite-element ground water flow code, key parameters were identified by calibrating to historical hydraulic head data. Results from the calibration period were then used to check model predictions by comparing monitoring wells' wet/dry status with field data. This status was analyzed using a methodology that incorporated the 0.3 cumulative probability derived from the confidence and prediction intervals. For comparison, a nonphysically based trend model was also used as a predictor of wells' wet/dry status. Although the numerical model outperformed the trend model, for both models, the central value of the intervals was a better predictor of a wet well status. The prediction interval, however, was more successful at identifying dry wells. Predictions made through the year 2048 indicated that 46% of the wells in the monitoring well network are likely to go dry in areas near the river and where the ground water mound is dissipating. PMID:16324012

  7. Predictors of Extubation Failure in Neurocritical Patients Identified by a Systematic Review and Meta-Analysis

    Science.gov (United States)

    Huang, Kaibin; Lin, Zhenzhou; Qiao, Weiguang; Pan, Suyue

    2014-01-01

    Background Prediction of extubation failure, particularly in neurocritical patients, is unique and controversial. We conducted a systematic review and meta-analysis to identify the risk factors for extubation failure in these patients. Methods A literature search of databases (MEDLINE, EMBASE, the Cochrane Library, and Web of Science) was performed up to August of 2013 to identify trials that evaluated extubation failure predictors. Included trials were either prospective or retrospective cohort studies. Results Nine studies involving 928 participants were included. The systematic review and meta-analysis revealed that the following were predictive for extubation failure: pneumonia, atelectasis, mechanical ventilation of >24 h, a low Glasgow Coma Scale score (7–9T) (OR = 4.96, 95% CI = 1.61–15.26, P = 0.005), the inability to follow commands (OR = 2.07, 95% CI = 1.15–3.71, P = 0.02), especially the command to close the eyes, thick secretion, and no intact gag reflex. Meanwhile, the following were not predictive for extubation failure: sex, secretion volume, coughing upon suctioning, and the inability to follow one command among showing two fingers, wiggling the toes, or coughing on command. Additionally, some traditional weaning parameters were shown to poorly predict extubation failure in neurocritical patients. Conclusions Besides pneumonia, atelectasis, and the duration of mechanical ventilation, other factors that should be taken into consideration in the prediction of extubation failure when neurocritical patients are weaned from tracheal intubation include neurologic abilities (Glasgow Coma Scale score and following commands), the secretion texture, and the presence of a gag reflex. PMID:25486091

  8. Deep Proteome Analysis Identifies Age-Related Processes in C. elegans.

    Science.gov (United States)

    Narayan, Vikram; Ly, Tony; Pourkarimi, Ehsan; Murillo, Alejandro Brenes; Gartner, Anton; Lamond, Angus I; Kenyon, Cynthia

    2016-08-01

    Effective network analysis of protein data requires high-quality proteomic datasets. Here, we report a near doubling in coverage of the C. elegans adult proteome, identifying >11,000 proteins in total with ∼9,400 proteins reproducibly detected in three biological replicates. Using quantitative mass spectrometry, we identify proteins whose abundances vary with age, revealing a concerted downregulation of proteins involved in specific metabolic pathways and upregulation of cellular stress responses with advancing age. Among these are ∼30 peroxisomal proteins, including the PRX-5/PEX5 import protein. Functional experiments confirm that protein import into the peroxisome is compromised in vivo in old animals. We also studied the behavior of the set of age-variant proteins in chronologically age-matched, long-lived daf-2 insulin/IGF-1-pathway mutants. Unexpectedly, the levels of many of these age-variant proteins did not scale with extended lifespan. This indicates that, despite their youthful appearance and extended lifespans, not all aspects of aging are reset in these long-lived mutants. PMID:27453442

  9. Principal components analysis based methodology to identify differentially expressed genes in time-course microarray data

    Directory of Open Access Journals (Sweden)

    Srinivasan Rajagopalan

    2008-06-01

    Abstract. Background: Time-course microarray experiments are being increasingly used to characterize dynamic biological processes. In these experiments, the goal is to identify genes differentially expressed in time-course data measured between different biological conditions. These differentially expressed genes can reveal the changes in biological process due to the change in condition, which is essential to understand differences in dynamics. Results: In this paper, we propose a novel method for finding differentially expressed genes in time-course data and across biological conditions (say C1 and C2). We model the expression at C1 using Principal Component Analysis and represent the expression profile of each gene as a linear combination of the dominant Principal Components (PCs). Then the expression data from C2 are projected onto the developed PCA model and scores are extracted. The difference between the scores is evaluated using a hypothesis test to quantify the significance of differential expression. We evaluate the proposed method to understand differences in two case studies: (1) the heat shock response of wild-type and HSF1 knockout mice, and (2) the cell cycle of wild-type and Fkh1/Fkh2 knockout yeast strains. Conclusion: In both cases, the proposed method identified biologically significant genes.
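
    The core of the method, fitting PCA to one condition and projecting the other onto it, is straightforward to sketch. The per-gene test below is a simple z-test stand-in rather than necessarily the paper's exact hypothesis test, and all data are simulated.

    import numpy as np
    from sklearn.decomposition import PCA
    from scipy import stats

    rng = np.random.default_rng(8)
    c1 = rng.normal(size=(1500, 10))                 # genes x time points, C1
    c2 = c1 + rng.normal(0, 0.1, size=(1500, 10))    # C2, mostly unchanged
    c2[:50] += rng.normal(0, 2.0, size=(50, 10))     # 50 genes truly altered

    pca = PCA(n_components=3).fit(c1)   # dominant temporal patterns at C1
    s1 = pca.transform(c1)              # each gene as a combination of PCs
    s2 = pca.transform(c2)              # project C2 onto the C1 model

    # Score shifts far outside the C1 score spread indicate differential
    # expression; Bonferroni-correct over the retained components
    z = (s2 - s1) / s1.std(axis=0)
    p = 2 * stats.norm.sf(np.abs(z)).min(axis=1) * z.shape[1]
    print(np.sum(p < 0.01), "genes flagged as differentially expressed")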

  10. Analysis of Pigeon (Columba) Ovary Transcriptomes to Identify Genes Involved in Blue Light Regulation.

    Directory of Open Access Journals (Sweden)

    Ying Wang

    Monochromatic light is widely applied to promote poultry reproductive performance, yet little is currently known regarding the mechanism by which light wavelengths affect pigeon reproduction. Recently, high-throughput sequencing technologies have been used to provide genomic information for solving this problem. In this study, we employed Illumina Hiseq 2000 to identify differentially expressed genes in ovary tissue from pigeons under blue and white light conditions and de novo transcriptome assembly to construct a comprehensive sequence database containing information on the mechanisms of follicle development. A total of 157,774 unigenes (mean length: 790 bp) were obtained by the Trinity program, and 35.83% of these unigenes were matched to genes in a non-redundant protein database. Gene description, gene ontology, and the clustering of orthologous group terms were performed to annotate the transcriptome assembly. Differentially expressed genes between blue and white light conditions included those related to oocyte maturation, hormone biosynthesis, and circadian rhythm. Furthermore, 17,574 SSRs and 533,887 potential SNPs were identified in this transcriptome assembly. This work is the first transcriptome analysis of the Columba ovary using Illumina technology, and the resulting transcriptome and differentially expressed gene data can facilitate further investigations into the molecular mechanism of the effect of blue light on follicle development and reproduction in pigeons and other bird species.

  11. Epigenome-Wide Association Analysis Identified Nine Skin DNA Methylation Loci for Psoriasis.

    Science.gov (United States)

    Zhou, Fusheng; Wang, Wenjun; Shen, Changbing; Li, Hui; Zuo, Xianbo; Zheng, Xiaodong; Yue, Min; Zhang, Cuicui; Yu, Liang; Chen, Mengyun; Zhu, Caihong; Yin, Xianyong; Tang, Mingjun; Li, Yongjiang; Chen, Gang; Wang, Zaixing; Liu, Shengxiu; Zhou, Yi; Zhang, Fengyu; Zhang, Weijia; Li, Caihua; Yang, Sen; Sun, Liangdan; Zhang, Xuejun

    2016-04-01

    Psoriasis is a chronic hyperproliferative and inflammatory skin disease caused by the interplay of genetic and environmental factors. DNA methylation has been linked to psoriasis, but the manner in which this process contributes to the disease is not fully understood. In this study, we carried out a three-stage epigenome-wide association study to identify disease-associated differentially methylated sites using a combination of 262 skin and 48 peripheral blood mononuclear cell samples. We not only revealed genome-wide methylation patterns for psoriasis but also identified strong associations between the skin-specific DNA methylation of nine disease-associated differentially methylated sites and psoriasis (Wilcoxon rank test, Bonferroni-corrected P < 0.10). Further analysis revealed that these nine disease-associated differentially methylated sites were not significantly affected by genetic variations, supporting their remarkable contributions to disease status. The expression of CYP2S1, ECE1, EIF2C2, MAN1C1, and DLGAP4 was negatively correlated with DNA methylation. These findings will help us to better understand the molecular mechanism of psoriasis. PMID:26743604

  13. GWAS meta-analysis and replication identifies three new susceptibility loci for ovarian cancer.

    Science.gov (United States)

    Pharoah, Paul D P; Tsai, Ya-Yu; Ramus, Susan J; Phelan, Catherine M; Goode, Ellen L; Lawrenson, Kate; Buckley, Melissa; Fridley, Brooke L; Tyrer, Jonathan P; Shen, Howard; Weber, Rachel; Karevan, Rod; Larson, Melissa C; Song, Honglin; Tessier, Daniel C; Bacot, François; Vincent, Daniel; Cunningham, Julie M; Dennis, Joe; Dicks, Ed; Aben, Katja K; Anton-Culver, Hoda; Antonenkova, Natalia; Armasu, Sebastian M; Baglietto, Laura; Bandera, Elisa V; Beckmann, Matthias W; Birrer, Michael J; Bloom, Greg; Bogdanova, Natalia; Brenton, James D; Brinton, Louise A; Brooks-Wilson, Angela; Brown, Robert; Butzow, Ralf; Campbell, Ian; Carney, Michael E; Carvalho, Renato S; Chang-Claude, Jenny; Chen, Y Anne; Chen, Zhihua; Chow, Wong-Ho; Cicek, Mine S; Coetzee, Gerhard; Cook, Linda S; Cramer, Daniel W; Cybulski, Cezary; Dansonka-Mieszkowska, Agnieszka; Despierre, Evelyn; Doherty, Jennifer A; Dörk, Thilo; du Bois, Andreas; Dürst, Matthias; Eccles, Diana; Edwards, Robert; Ekici, Arif B; Fasching, Peter A; Fenstermacher, David; Flanagan, James; Gao, Yu-Tang; Garcia-Closas, Montserrat; Gentry-Maharaj, Aleksandra; Giles, Graham; Gjyshi, Anxhela; Gore, Martin; Gronwald, Jacek; Guo, Qi; Halle, Mari K; Harter, Philipp; Hein, Alexander; Heitz, Florian; Hillemanns, Peter; Hoatlin, Maureen; Høgdall, Estrid; Høgdall, Claus K; Hosono, Satoyo; Jakubowska, Anna; Jensen, Allan; Kalli, Kimberly R; Karlan, Beth Y; Kelemen, Linda E; Kiemeney, Lambertus A; Kjaer, Susanne Krüger; Konecny, Gottfried E; Krakstad, Camilla; Kupryjanczyk, Jolanta; Lambrechts, Diether; Lambrechts, Sandrina; Le, Nhu D; Lee, Nathan; Lee, Janet; Leminen, Arto; Lim, Boon Kiong; Lissowska, Jolanta; Lubiński, Jan; Lundvall, Lene; Lurie, Galina; Massuger, Leon F A G; Matsuo, Keitaro; McGuire, Valerie; McLaughlin, John R; Menon, Usha; Modugno, Francesmary; Moysich, Kirsten B; Nakanishi, Toru; Narod, Steven A; Ness, Roberta B; Nevanlinna, Heli; Nickels, Stefan; Noushmehr, Houtan; Odunsi, Kunle; Olson, Sara; Orlow, Irene; Paul, James; Pejovic, Tanja; Pelttari, Liisa M; Permuth-Wey, Jenny; Pike, Malcolm C; Poole, Elizabeth M; Qu, Xiaotao; Risch, Harvey A; Rodriguez-Rodriguez, Lorna; Rossing, Mary Anne; Rudolph, Anja; Runnebaum, Ingo; Rzepecka, Iwona K; Salvesen, Helga B; Schwaab, Ira; Severi, Gianluca; Shen, Hui; Shridhar, Vijayalakshmi; Shu, Xiao-Ou; Sieh, Weiva; Southey, Melissa C; Spellman, Paul; Tajima, Kazuo; Teo, Soo-Hwang; Terry, Kathryn L; Thompson, Pamela J; Timorek, Agnieszka; Tworoger, Shelley S; van Altena, Anne M; van den Berg, David; Vergote, Ignace; Vierkant, Robert A; Vitonis, Allison F; Wang-Gohrke, Shan; Wentzensen, Nicolas; Whittemore, Alice S; Wik, Elisabeth; Winterhoff, Boris; Woo, Yin Ling; Wu, Anna H; Yang, Hannah P; Zheng, Wei; Ziogas, Argyrios; Zulkifli, Famida; Goodman, Marc T; Hall, Per; Easton, Douglas F; Pearce, Celeste L; Berchuck, Andrew; Chenevix-Trench, Georgia; Iversen, Edwin; Monteiro, Alvaro N A; Gayther, Simon A; Schildkraut, Joellen M; Sellers, Thomas A

    2013-04-01

    Genome-wide association studies (GWAS) have identified four susceptibility loci for epithelial ovarian cancer (EOC), with another two suggestive loci reaching near genome-wide significance. We pooled data from a GWAS conducted in North America with another GWAS from the UK. We selected the top 24,551 SNPs for inclusion on the iCOGS custom genotyping array. We performed follow-up genotyping in 18,174 individuals with EOC (cases) and 26,134 controls from 43 studies from the Ovarian Cancer Association Consortium. We validated the two loci at 3q25 and 17q21 that were previously found to have associations close to genome-wide significance and identified three loci newly associated with risk: two loci associated with all EOC subtypes at 8q21 (rs11782652, P = 5.5 × 10⁻⁹) and 10p12 (rs1243180, P = 1.8 × 10⁻⁸) and another locus specific to the serous subtype at 17q12 (rs757210, P = 8.1 × 10⁻¹⁰). An integrated molecular analysis of genes and regulatory regions at these loci provided evidence for functional mechanisms underlying susceptibility and implicated CHMP4C in the pathogenesis of ovarian cancer. PMID:23535730
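
    To make the reported association statistics concrete, the sketch below runs the simplest form of the per-SNP case-control test: an allelic chi-square on a 2×2 table of allele counts, judged against the conventional genome-wide threshold of 5 × 10⁻⁸. The allele counts are invented for illustration (only the 18,174-case / 26,134-control totals come from the abstract); real consortium analyses use logistic regression with study and ancestry covariates.

# Allelic chi-square test for one SNP; the counts are hypothetical.
from scipy.stats import chi2_contingency

# rows: cases, controls; columns: risk allele, other allele
# 18,174 cases -> 36,348 alleles; 26,134 controls -> 52,268 alleles
table = [[10200, 26148],
         [13100, 39168]]
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.1f}, P = {p:.1e}")
print("genome-wide significant" if p < 5e-8 else "not genome-wide significant")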

  14. GWAS meta-analysis and replication identifies three new susceptibility loci for ovarian cancer

    Science.gov (United States)

    Pharoah, Paul D. P.; Tsai, Ya-Yu; Ramus, Susan J.; Phelan, Catherine M.; Goode, Ellen L.; Lawrenson, Kate; Price, Melissa; Fridley, Brooke L.; Tyrer, Jonathan P.; Shen, Howard; Weber, Rachel; Karevan, Rod; Larson, Melissa C.; Song, Honglin; Tessier, Daniel C.; Bacot, François; Vincent, Daniel; Cunningham, Julie M.; Dennis, Joe; Dicks, Ed; Aben, Katja K.; Anton-Culver, Hoda; Antonenkova, Natalia; Armasu, Sebastian M.; Baglietto, Laura; Bandera, Elisa V.; Beckmann, Matthias W.; Birrer, Michael J.; Bloom, Greg; Bogdanova, Natalia; Brenton, James D.; Brinton, Louise A.; Brooks-Wilson, Angela; Brown, Robert; Butzow, Ralf; Campbell, Ian; Carney, Michael E; Carvalho, Renato S.; Chang-Claude, Jenny; Chen, Y. Anne; Chen, Zhihua; Chow, Wong-Ho; Cicek, Mine S.; Coetzee, Gerhard; Cook, Linda S.; Cramer, Daniel W.; Cybulski, Cezary; Dansonka-Mieszkowska, Agnieszka; Despierre, Evelyn; Doherty, Jennifer A; Dörk, Thilo; du Bois, Andreas; Dürst, Matthias; Eccles, Diana; Edwards, Robert; Ekici, Arif B.; Fasching, Peter A.; Fenstermacher, David; Flanagan, James; Gao, Yu-Tang; Garcia-Closas, Montserrat; Gentry-Maharaj, Aleksandra; Giles, Graham; Gjyshi, Anxhela; Gore, Martin; Gronwald, Jacek; Guo, Qi; Halle, Mari K; Harter, Philipp; Hein, Alexander; Heitz, Florian; Hillemanns, Peter; Hoatlin, Maureen; Høgdall, Estrid; Høgdall, Claus K.; Hosono, Satoyo; Jakubowska, Anna; Jensen, Allan; Kalli, Kimberly R.; Karlan, Beth Y.; Kelemen, Linda E.; Kiemeney, Lambertus A.; Kjaer, Susanne Krüger; Konecny, Gottfried E.; Krakstad, Camilla; Kupryjanczyk, Jolanta; Lambrechts, Diether; Lambrechts, Sandrina; Le, Nhu D.; Lee, Nathan; Lee, Janet; Leminen, Arto; Lim, Boon Kiong; Lissowska, Jolanta; Lubiński, Jan; Lundvall, Lene; Lurie, Galina; Massuger, Leon F.A.G.; Matsuo, Keitaro; McGuire, Valerie; McLaughlin, John R; Menon, Usha; Modugno, Francesmary; Moysich, Kirsten B.; Nakanishi, Toru; Narod, Steven A.; Ness, Roberta B.; Nevanlinna, Heli; Nickels, Stefan; Noushmehr, Houtan; Odunsi, Kunle; Olson, Sara; Orlow, Irene; Paul, James; Pejovic, Tanja; Pelttari, Liisa M; Permuth-Wey, Jenny; Pike, Malcolm C; Poole, Elizabeth M; Qu, Xiaotao; Risch, Harvey A.; Rodriguez-Rodriguez, Lorna; Rossing, Mary Anne; Rudolph, Anja; Runnebaum, Ingo; Rzepecka, Iwona K; Salvesen, Helga B.; Schwaab, Ira; Severi, Gianluca; Shen, Hui; Shridhar, Vijayalakshmi; Shu, Xiao-Ou; Sieh, Weiva; Southey, Melissa C.; Spellman, Paul; Tajima, Kazuo; Teo, Soo-Hwang; Terry, Kathryn L.; Thompson, Pamela J; Timorek, Agnieszka; Tworoger, Shelley S.; van Altena, Anne M.; Berg, David Van Den; Vergote, Ignace; Vierkant, Robert A.; Vitonis, Allison F.; Wang-Gohrke, Shan; Wentzensen, Nicolas; Whittemore, Alice S.; Wik, Elisabeth; Winterhoff, Boris; Woo, Yin Ling; Wu, Anna H; Yang, Hannah P.; Zheng, Wei; Ziogas, Argyrios; Zulkifli, Famida; Goodman, Marc T.; Hall, Per; Easton, Douglas F; Pearce, Celeste L; Berchuck, Andrew; Chenevix-Trench, Georgia; Iversen, Edwin; Monteiro, Alvaro N.A.; Gayther, Simon A.; Schildkraut, Joellen M.; Sellers, Thomas A.

    2013-01-01

    Genome-wide association studies (GWAS) have identified four susceptibility loci for epithelial ovarian cancer (EOC), with another two loci being close to genome-wide significance. We pooled data from a GWAS conducted in North America with another GWAS from the United Kingdom. We selected the top 24,551 SNPs for inclusion on the iCOGS custom genotyping array. Follow-up genotyping was carried out in 18,174 cases and 26,134 controls from 43 studies from the Ovarian Cancer Association Consortium. We validated the two loci at 3q25 and 17q21 previously near genome-wide significance and identified three novel loci associated with risk: two loci associated with all EOC subtypes, at 8q21 (rs11782652, P = 5.5 × 10⁻⁹) and 10p12 (rs1243180, P = 1.8 × 10⁻⁸), and another locus specific to the serous subtype at 17q12 (rs757210, P = 8.1 × 10⁻¹⁰). An integrated molecular analysis of genes and regulatory regions at these loci provided evidence for functional mechanisms underlying susceptibility that implicate CHMP4C in the pathogenesis of ovarian cancer. PMID:23535730

  15. Perturbation-expression analysis identifies RUNX1 as a regulator of human mammary stem cell differentiation.

    Directory of Open Access Journals (Sweden)

    Ethan S Sokol

    2015-04-01

    Full Text Available The search for genes that regulate stem cell self-renewal and differentiation has been hindered by a paucity of markers that uniquely label stem cells and early progenitors. To circumvent this difficulty, we have developed a method that identifies cell-state regulators without requiring any markers of differentiation, termed Perturbation-Expression Analysis of Cell States (PEACS). We have applied this marker-free approach to screen for transcription factors that regulate mammary stem cell differentiation in a 3D model of tissue morphogenesis and identified RUNX1 as a stem cell regulator. Inhibition of RUNX1 expanded bipotent stem cells and blocked their differentiation into ductal and lobular tissue rudiments. Reactivation of RUNX1 allowed exit from the bipotent state and subsequent differentiation and mammary morphogenesis. Collectively, our findings show that RUNX1 is required for mammary stem cells to exit a bipotent state, and provide a new method for discovering cell-state regulators when markers are not available.

  16. Bridging the gap between sample collection and laboratory analysis: using dried blood spots to identify human exposure to chemical agents

    Science.gov (United States)

    Hamelin, Elizabeth I.; Blake, Thomas A.; Perez, Jonas W.; Crow, Brian S.; Shaner, Rebecca L.; Coleman, Rebecca M.; Johnson, Rudolph C.

    2016-05-01

    Public health response to large-scale chemical emergencies presents logistical challenges for sample collection, transport, and analysis. Diagnostic methods used to identify and determine exposure to chemical warfare agents, toxins, and poisons traditionally involve blood collection by phlebotomists, cold transport of biomedical samples, and costly sample preparation techniques. Dried blood spots, which consist of dried blood on an FDA-approved substrate, can increase analyte stability, decrease the infection hazard for those handling samples, and greatly reduce the cost of shipping and storing samples by removing the need for refrigeration and cold-chain transportation; they can also be self-prepared by potentially exposed individuals using a simple finger prick and blood-spot-compatible paper. Our laboratory has developed clinical assays to detect human exposures to nerve agents through the analysis of specific protein adducts and metabolites, for which a simple extraction from a dried blood spot is sufficient to remove matrix interferents and attain sensitivities on par with traditional sampling methods. The use of dried blood spots can thus bridge the gap between the laboratory and the field, allowing large-scale sample collection with minimal impact on hospital resources while maintaining the sensitivity, specificity, traceability, and quality required for both clinical and forensic applications.

  17. Sixth Australian conference on nuclear techniques of analysis: proceedings

    International Nuclear Information System (INIS)

    These proceedings contain the abstracts of 77 lectures. The topics focus on instrumentation, nuclear techniques and their applications in materials science, surfaces, archaeometry, art, and geological, environmental and biomedical studies. An outline of the Australian facilities available for research purposes is also provided. Separate abstracts were prepared for the individual papers in this volume.

  18. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay shows, through examples chosen from his poems, that his aestheticism is evident in his versification techniques. His poetic theory and practice set a lasting example for the development of English poetry.

  19. Analysis of ISO 26262 Compliant Techniques for the Automotive Domain

    NARCIS (Netherlands)

    Kannan, M. S.; Dajsuren, Y.; Luo, Y.; Barosan, I.

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of the standard in 2011, many different safety techniques complying with ISO 26262 have been developed. However, it is not clear which parts and (sub-)phases of the standard are targeted …

  20. Integrative Functional Genomics Analysis of Sustained Polyploidy Phenotypes in Breast Cancer Cells Identifies an Oncogenic Profile for GINS2

    Directory of Open Access Journals (Sweden)

    Juha K. Rantala

    2010-11-01

    Full Text Available Aneuploidy is among the most obvious differences between normal and cancer cells. However, the mechanisms contributing to the development and maintenance of aneuploid cell growth are diverse and incompletely understood. Functional genomics analyses have shown that aneuploidy in cancer cells correlates with diffuse gene expression signatures and that aneuploidy can arise through a variety of mechanisms, including cytokinesis failure, DNA endoreplication, and possibly polyploid intermediate states. To identify molecular processes contributing to the development of aneuploidy, we used a cell spot microarray technique to identify genes that induce polyploidy and/or allow maintenance of polyploid cell growth in breast cancer cells. Of 5,760 human genes screened, 177 were found to induce severe DNA content alterations upon prolonged transient silencing. Response to DNA damage stimulus and DNA repair were the most enriched cellular processes among the candidate genes. Functional validation highlighted GINS2 as the highest-ranking candidate: its inhibition induced polyploidy and accumulation of endogenous DNA damage and impaired cell proliferation. The cell growth inhibition and induction of polyploidy caused by suppression of GINS2 were verified in a panel of breast cancer cell lines. Bioinformatic analysis of published gene expression and DNA copy number studies of clinical breast tumors suggested that GINS2 is associated with the aggressive characteristics of a subgroup of breast cancers in vivo. In addition, nuclear GINS2 protein levels distinguished actively proliferating cancer cells, suggesting the potential use of GINS2 staining as a biomarker of cell proliferation and of GINS2 itself as a therapeutic target.
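
    The enrichment step described above (over-representation of DNA damage response among the 177 hits from 5,760 screened genes) is commonly computed with a hypergeometric test. In the sketch below, only the 5,760 and 177 figures come from the abstract; the category sizes are assumed for illustration.

# Hypergeometric enrichment test for one functional category; K and k are assumptions.
from scipy.stats import hypergeom

N = 5760   # genes screened (from the abstract)
n = 177    # hits with severe DNA content alterations (from the abstract)
K = 400    # genes annotated to the category, e.g. DNA damage response (assumed)
k = 35     # hits falling in that category (assumed)

# P(X >= k) when drawing n genes without replacement from N, of which K are in the category
p = hypergeom.sf(k - 1, N, K, n)
print(f"expected hits in category: {n * K / N:.1f}; observed: {k}; enrichment P = {p:.1e}")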