WorldWideScience

Sample records for analysis techniques identifies

  1. Using machine vision and data mining techniques to identify cell properties via microfluidic flow analysis

    Science.gov (United States)

    Horowitz, Geoffrey; Bowie, Samuel; Liu, Anna; Stone, Nicholas; Sulchek, Todd; Alexeev, Alexander

    2016-11-01

    To quickly identify the wide range of mechanical properties seen in cell populations, a coupled machine vision and data mining analysis is developed to examine high-speed videos of cells flowing through a microfluidic device. The microfluidic device contains a microchannel decorated with a periodic array of diagonal ridges. The ridges compress the flowing cells, producing complex cell trajectories and inducing cross-channel drift; both depend on the cells' intrinsic mechanical properties and can therefore be used to characterize specific cell lines. Thus, cell trajectory analysis can yield a parameter set that serves as a unique identifier of a cell's membership in a specific cell population. By using the correlations between cell populations and trajectories measured in the ridged microchannel, the mechanical properties of individual cells and their populations can be identified using only information captured through video analysis. Financial support provided by National Science Foundation (NSF) Grant No. CMMI 1538161.
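The classification step described above can be sketched as follows. The feature choices (net cross-channel drift and total path length), the toy trajectories, and the nearest-centroid rule are all hypothetical illustrations, not the authors' actual pipeline:

```python
import numpy as np

def trajectory_features(xy):
    """Summarise a cell trajectory (N x 2 array of along-channel/cross-channel
    positions) as a small feature vector: net cross-channel drift and total
    path length, both of which depend on cell stiffness in a ridged channel."""
    drift = xy[-1, 1] - xy[0, 1]                      # cross-channel displacement
    steps = np.diff(xy, axis=0)
    path_len = np.sum(np.linalg.norm(steps, axis=1))  # total distance travelled
    return np.array([drift, path_len])

def nearest_population(features, centroids):
    """Assign a cell to the population whose mean feature vector is closest."""
    dists = {name: np.linalg.norm(features - c) for name, c in centroids.items()}
    return min(dists, key=dists.get)

# Toy reference trajectories: soft cells drift farther across the channel.
soft = np.column_stack([np.linspace(0, 100, 50), np.linspace(0, 12, 50)])
stiff = np.column_stack([np.linspace(0, 100, 50), np.linspace(0, 3, 50)])
centroids = {"soft": trajectory_features(soft), "stiff": trajectory_features(stiff)}

# An unknown cell with large drift is assigned to the soft population.
unknown = np.column_stack([np.linspace(0, 100, 50), np.linspace(0, 11, 50)])
label = nearest_population(trajectory_features(unknown), centroids)
print(label)  # soft
```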

  2. Application of Principal Component Analysis to NIR Spectra of Phyllosilicates: A Technique for Identifying Phyllosilicates on Mars

    Science.gov (United States)

    Rampe, E. B.; Lanza, N. L.

    2012-01-01

    Orbital near-infrared (NIR) reflectance spectra of the martian surface from the OMEGA and CRISM instruments have identified a variety of phyllosilicates in Noachian terrains. The types of phyllosilicates present on Mars have important implications for the aqueous environments in which they formed, and, thus, for recognizing locales that may have been habitable. Current identifications of phyllosilicates from martian NIR data are based on the positions of spectral absorptions relative to laboratory data of well-characterized samples and on spectral ratios; however, some phyllosilicates can be difficult to distinguish from one another with these methods (e.g., illite vs. muscovite). Here we employ a multivariate statistical technique, principal component analysis (PCA), to differentiate between spectrally similar phyllosilicate minerals. PCA is commonly used in a variety of industries (pharmaceutical, agricultural, viticultural) to discriminate between samples. Previous work using PCA to analyze raw NIR reflectance data from mineral mixtures has shown that this is a viable technique for identifying mineral types, abundances, and particle sizes. Here, we evaluate PCA of second-derivative NIR reflectance data as a method for classifying phyllosilicates and test whether this method can be used to identify phyllosilicates on Mars.
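As an illustration of the approach, the following sketch applies PCA to second-derivative spectra of two synthetic, spectrally similar "minerals". The band positions, continuum model, and noise levels are invented for the example and are not the paper's data; the point is that differentiation removes continuum slope, so PCA then separates samples by band position alone:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(1.0, 2.5, 200)  # NIR range in microns (illustrative)

def spectrum(center, depth):
    """Toy reflectance spectrum: flat continuum with one Gaussian absorption band."""
    return 1.0 - depth * np.exp(-((wavelengths - center) / 0.02) ** 2)

# Two spectrally similar minerals: absorption bands only 0.01 um apart, plus
# random continuum slopes and noise that PCA on raw spectra could latch onto.
spectra, labels = [], []
for label, center in enumerate([2.20, 2.21]):
    for _ in range(10):
        slope = rng.normal(0.0, 0.05)
        noise = rng.normal(0.0, 0.001, wavelengths.size)
        spectra.append(spectrum(center, 0.3) + slope * (wavelengths - 1.75) + noise)
        labels.append(label)
X, labels = np.array(spectra), np.array(labels)

# The second derivative removes the continuum (constant and linear terms vanish).
X2 = np.diff(X, n=2, axis=1)

# PCA via SVD on mean-centred second-derivative spectra.
Xc = X2 - X2.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]  # scores on the first principal component

# The two minerals separate cleanly along PC1 despite the overlapping bands.
print(abs(pc1[labels == 0].mean() - pc1[labels == 1].mean()) >
      3 * (pc1[labels == 0].std() + pc1[labels == 1].std()))  # True
```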

  3. MALDI-TOF and SELDI-TOF analysis: “tandem” techniques to identify potential biomarker in fibromyalgia

    Directory of Open Access Journals (Sweden)

    A. Lucacchini

    2011-11-01

    Full Text Available Fibromyalgia (FM) is characterized by the presence of chronic widespread pain throughout the musculoskeletal system and diffuse tenderness. Unfortunately, no laboratory tests have been appropriately validated for FM and correlated with its subsets and activity. The aim of this study was to apply a proteomic technique to the saliva of FM patients: Surface Enhanced Laser Desorption/Ionization Time-of-Flight (SELDI-TOF). For this study, 57 FM patients and 35 healthy controls (HC) were enrolled. The proteomic analysis of saliva was carried out using SELDI-TOF. The analysis was performed using different chip arrays with different binding characteristics. The statistical analysis was performed using cluster analysis, and the difference between the two groups was assessed using Student's t-test. Spectra analysis highlighted the presence of several peaks differently expressed in FM patients compared with controls. The preliminary results obtained by SELDI-TOF analysis were compared with those obtained in our previous study performed on whole saliva of FM patients using electrophoresis. The m/z of two peaks, increased in FM patients, seem to overlap well with the molecular weights of calgranulin A and C and Rho GDP-dissociation inhibitor 2, which we had found up-regulated in our previous study. These preliminary results show the possibility of identifying potential salivary biomarkers through salivary proteomic analysis with MALDI-TOF and SELDI-TOF in FM patients. The peaks observed allow us to focus on particular pathogenic aspects of FM: the oxidative stress that characterizes this condition, the involvement of proteins related to cytoskeletal arrangements, and central sensitization.

  4. Application of gene network analysis techniques identifies AXIN1/PDIA2 and endoglin haplotypes associated with bicuspid aortic valve.

    Directory of Open Access Journals (Sweden)

    Eric C Wooten

    Full Text Available Bicuspid Aortic Valve (BAV) is a highly heritable congenital heart defect. The low frequency of BAV (1% of the general population) limits our ability to perform genome-wide association studies. We present the application of four a priori SNP selection techniques, reducing the multiple-testing penalty by restricting analysis to SNPs relevant to BAV in a genome-wide SNP dataset from a cohort of 68 BAV probands and 830 control subjects. Two knowledge-based approaches, CANDID and STRING, were used to systematically identify BAV genes, and their SNPs, from the published literature, microarray expression studies and a genome scan. We additionally tested Functionally Interpolating SNPs (fitSNPs) present on the array; the fourth approach consisted of SNPs selected by Random Forests, a machine learning approach. These approaches reduced the multiple-testing penalty by lowering the fraction of the genome probed to 0.19% of the total, while increasing the likelihood of studying SNPs within relevant BAV genes and pathways. Three loci were identified by CANDID, STRING, and fitSNPs. A haplotype within the AXIN1-PDIA2 locus (p-value 2.926×10⁻⁶) and a haplotype within the Endoglin gene (p-value 5.881×10⁻⁴) were found to be strongly associated with BAV. The Random Forests approach identified a SNP on chromosome 3 in association with BAV (p-value 5.061×10⁻⁶). The results presented here support an important role for genetic variants in BAV and provide support for additional studies in well-powered cohorts. Further, these studies demonstrate that leveraging existing expression and genomic data in the context of GWAS studies can identify biologically relevant genes and pathways associated with a congenital heart defect.
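The multiple-testing arithmetic behind this reduction can be made concrete. The array size below is an assumption for illustration (the abstract gives only the 0.19% fraction); the point is how much the Bonferroni significance threshold relaxes when analysis is restricted to a priori candidate SNPs:

```python
# Bonferroni arithmetic behind the multiple-testing reduction.
total_snps = 600_000                      # assumed array size, not stated in the abstract
fraction_kept = 0.0019                    # the 0.19% of the genome actually probed
kept = int(total_snps * fraction_kept)    # 1140 candidate SNPs

alpha = 0.05
print(alpha / total_snps)  # genome-wide Bonferroni threshold, ~8.3e-8
print(alpha / kept)        # candidate-set threshold, ~4.4e-5: far less stringent
```

Under these assumed numbers, a haplotype p-value near 5.9×10⁻⁴ that would never survive a genome-wide correction becomes worth reporting against the candidate-set threshold.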

  5. Mid infrared and fluorescence spectroscopies coupled with factorial discriminant analysis technique to identify sheep milk from different feeding systems

    OpenAIRE

    Karoui, Romdhane; Hammami, Moncef; Rouissi, Hamadi; Blecker, Christophe

    2011-01-01

    Mid-infrared (MIR) spectroscopy combined with multivariate data analysis was used to discriminate between ewes' milk samples according to their feeding systems (controls, ewes fed scotch bean, and ewes fed soybean). The MIR spectra were scanned throughout the first 11 weeks of the lactation stage. When factorial discriminant analysis (FDA) with leave-one-out cross-validation was applied, separately, to the three spectral regions in the MIR (i.e. 3000-2800, 1700-1500 and 1500-900 cm⁻¹), the cl...

  6. Proof-of-principle results for identifying the composition of dust particles and volcanic ash samples through the technique of photon activation analysis at the IAC

    Science.gov (United States)

    Mamtimin, Mayir; Cole, Philip L.; Segebade, Christian

    2013-04-01

    Instrumental analytical methods are preferable in studying sub-milligram quantities of airborne particulates collected in dust filters. The multi-step analytical procedure used in treating samples through chemical separation can be quite complicated. Further, due to the minute masses of the airborne particulates collected on filters, such chemical treatment can easily lead to significant levels of contamination. Radio-analytical techniques, and in particular, activation analysis methods offer a far cleaner alternative. Activation methods require minimal sample preparation and provide sufficient sensitivity for detecting the vast majority of the elements throughout the periodic table. In this paper, we will give a general overview of the technique of photon activation analysis. We will show that by activating dust particles with 10- to 30-MeV bremsstrahlung photons, we can ascertain their elemental composition. The samples are embedded in dust-collection filters and are irradiated "as is" by these photons. The radioactivity of the photonuclear reaction products is measured with appropriate spectrometers and the respective analytes are quantified using multi-component calibration materials. We shall provide specific examples of identifying the elemental components of airborne dust particles and volcanic ash by making use of bremsstrahlung photons from an electron linear accelerator at the Idaho Accelerator Center in Pocatello, Idaho.

  7. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building-structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: physical condition of the building system (PC), effect on asset (EA), effect on occupants (EO), and maintenance cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed on all building systems against each criterion. The defect-severity score of each building system was identified and then multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; however, the results of this study indicate that it could also be used in prioritizing building systems for maintenance planning.
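The weighted-sum step of this kind of multi-criteria ranking can be sketched as follows. The criteria weights and severity scores below are hypothetical (the paper derives weights with Expert Choice / pairwise comparison, and only three of the nine systems are shown here):

```python
# Hypothetical criteria weights (in the paper these come from Expert Choice).
criteria_weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}

# Hypothetical normalised defect-severity scores per building system and criterion.
severity = {
    "electrical": {"PC": 0.15, "EA": 0.14, "EO": 0.13, "MC": 0.11},
    "ceiling":    {"PC": 0.06, "EA": 0.07, "EO": 0.07, "MC": 0.06},
    "roofing":    {"PC": 0.10, "EA": 0.09, "EO": 0.11, "MC": 0.12},
}

# Composite risk value = sum over criteria of (weight x severity score).
risk = {system: sum(criteria_weights[c] * s for c, s in scores.items())
        for system, scores in severity.items()}
ranking = sorted(risk, key=risk.get, reverse=True)
print(ranking[0])  # electrical
```

With these invented scores the electrical system comes out most critical and the ceiling least, mirroring the ordering reported in the abstract.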

  8. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of studying the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of variables specific to the organization, and the external environment, made up of variables outside the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which it influences to a great extent and totally controls. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, performance analysis, value chain analysis and functional analysis. Implications Basically such

  9. An evaluation of object-oriented image analysis techniques to identify motorized vehicle effects in semi-arid to arid ecosystems of the American West

    Science.gov (United States)

    Mladinich, C.

    2010-01-01

    Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification rather than pixel-based techniques have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid. Copyright © 2010 by Bellwether Publishing, Ltd. All rights reserved.

  10. New technique for identifying varieties resistance to rice blast

    Institute of Scientific and Technical Information of China (English)

    ZHUPeiliang

    1994-01-01

    After 8 years of laboratory experiments and field tests, an advanced technique for identifying rice varieties' resistance to rice blast was developed by a research group at the Plant Protection Institute, Zhejiang Academy of Agricultural Sciences. With this technique, the inoculum is prepared on a maize-rice-straw-agar medium suitable for sporulation of most rice blast pathogen isolates.

  11. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis, namely the Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  12. Identifying fly puparia by clearing technique: application to forensic entomology.

    Science.gov (United States)

    Sukontason, Kabkaew L; Ngern-Klun, Radchadawan; Sripakdee, Duanghatai; Sukontason, Kom

    2007-10-01

    In forensic investigations, immature stages of the fly (egg, larva, or puparium) can be used as entomological evidence at death scenes, not only to estimate the postmortem interval (PMI), analyze toxic substances, and determine the manner of death, but also to indicate the movement of a corpse in homicide cases. Of these immature stages, puparia represent the longest developmental time, which makes them especially useful. However, in order for forensic entomologists to use puparia effectively, it is crucial that they are able to accurately identify the species of fly found on a corpse. Typically, puparia are similar in general appearance, being coarctate and light brown to dark brown in color, which makes identification difficult. In this study, we report on a clearing technique used to pale the integument of fly puparia, thereby allowing observation of the anterior end (second to fourth segments) and the profile of the posterior spiracle, which are important clues for identification. We used puparia of the blowfly Chrysomya megacephala (F.) as the model species in this experiment. With daily placement in a 20% potassium hydroxide solution and mounting in a clearing medium (Permount®, New Jersey), the profile of the posterior spiracle could be clearly examined under a light microscope beginning on the fifth day after pupation, and the number of papillae in the anterior spiracle could be counted easily starting from the ninth day. Comparison of morphological features of C. megacephala puparia with those of other blowflies (Chrysomya nigripes [Aubertin], Chrysomya rufifacies [Macquart], Chrysomya villeneuvi [Patton], Lucilia cuprina [Wiedemann], and Hemipyrellia ligurriens [Wiedemann]) and a housefly (Musca domestica L.) revealed that the anterior ends and the profiles of the posterior spiracles had markedly distinguishing characteristics. Morphometric analysis of the length and width of puparia, along with the length of the gaps between the posterior spiracles

  13. Image Techniques for Identifying Sea-Ice Parameters

    Directory of Open Access Journals (Sweden)

    Qin Zhang

    2014-10-01

    Full Text Available The estimation of ice forces is critical to Dynamic Positioning (DP) operations in Arctic waters, and ice conditions are important for the analysis of ice-structure interaction in an ice field. To monitor sea-ice conditions, cameras are used as field observation sensors on mobile sensor platforms in the Arctic. Various image processing techniques, such as Otsu thresholding, k-means clustering, the distance transform, the Gradient Vector Flow (GVF) Snake, and mathematical morphology, are then applied to obtain ice concentration, ice types, and floe size distribution from sea-ice images, to ensure safe operations of structures in ice-covered regions. These techniques yield acceptable results, and their effectiveness is demonstrated in case studies.
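A minimal sketch of the first listed technique, Otsu thresholding, applied to a synthetic sea-ice image to estimate ice concentration. The scene, grey levels, and noise are invented for the example; real sea-ice imagery would need the further steps (morphology, GVF Snake) the abstract mentions:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold maximising between-class variance."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()          # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0    # class means
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2           # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Synthetic scene: dark open water (~40) with one bright 40x40 ice floe (~220).
rng = np.random.default_rng(1)
img = rng.normal(40, 5, (100, 100))
img[30:70, 30:70] = rng.normal(220, 5, (40, 40))
img = np.clip(img, 0, 255)

t = otsu_threshold(img)
ice_concentration = (img >= t).mean()  # fraction of pixels classified as ice
print(round(ice_concentration, 2))     # 0.16 (the 40x40 floe in a 100x100 scene)
```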

  14. Review on Identify Kin Relationship Technique in Image

    Directory of Open Access Journals (Sweden)

    Deepak M Ahire

    2015-06-01

    Full Text Available In this paper, kin relationships are traditionally defined as ties based on blood. Kinship includes lineal generational bonds (children, parents, grandparents, and great-grandparents), collateral bonds (siblings, cousins, nieces and nephews, and aunts and uncles), and ties with in-laws. An often-made distinction is that between primary kin (members of the families of origin and procreation) and secondary kin (other family members). The former are referred to as “immediate family,” and the latter are generally labelled “extended family.” Marriage, as a principle of kinship, differs from blood in that it can be terminated. Here we propose a technique to identify kin relationships (a kinship model) using face recognition, splitting the face into subsets such as the forehead, eyes, nose, mouth, and cheek areas, characterized through Gabor features on an available real-time database. Given the potential for marital break-up, blood is recognized as the more important principle of kinship.

  15. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  16. Technique for identifying, tracing, or tracking objects in image data

    Science.gov (United States)

    Anderson, Robert J.; Rothganger, Fredrick

    2012-08-28

    A technique for computer vision uses a polygon contour to trace an object. The technique includes rendering a polygon contour superimposed over a first frame of image data. The polygon contour is iteratively refined to more accurately trace the object within the first frame after each iteration. The refinement includes computing image energies along lengths of contour lines of the polygon contour and adjusting positions of the contour lines based at least in part on the image energies.
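A toy illustration of the image-energy idea, assuming a squared-gradient energy and a single vertical contour line (both drastic simplifications of the full polygon-contour technique described above):

```python
import numpy as np

# Toy image: a vertical intensity edge at column 60.
img = np.zeros((100, 100))
img[:, 60:] = 1.0

# Image energy = squared gradient magnitude, so edges attract the contour.
gy, gx = np.gradient(img)
energy = gx ** 2 + gy ** 2

def refine_vertical_line(col, search=5):
    """One refinement iteration: move a vertical contour line to the nearby
    column with the highest mean image energy along its length."""
    cols = range(col - search, col + search + 1)
    return max(cols, key=lambda c: energy[:, c].mean())

print(refine_vertical_line(57))  # 59: the line snaps onto the discrete edge
```

With central differences the step edge produces equal gradient energy in columns 59 and 60, so the line settles on the first of the two; iterating from either side converges to the same place.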

  17. Identifying subgroups of patients using latent class analysis

    DEFF Research Database (Denmark)

    Nielsen, Anne Mølgaard; Kent, Peter; Hestbæk, Lise

    2017-01-01

    BACKGROUND: Heterogeneity in patients with low back pain (LBP) is well recognised and different approaches to subgrouping have been proposed. Latent Class Analysis (LCA) is a statistical technique that is increasingly being used to identify subgroups based on patient characteristics. However, as ...

  18. Identifying clinical course patterns in SMS data using cluster analysis

    DEFF Research Database (Denmark)

    Kent, Peter; Kongsted, Alice

    2012-01-01

    BACKGROUND: Recently, there has been interest in using the short message service (SMS, or text messaging) to gather frequent information on the clinical course of individual patients. One possible role for identifying clinical course patterns is to assist in exploring clinically important subgroups … by spline analysis. However, cluster analysis of SMS data in its original untransformed form may be simpler and offer other advantages. Therefore, the aim of this study was to determine whether cluster analysis could be used for identifying clinical course patterns distinct from the pattern of the whole … of cluster analysis. More research is needed, especially head-to-head studies, to identify which technique is best to use under what circumstances.
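A sketch of what cluster analysis of raw, untransformed SMS trajectories might look like. The weekly pain courses, group sizes, and two-cluster structure below are invented for illustration; the study's actual data and clustering method are not reproduced here:

```python
import numpy as np

def kmeans(X, init, iters=20):
    """Plain k-means on raw (untransformed) weekly score trajectories."""
    centers = X[list(init)].copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(len(init))])
    return labels

rng = np.random.default_rng(4)
weeks = np.arange(10)
# Two simulated clinical courses reported weekly by SMS (0-10 pain scale):
recovering = 8 - 0.8 * weeks + rng.normal(0, 0.3, (15, 10))  # steady improvement
persistent = 7 + 0 * weeks + rng.normal(0, 0.3, (15, 10))    # stable high pain
X = np.vstack([recovering, persistent])

labels = kmeans(X, init=(0, len(X) - 1))  # seed one centre from each course
print(set(labels[:15].tolist()), set(labels[15:].tolist()))  # {0} {1}
```

Each simulated patient is assigned to the cluster matching their underlying course, i.e. the raw trajectories alone carry enough signal for the clusters to recover the two clinical patterns.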

  19. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand-technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. [Figure 2: Transfer of a solid with a spatula.] Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  20. Triangulation of Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Lauri, M

    2011-10-01

    Full Text Available In psychology, as in other disciplines, the concepts of validity and reliability are considered essential to an accurate interpretation of results. While in quantitative research the idea is well established, in qualitative research validity and reliability take on a different dimension. Researchers like Miles and Huberman (1994) and Silverman (2000, 2001) have shown how these issues are addressed in qualitative research. In this paper I propose that the same corpus of data, in this case the transcripts of focus group discussions, can be analysed using more than one data analysis technique. I refer to this idea as ‘triangulation of data analysis techniques’ and argue that such triangulation increases the reliability of the results. If the results obtained through a particular data analysis technique, for example thematic analysis, are congruent with the results obtained by analysing the same transcripts using a different technique, for example correspondence analysis, it is reasonable to argue that the analysis and interpretation of the data is valid.

  1. Identifying irradiated flours by photo-stimulated luminescence technique

    Science.gov (United States)

    Ramli, Ros Anita Ahmad; Yasir, Muhamad Samudi; Othman, Zainon; Abdullah, Wan Saffiey Wan

    2014-02-01

    Photo-stimulated luminescence (PSL) was used in this study to detect gamma irradiation treatment of five types of flour (corn, rice, tapioca, wheat and glutinous rice) at four doses (0, 0.2, 0.5 and 1 kGy). The signal level was compared with two threshold values (700 and 5000 counts/60 s). With the exception of glutinous rice, all irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All control samples produced negative results, with signals below the lower threshold (700 counts/60 s), suggesting that the samples had not been irradiated. Irradiated glutinous rice samples produced intermediate signals (700-5000 counts/60 s), which were subsequently confirmed using calibrated PSL. The PSL signals remained stable after 90 days of storage. The findings of this study will be useful in facilitating control of food irradiation applications in Malaysia.
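The screening decision implied by the two thresholds can be written down directly. The function name and labels are illustrative; the 700 and 5000 counts/60 s thresholds are those reported in the abstract:

```python
def psl_screen(counts_per_60s, lower=700, upper=5000):
    """PSL screening decision from the photon count in a 60 s measurement,
    using the lower/upper thresholds reported in the study."""
    if counts_per_60s < lower:
        return "negative (not irradiated)"
    if counts_per_60s > upper:
        return "positive (irradiated)"
    return "intermediate (confirm with calibrated PSL)"

# Illustrative signal levels consistent with the abstract's findings:
print(psl_screen(350))    # control flour -> negative
print(psl_screen(12000))  # irradiated wheat flour -> positive
print(psl_screen(2400))   # irradiated glutinous rice -> intermediate
```

This mirrors the study's observation that glutinous rice falls in the intermediate band and therefore needs the calibrated-PSL confirmation step.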

  2. Identifying irradiated flours by photo-stimulated luminescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Ramli, Ros Anita Ahmad; Yasir, Muhamad Samudi [Faculty of Science and Technology, National University of Malaysia, Bangi, 43000 Kajang, Selangor (Malaysia); Othman, Zainon; Abdullah, Wan Saffiey Wan [Malaysian Nuclear Agency, Bangi 43000 Kajang, Selangor (Malaysia)

    2014-02-12

    Photo-stimulated luminescence (PSL) was used in this study to detect gamma irradiation treatment of five types of flour (corn, rice, tapioca, wheat and glutinous rice) at four doses (0, 0.2, 0.5 and 1 kGy). The signal level was compared with two threshold values (700 and 5000 counts/60 s). With the exception of glutinous rice, all irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All control samples produced negative results, with signals below the lower threshold (700 counts/60 s), suggesting that the samples had not been irradiated. Irradiated glutinous rice samples produced intermediate signals (700-5000 counts/60 s), which were subsequently confirmed using calibrated PSL. The PSL signals remained stable after 90 days of storage. The findings of this study will be useful in facilitating control of food irradiation applications in Malaysia.

  3. Identifying indicators through modified Delphi technique in polytechnics system

    Science.gov (United States)

    Nashir, Irdayanti Mat; Mustapha, Ramlee; Yusoff, Abdullah

    2015-02-01

    This study examines how a panel of experts was selected to assess indicators of innovative instructional leadership (IIL) among polytechnic administrators, based on 222 items obtained from previous studies. Eleven experts were selected, with selection criteria based on their backgrounds in leadership. The experts were interviewed separately; each interview lasted half an hour and was conducted in the expert's office. The data obtained were analyzed using ATLAS.ti. Overall, the findings indicate that the experts agreed that 188 items and 14 indicators should be retained in the innovative instructional leadership instrument, refined using the modified Delphi technique. The instrument will then be analyzed to obtain lecturers' perceptions of the innovative instructional leadership of administrators at their respective polytechnics.

  4. Factor analysis identifies subgroups of constipation

    Institute of Scientific and Technical Information of China (English)

    Philip G Dinning; Mike Jones; Linda Hunt; Sergio E Fuentealba; Jamshid Kalanter; Denis W King; David Z Lubowski; Nicholas J Talley; Ian J Cook

    2011-01-01

    AIM: To determine whether distinct symptom groupings exist in a constipated population and whether such groupings might correlate with quantifiable pathophysiological measures of colonic dysfunction. METHODS: One hundred and ninety-one patients presenting to a gastroenterology clinic with constipation and 32 constipated patients responding to a newspaper advertisement completed a 53-item, wide-ranging self-report questionnaire. One hundred of these patients had colonic transit measured scintigraphically. Factor analysis determined whether constipation-related symptoms grouped into distinct aspects of symptomatology. Cluster analysis was used to determine whether individual patients naturally group into distinct subtypes. RESULTS: Cluster analysis yielded a four-cluster solution, with the presence or absence of pain and laxative unresponsiveness providing the main descriptors. Amongst all clusters there was a considerable proportion of patients with demonstrably delayed colonic transit, positive irritable bowel syndrome criteria and regular stool frequency. The majority of patients with these characteristics also reported regular laxative use. CONCLUSION: Factor analysis identified four constipation subgroups, based on severity and laxative unresponsiveness, in a constipated population. However, clear stratification into clinically identifiable groups remains imprecise.

  5. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    First and second order Rayleigh and Raman scatter is a common problem when fitting Parallel Factor Analysis (PARAFAC) models to fluorescence excitation-emission data (EEM). The scatter does not contain any relevant chemical information and does not conform to the low-rank trilinear model. The scatter […] is developed based on robust statistical methods. The method does not demand any visual inspection of the data prior to modeling, and can handle first and second order Rayleigh scatter as well as Raman scatter in various types of EEM data. The results of the automated scatter identification method were used […] as input data for three different PARAFAC methods. Firstly, inserting missing values in the scatter regions is tested; secondly, an interpolation of the scatter regions is performed; and finally, the scatter regions are down-weighted. These results show that the PARAFAC method to choose after scatter […]

  6. Study of techniques of identifying the earthquake precursory anomalies in terms of mathematical modeling

    Institute of Scientific and Technical Information of China (English)

    YAN Zun-guo; QIAN Jia-dong; CHEN Jun-hua; LI Sheng-le

    2000-01-01

    This paper deals mainly with the key technique of identifying, without distortion, the anomalous signals that might be precursors associated with earthquakes, from real time series of observations that are usually a mixture of anomalous signals, normal background variations, interference and noise. The key technique of "un-biased estimation" is to construct an empirical time series and set up a criterion for identifying the anomalous variation on the basis of time series analysis. To test the method, a man-made time series, including normal variations and random interference as well as a specific anomaly, was constructed, and the test of picking up the anomaly was conducted as an intuitive and effective way of identifying an anomalous signal within a complicated time series. Test results confirm that the techniques under discussion are effective and applicable, and that the signals extracted by the analysis are clear and precise, closely matching the known simulated anomalous signals in the experiments.
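
    The paper's un-biased estimation procedure is not reproduced here, but the general idea, flagging deviations from an empirical background series under a robust criterion, can be sketched as follows (the rolling-median background, window length, and 3-sigma threshold are illustrative stand-ins, not the paper's algorithm):

```python
import numpy as np

def flag_anomalies(series, window=11, n_sigma=3.0):
    """Flag samples deviating from a rolling-median background beyond a robust criterion."""
    x = np.asarray(series, dtype=float)
    pad = window // 2
    padded = np.pad(x, pad, mode="edge")
    # empirical background series: rolling median of the observations
    background = np.array([np.median(padded[i:i + window]) for i in range(x.size)])
    resid = x - background
    # robust scale estimate: 1.4826 converts MAD to sigma for Gaussian noise
    mad = np.median(np.abs(resid - np.median(resid)))
    return np.abs(resid) > n_sigma * 1.4826 * mad

# Man-made test series: normal variation + random interference + one injected anomaly.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 6.0 * np.pi, 600)
x = np.sin(t) + rng.normal(0.0, 0.05, 600)
x[200] += 2.0
flags = flag_anomalies(x)
```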

  7. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM currently is a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques, which come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  8. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  9. A semi-automated single day image differencing technique to identify animals in aerial imagery.

    Directory of Open Access Journals (Sweden)

    Pat Terletzky

    Full Text Available Our research presents a proof-of-concept that explores a new and innovative method to identify large animals in aerial imagery with single day image differencing. We acquired two aerial images of eight fenced pastures and conducted a principal component analysis of each image. We then subtracted the first principal component of the two pasture images followed by heuristic thresholding to generate polygons. The number of polygons represented the number of potential cattle (Bos taurus) and horses (Equus caballus) in the pasture. The process was considered semi-automated because we were not able to automate the identification of spatial or spectral thresholding values. Imagery was acquired concurrently with ground counts of animal numbers. Across the eight pastures, 82% of the animals were correctly identified, mean percent commission was 53%, and mean percent omission was 18%. The high commission error was due to small mis-alignments generated from image-to-image registration, misidentified shadows, and grouping behavior of animals. The high probability of correctly identifying animals suggests short time interval image differencing could provide a new technique to enumerate wild ungulates occupying grassland ecosystems, especially in isolated or difficult to access areas. To our knowledge, this was the first attempt to use standard change detection techniques to identify and enumerate large ungulates.

  10. A semi-automated single day image differencing technique to identify animals in aerial imagery.

    Science.gov (United States)

    Terletzky, Pat; Ramsey, Robert Douglas

    2014-01-01

    Our research presents a proof-of-concept that explores a new and innovative method to identify large animals in aerial imagery with single day image differencing. We acquired two aerial images of eight fenced pastures and conducted a principal component analysis of each image. We then subtracted the first principal component of the two pasture images followed by heuristic thresholding to generate polygons. The number of polygons represented the number of potential cattle (Bos taurus) and horses (Equus caballus) in the pasture. The process was considered semi-automated because we were not able to automate the identification of spatial or spectral thresholding values. Imagery was acquired concurrently with ground counts of animal numbers. Across the eight pastures, 82% of the animals were correctly identified, mean percent commission was 53%, and mean percent omission was 18%. The high commission error was due to small mis-alignments generated from image-to-image registration, misidentified shadows, and grouping behavior of animals. The high probability of correctly identifying animals suggests short time interval image differencing could provide a new technique to enumerate wild ungulates occupying grassland ecosystems, especially in isolated or difficult to access areas. To our knowledge, this was the first attempt to use standard change detection techniques to identify and enumerate large ungulates.
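
    The pipeline described above (PCA per image, differencing of first principal components, thresholding, counting connected regions) can be sketched with NumPy/SciPy on synthetic two-band imagery; the array shapes, blob values, and threshold are illustrative assumptions, not the study's values:

```python
import numpy as np
from scipy import ndimage

def first_pc(image):
    """First principal component of a (rows, cols, bands) image, as a 2-D array."""
    flat = image.reshape(-1, image.shape[-1]).astype(float)
    flat = flat - flat.mean(axis=0)
    cov = np.cov(flat, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)          # eigh sorts eigenvalues ascending
    return (flat @ vecs[:, -1]).reshape(image.shape[:2])

def count_animals(img_t0, img_t1, threshold):
    """Difference the first PCs of the two dates, threshold, count connected regions."""
    diff = np.abs(first_pc(img_t1) - first_pc(img_t0))
    _, n_regions = ndimage.label(diff > threshold)
    return n_regions

# Synthetic pasture: two "animals" appear between the two acquisition dates.
rng = np.random.default_rng(0)
t0 = rng.normal(0.0, 0.01, (40, 40, 2))
t1 = t0.copy()
t1[5:9, 5:9, :] += 5.0
t1[20:24, 30:34, :] += 5.0
```

    The heuristic threshold is exactly the step the authors could not automate; here it is hand-picked for the synthetic data.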

  11. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and the interpersonal style

  12. Lidar point density analysis: implications for identifying water bodies

    Science.gov (United States)

    Worstell, Bruce B.; Poppenga, Sandra; Evans, Gayla A.; Prince, Sandra

    2014-01-01

    Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns on water bodies in the resulting void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated a 5-meter radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions to flatten water features and to force channel flow in the downstream direction also are presented.
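
    The moving-window density test can be sketched as follows: rasterize lidar return counts, sum them over a circular window, and flag cells that fall below the return threshold. The 5-meter radius and 23-return cutoff follow the abstract; the 1-meter grid cell and the synthetic point cloud are assumptions:

```python
import numpy as np
from scipy import ndimage

def void_mask(points, extent, cell=1.0, radius=5.0, min_returns=23):
    """Flag raster cells whose circular neighbourhood holds too few lidar returns."""
    xmin, xmax, ymin, ymax = extent
    nx = int(np.ceil((xmax - xmin) / cell))
    ny = int(np.ceil((ymax - ymin) / cell))
    ix = np.clip(((points[:, 0] - xmin) / cell).astype(int), 0, nx - 1)
    iy = np.clip(((points[:, 1] - ymin) / cell).astype(int), 0, ny - 1)
    counts = np.zeros((ny, nx))
    np.add.at(counts, (iy, ix), 1)            # returns per grid cell
    r = int(radius / cell)
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    disk = (xx**2 + yy**2 <= r**2).astype(float)   # circular moving window
    neighbourhood = ndimage.convolve(counts, disk, mode="constant")
    return neighbourhood < min_returns

# Synthetic cloud: one return per square metre, except a 10 m x 10 m "water" gap.
xs, ys = np.meshgrid(np.arange(0.5, 50.0, 1.0), np.arange(0.5, 50.0, 1.0))
pts = np.column_stack([xs.ravel(), ys.ravel()])
gap = (pts[:, 0] > 20) & (pts[:, 0] < 30) & (pts[:, 1] > 20) & (pts[:, 1] < 30)
mask = void_mask(pts[~gap], (0.0, 50.0, 0.0, 50.0))
```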

  13. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
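
    A minimal sketch of the dependency-tree idea, assuming scikit-learn and synthetic metrics (the metric names and the KPI relationship are invented for illustration): a regression tree is fitted to a KPI over lower-level process and QoS metrics, and its feature importances indicate the influential factors that a business analyst could drill into.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 500
response_time = rng.uniform(0.1, 2.0, n)   # QoS metric (seconds), hypothetical
queue_length = rng.uniform(0.0, 50.0, n)   # process metric, hypothetical
cpu_load = rng.uniform(0.0, 1.0, n)        # infrastructure metric, hypothetical
X = np.column_stack([response_time, queue_length, cpu_load])

# KPI (e.g. process duration) driven almost entirely by one factor, plus noise.
kpi = 3.0 * response_time + rng.normal(0.0, 0.05, n)

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, kpi)
importance = tree.feature_importances_     # dependency strength per metric
```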

  14. Identifying glioblastoma gene networks based on hypergeometric test analysis.

    Directory of Open Access Journals (Sweden)

    Vasileios Stathias

    Full Text Available Patient-specific therapy is emerging as an important possibility for many cancer patients. However, to identify such therapies it is essential to determine the genomic and transcriptional alterations present in one tumor relative to control samples. This presents a challenge since use of a single sample precludes many standard statistical analysis techniques. We reasoned that one means of addressing this issue is by comparing transcriptional changes in one tumor with those observed in a large cohort of patients analyzed by The Cancer Genome Atlas (TCGA). To test this directly, we devised a bioinformatics pipeline to identify differentially expressed genes in tumors resected from patients suffering from the most common malignant adult brain tumor, glioblastoma (GBM). We performed RNA sequencing on tumors from individual GBM patients and filtered the results through the TCGA database in order to identify possible gene networks that are overrepresented in GBM samples relative to controls. Importantly, we demonstrate that hypergeometric-based analysis of gene pairs identifies gene networks that validate experimentally. These studies identify a putative workflow for uncovering differentially expressed patient-specific genes and gene networks for GBM and other cancers.
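
    The hypergeometric test at the core of such a pipeline can be sketched with SciPy: for a gene pair, ask how improbable the observed co-alteration overlap is if the two genes were altered independently across the cohort. The cohort and gene counts below are invented for illustration:

```python
from scipy.stats import hypergeom

def pair_enrichment_pvalue(cohort_size, n_altered_a, n_altered_b, n_overlap):
    """P(overlap >= n_overlap) under independent alteration of the two genes.

    hypergeom.sf(k - 1, M, n, N) gives P(X >= k) for population size M,
    n marked items, and N draws.
    """
    return float(hypergeom.sf(n_overlap - 1, cohort_size, n_altered_a, n_altered_b))

# Hypothetical cohort of 500 tumors; gene A altered in 50, gene B in 60.
# Independence would predict an overlap of about 50 * 60 / 500 = 6 tumors.
p_strong = pair_enrichment_pvalue(500, 50, 60, 15)   # far above expectation
p_null = pair_enrichment_pvalue(500, 50, 60, 0)      # no overlap at all
```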

  15. Prefractionation techniques in proteome analysis.

    Science.gov (United States)

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, both in the field of chromatography and in that of electrophoresis. In the first case, Fountoulakis' group has reported just about every chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies which are performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device, and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  16. Análise comparativa de fragmentos identificáveis de forrageiras, pela técnica micro-histológica Comparative analysis of identifiable fragments of forages, by the microhistological technique

    Directory of Open Access Journals (Sweden)

    Maristela de Oliveira Bauer

    2005-12-01

    Full Text Available The objective of this study was to use the microhistological technique to verify differences among forage species in the percentage of identifiable fragments, as affected by the digestive process and the season of the year. Fresh, recently expanded leaf laminae, from the last and penultimate positions on the tiller, of Melinis minutiflora Pal. de Beauv (molassesgrass), Hyparrhenia rufa (Nees) Stapf. (jaraguagrass), Brachiaria decumbens Stapf. (signalgrass) and Imperata brasiliensis Trin. (sapegrass), and foliar laminae of Medicago sativa L. (alfalfa) and Schinus terebenthifolius Raddi (aroeira), sampled in the rainy and dry seasons, were digested in vitro and prepared according to the microhistological technique. The species showed marked differences in the percentage of identifiable fragments, and digestion altered these percentages by around 10%; the sampling period did not influence the percentage of identifiable fragments for most species; the presence of pigments and the adhesion of the epidermis to the cells of the internal leaf tissues hindered fragment identification; and digestion improved the visualization of fragments of sapegrass, jaraguagrass and aroeira, but impaired that of signalgrass and, especially, alfalfa.

  17. Identifiable Data Files - Medicare Provider Analysis and ...

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Provider Analysis and Review (MEDPAR) File contains data from claims for services provided to beneficiaries admitted to Medicare certified inpatient...

  18. Identifying Proper Names Based on Association Analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The issue of proper name recognition in Chinese text is discussed. An automatic approach based on association analysis to extract rules from a corpus is presented. The method tries to discover rules relevant to external evidence by association analysis, without additional manual effort. These rules can be used to recognize proper nouns in Chinese texts. The experimental results show that our method is practical in some applications. Moreover, the method is language-independent.

  19. Complex network based techniques to identify extreme events and (sudden) transitions in spatio-temporal systems

    CERN Document Server

    Marwan, Norbert

    2015-01-01

    We present here two promising techniques for the application of the complex network approach to continuous spatio-temporal systems that have been developed in the last decade and show large potential for future application and development of complex systems analysis. First, we discuss the transformation of a time series from such systems into a complex network. The natural approach is to calculate the recurrence matrix and interpret it as the adjacency matrix of an associated complex network, called a recurrence network. Using complex network measures, such as the transitivity coefficient, we demonstrate that this approach is very efficient for identifying qualitative transitions in observational data, e.g., when analyzing paleoclimate regime transitions. Second, we demonstrate the use of directed spatial networks constructed from spatio-temporal measurements of such systems, derived from the synchronized-in-time occurrence of extreme events in different spatial regions. Although there are many possibiliti...
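
    The first technique, reading a recurrence matrix as a network adjacency matrix and computing its transitivity coefficient, can be sketched in NumPy; the threshold eps and the test series are illustrative:

```python
import numpy as np

def recurrence_transitivity(x, eps):
    """Transitivity coefficient of the recurrence network of a scalar series."""
    x = np.asarray(x, dtype=float)
    # recurrence matrix: two states are linked if they are closer than eps
    A = (np.abs(x[:, None] - x[None, :]) <= eps).astype(float)
    np.fill_diagonal(A, 0.0)                  # drop self-recurrences (no self-loops)
    A2 = A @ A
    closed_triplets = np.trace(A2 @ A)        # 6 x number of triangles
    all_triplets = A2.sum() - np.trace(A2)    # ordered paths of length two
    return closed_triplets / all_triplets if all_triplets else 0.0

# A fully recurrent (constant) series yields a complete graph, transitivity 1;
# a random series gives an intermediate value.
rng = np.random.default_rng(7)
t_const = recurrence_transitivity(np.zeros(10), 0.1)
t_rand = recurrence_transitivity(rng.uniform(0.0, 1.0, 200), 0.05)
```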

  20. Adhesive polypeptides of Staphylococcus aureus identified using a novel secretion library technique in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Holm Liisa

    2011-05-01

    Full Text Available Abstract Background Bacterial adhesive proteins, called adhesins, are frequently the decisive factor in initiation of a bacterial infection. Characterization of such molecules is crucial for the understanding of bacterial pathogenesis, design of vaccines and development of antibacterial drugs. Because adhesins are frequently difficult to express, their characterization has often been hampered. Alternative expression methods developed for the analysis of adhesins, e.g. surface display techniques, suffer from various drawbacks and reports on high-level extracellular secretion of heterologous proteins in Gram-negative bacteria are scarce. These expression techniques are currently a field of active research. The purpose of the current study was to construct a convenient, new technique for identification of unknown bacterial adhesive polypeptides directly from the growth medium of the Escherichia coli host and to identify novel proteinaceous adhesins of the model organism Staphylococcus aureus. Results Randomly fragmented chromosomal DNA of S. aureus was cloned into a unique restriction site of our expression vector, which facilitates secretion of foreign FLAG-tagged polypeptides into the growth medium of E. coli ΔfliCΔfliD, to generate a library of 1663 clones expressing FLAG-tagged polypeptides. Sequence and bioinformatics analyses showed that in our example, the library covered approximately 32% of the S. aureus proteome. Polypeptides from the growth medium of the library clones were screened for binding to a selection of S. aureus target molecules, and adhesive fragments of known staphylococcal adhesins (e.g. coagulase and fibronectin-binding protein A) as well as polypeptides of novel function (e.g. a universal stress protein and phosphoribosylamino-imidazole carboxylase ATPase subunit) were detected. The results were further validated using purified His-tagged recombinant proteins of the corresponding fragments in enzyme-linked immunoassay and

  1. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA)

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  2. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Full Text Available Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.

  3. Identifying MMORPG Bots: A Traffic Analysis Approach

    Science.gov (United States)

    Chen, Kuan-Ta; Jiang, Jhih-Wei; Huang, Polly; Chu, Hao-Hua; Lei, Chin-Laung; Chen, Wen-Chin

    2008-12-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
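
    The first traffic feature, regularity in the release time of client commands, can be sketched as a coefficient-of-variation test on inter-command intervals; the 0.2 cutoff and the synthetic timestamp streams are illustrative assumptions, not values from the paper:

```python
import numpy as np

def interval_cv(timestamps):
    """Coefficient of variation of inter-command gaps; lower = more machine-like."""
    gaps = np.diff(np.asarray(timestamps, dtype=float))
    return float(gaps.std() / gaps.mean())

def looks_like_bot(timestamps, cv_threshold=0.2):
    return interval_cv(timestamps) < cv_threshold

# Synthetic streams: a bot fires commands on a near-fixed timer; human
# command timing is bursty (modeled here as exponential gaps).
rng = np.random.default_rng(2)
bot_times = np.cumsum(0.5 + rng.normal(0.0, 0.01, 300))
human_times = np.cumsum(rng.exponential(0.5, 300))
```

    A production detector would combine this with the paper's other features (burstiness across time scales, sensitivity to network conditions) in an ensemble.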

  4. Identifying nonlinear biomechanical models by multicriteria analysis

    Science.gov (United States)

    Srdjevic, Zorica; Cveticanin, Livija

    2012-02-01

    In this study, the methodology developed by Srdjevic and Cveticanin (International Journal of Industrial Ergonomics 34 (2004) 307-318) for the nonbiased (objective) parameter identification of the linear biomechanical model exposed to vertical vibrations is extended to the identification of n-degree of freedom (DOF) nonlinear biomechanical models. The dynamic performance of the n-DOF nonlinear model is described in terms of response functions in the frequency domain, such as the driving-point mechanical impedance and seat-to-head transmissibility function. For randomly generated parameters of the model, nonlinear equations of motion are solved using the Runge-Kutta method. The appropriate data transformation from the time-to-frequency domain is performed by a discrete Fourier transformation. Squared deviations of the response functions from the target values are used as the model performance evaluation criteria, thus shifting the problem into the multicriteria framework. The objective weights of criteria are obtained by applying the Shannon entropy concept. The suggested methodology is programmed in Pascal and tested on a 4-DOF nonlinear lumped parameter biomechanical model. The identification process over the 2000 generated sets of parameters lasts less than 20 s. The model response obtained with the embedded identified parameters correlates well with the target values, thereby justifying the use of the underlying concept and the mathematical instruments and numerical tools applied. It should be noted that the identified nonlinear model has an improved accuracy of the biomechanical response compared to the accuracy of a linear model.
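
    The Shannon entropy weighting step can be sketched as follows; the decision matrix is invented for illustration. Criteria whose values vary widely across candidates carry more information and receive higher objective weights, while a constant criterion receives weight zero:

```python
import numpy as np

def entropy_weights(matrix):
    """Shannon-entropy objective weights for an (alternatives x criteria) matrix."""
    M = np.asarray(matrix, dtype=float)
    P = M / M.sum(axis=0)                    # per-criterion proportions
    k = 1.0 / np.log(M.shape[0])             # normalizes entropy into [0, 1]
    P_safe = np.where(P > 0.0, P, 1.0)       # log(1) = 0 handles zero entries
    e = -k * (P * np.log(P_safe)).sum(axis=0)
    d = 1.0 - e                              # degree of diversification
    return d / d.sum()

# Criterion 0 is constant across alternatives; criterion 1 varies strongly.
M = np.array([[1.0, 1.0],
              [1.0, 4.0],
              [1.0, 9.0]])
w = entropy_weights(M)
```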

  5. Identifying the Professional Development Needs of Early Career Teachers in Scotland Using Nominal Group Technique

    Science.gov (United States)

    Kennedy, Aileen; Clinton, Colleen

    2009-01-01

    This paper reports on phase 1 of a project commissioned by Learning and Teaching Scotland to explore the continuing professional development (CPD) needs of teachers in Scotland in years 2-6 of their careers. Nominal group technique (NGT) was employed to identify the CPD needs of year 2-6 teachers and to identify the relative priority of these…

  6. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  7. Gradient measurement technique to identify phase transitions in nano-dispersed liquid crystalline compounds

    Science.gov (United States)

    Pardhasaradhi, P.; Madhav, B. T. P.; Venugopala Rao, M.; Manepalli, R. K. N. R.; Pisipati, V. G. K. M.

    2016-09-01

    Characterization and phase transitions in pure and 0.5% BaTiO3 nano-dispersed liquid crystalline (LC) N-(p-n-heptyloxybenzylidene)-p-n-nonyloxy aniline, 7O.O9, compounds are carried out using a polarizing microscope fitted with a hot stage and camera. We observed that when any of these images is distorted, different local structures suffer various degrees of degradation in gradient magnitude. We therefore examined the pixel-wise gradient magnitude similarity (GMS) between the reference and distorted images, combined with a novel pooling strategy, the standard deviation of the GMS map, to determine the overall phase transition variations. MATLAB software is used to implement this gradient measurement technique and to identify the phase transitions and transition temperatures of the pure and nano-dispersed LC compounds. The results of the proposed image analysis method are in good agreement with standard methods such as the polarizing microscope (POM) and differential scanning calorimetry (DSC). The 0.5% BaTiO3 nano-dispersed 7O.O9 compound induces a cholesteric phase, quenching the nematic phase that the pure compound exhibits.
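
    The pixel-wise gradient magnitude similarity with standard-deviation pooling can be sketched in NumPy/SciPy. The abstract's MATLAB implementation is not available; the Prewitt operator and the constant c follow the common GMSD formulation and are assumptions here:

```python
import numpy as np
from scipy import ndimage

def gms_deviation(reference, distorted, c=0.0026):
    """Std-dev pooling of the gradient magnitude similarity (GMS) map.

    Returns 0 for identical images; larger values mean larger structural change.
    """
    def grad_mag(img):
        img = np.asarray(img, dtype=float)
        return np.hypot(ndimage.prewitt(img, axis=0), ndimage.prewitt(img, axis=1))

    m_r, m_d = grad_mag(reference), grad_mag(distorted)
    gms = (2.0 * m_r * m_d + c) / (m_r**2 + m_d**2 + c)   # per-pixel similarity
    return float(gms.std())                               # std-dev pooling

# Identical textures give zero deviation; a distorted copy gives a positive score.
rng = np.random.default_rng(3)
img = rng.uniform(0.0, 1.0, (64, 64))
noisy = img + rng.normal(0.0, 0.2, (64, 64))
```

    Applied to a temperature series of microscope frames, a jump in this score against a reference frame would mark a phase transition.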

  8. Identifying desertification risk areas using fuzzy membership and geospatial technique - A case study, Kota District, Rajasthan

    Science.gov (United States)

    Dasgupta, Arunima; Sastry, K. L. N.; Dhinwa, P. S.; Rathore, V. S.; Nathawat, M. S.

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research aims to identify areas under risk of desertification, along with their severity in terms of degradation of natural parameters. An integrated model combining fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, using five natural parameters: slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a given class was derived from the normal probability density function of that class. Thus, if a class of a parameter has mean μ and standard deviation σ, values falling beyond μ + 2σ or below μ - 2σ do not represent that class but rather a transitional zone between two adjacent classes. These are the most important areas in terms of degradation: they have the lowest probability of belonging to a given class, and hence the highest probability of shifting into the next or the previous class. Since these values are the most easily altered under exogenic influences, they are identified as risk areas. The overall desertification risk is derived by combining the risk severities of the individual parameters using the fuzzy rule-based inference system in a GIS environment. Multicriteria-based geostatistics are applied to locate areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures, coupled with natural erosive forces, are accelerating land deterioration. The four major sources of desertification in Kota are gully and ravine erosion, inappropriate mining practices, growing urbanization and indiscriminate deforestation.
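    The membership rule above follows directly from the normal density: values near the class mean get membership close to 1, and values beyond μ ± 2σ get the lowest membership and are flagged as transition-zone (risk) values. A minimal sketch, with illustrative numbers:

    ```python
    import math

    def membership(x, mu, sigma):
        """Fuzzy membership of x in the class N(mu, sigma), i.e. the normal
        density normalised so that the value at the mean is 1."""
        z = (x - mu) / sigma
        return math.exp(-0.5 * z * z)

    def risk_flags(values, mu, sigma):
        """Flag values outside mu +/- 2*sigma as transition-zone (risk) values,
        following the paper's rule that such values have the lowest probability
        of belonging to the class."""
        return [abs(v - mu) > 2.0 * sigma for v in values]
    ```

    Per-parameter flags of this kind would then be combined by the fuzzy rules into the overall risk map.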

  9. A measurement technique to identify and locate partial discharge in transformer with AE and HFCT

    Directory of Open Access Journals (Sweden)

    Urairat Fuangsoongnern

    2014-02-01

    Full Text Available This paper proposes a measurement technique to identify and locate the occurrence of partial discharge (PD) in the insulation of oil-immersed and dry-type distribution transformers. With reference to IEEE Std. C57.127-2007, four acoustic transducers of type PD-TP500A were used to locate PD and one HFCT (high-frequency current transducer) was used to identify PD. This process could accurately identify and locate the source of PD occurring at any position in a distribution transformer. The findings enabled us to prevent damage and to deploy preventive maintenance measures on the distribution transformer in time.

  10. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  11. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…

  12. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  13. Attitude Exploration Using Factor Analysis Technique

    Directory of Open Access Journals (Sweden)

    Monika Raghuvanshi

    2016-12-01

    Full Text Available Attitude is a psychological variable that contains positive or negative evaluations of people or an environment. The growing generation possesses strong learning skills, so if a positive attitude is inculcated at the right age, it may become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30 Likert-type scale statements was prepared in order to measure attitude towards the environment and matters related to conservation. The primary data were collected through a structured questionnaire, using a cluster sampling technique, and analyzed using the IBM SPSS 23 statistical tool. Factor analysis was used to reduce the 30 variables to a smaller number of more identifiable groups of variables. Results show that students “need more regulation and voluntary participation to protect the environment”, “need conservation of water and electricity”, “are concerned for undue wastage of water”, “need visible actions to protect the environment”, “need strengthening of the public transport system”, “are a little bit ignorant about the consequences of global warming”, “want prevention of water pollution by industries”, “need changing of personal habits to protect the environment”, and “don’t have firsthand experience of global warming”. The analysis revealed that the nine factors obtained could explain about 58.5% of the variance in the attitude of secondary school students towards the environment in the city of Bikaner, India. The remaining 39.6% of the variance is attributed to other elements not explained by this analysis. A global campaign for improvement in attitudes towards environmental issues and their utility in daily life may boost positive youth attitudes, with potential worldwide impact. A cross-disciplinary approach may be developed by teaching the subject alongside related disciplines such as science, economics, and social studies.

  14. Identifying redundancy and exposing provenance in crowdsourced data analysis.

    Science.gov (United States)

    Willett, Wesley; Ginosar, Shiry; Steinitz, Avital; Hartmann, Björn; Agrawala, Maneesh

    2013-12-01

    We present a system that lets analysts use paid crowd workers to explore data sets and helps analysts interactively examine and build upon workers' insights. We take advantage of the fact that, for many types of data, independent crowd workers can readily perform basic analysis tasks like examining views and generating explanations for trends and patterns. However, workers operating in parallel can often generate redundant explanations. Moreover, because workers have different competencies and domain knowledge, some responses are likely to be more plausible than others. To efficiently utilize the crowd's work, analysts must be able to quickly identify and consolidate redundant responses and determine which explanations are the most plausible. In this paper, we demonstrate several crowd-assisted techniques to help analysts make better use of crowdsourced explanations: (1) We explore crowd-assisted strategies that utilize multiple workers to detect redundant explanations. We introduce color clustering with representative selection--a strategy in which multiple workers cluster explanations and we automatically select the most-representative result--and show that it generates clusterings that are as good as those produced by experts. (2) We capture explanation provenance by introducing highlighting tasks and capturing workers' browsing behavior via an embedded web browser, and refine that provenance information via source-review tasks. We expose this information in an explanation-management interface that allows analysts to interactively filter and sort responses, select the most plausible explanations, and decide which to explore further.

  15. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.

  16. Equivalent Dynamic Stiffness Mapping technique for identifying nonlinear structural elements from frequency response functions

    Science.gov (United States)

    Wang, X.; Zheng, G. T.

    2016-02-01

    A simple and general Equivalent Dynamic Stiffness Mapping technique is proposed for identifying the parameters or the mathematical model of a nonlinear structural element from steady-state primary harmonic frequency response functions (FRFs). The Equivalent Dynamic Stiffness is defined as the complex ratio between the internal force and the displacement response of the unknown element. Obtained from the test data of response frequencies and amplitudes, the real and imaginary parts of the Equivalent Dynamic Stiffness are plotted as discrete points in a three-dimensional space over the displacement amplitude and the frequency; these are called the real and the imaginary Equivalent Dynamic Stiffness maps, respectively. The points form a repeatable surface, as the Equivalent Dynamic Stiffness is only a function of the corresponding data, as derived in the paper. The mathematical model of the unknown element can then be obtained by surface-fitting these points with special functions selected from a priori knowledge of the nonlinearity type, or with ordinary polynomials if the type of nonlinearity is not known in advance. An important merit of this technique is its capability of dealing with strong nonlinearities exhibiting complicated frequency response behaviors such as jumps and breaks in resonance curves. In addition, the technique can greatly simplify the test procedure: besides eliminating the need to pre-identify the underlying linear parameters, the method uses the measured data of excitation forces and responses without requiring strict control of the excitation force during the test. The proposed technique is demonstrated and validated with four classical single-degree-of-freedom (SDOF) numerical examples and one experimental example. An application of this technique to the identification of nonlinearity in multiple-degree-of-freedom (MDOF) systems is also illustrated.
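    The defining ratio can be illustrated on a known linear SDOF oscillator, where the technique must recover the analytical dynamic stiffness K(w) = k - m*w^2 + i*c*w. The mass, damping and stiffness values below are illustrative only; for a nonlinear element the same ratio would additionally vary with displacement amplitude, producing the 3D map described above:

    ```python
    def equivalent_dynamic_stiffness(force, response):
        """Equivalent Dynamic Stiffness: complex ratio of the internal force
        phasor to the displacement response phasor at one test frequency."""
        return force / response

    # Sanity check on a linear SDOF oscillator (illustrative values:
    # m = 1 kg, c = 2 N s/m, k = 100 N/m) at w = 8 rad/s.
    m, c, k = 1.0, 2.0, 100.0
    w = 8.0
    K_true = complex(k - m * w * w, c * w)
    X = (1.0 + 0j) / K_true           # displacement response to a unit force
    K_est = equivalent_dynamic_stiffness(1.0 + 0j, X)
    ```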

  17. Using text-mining techniques in electronic patient records to identify ADRs from medicine use.

    Science.gov (United States)

    Warrer, Pernille; Hansen, Ebba Holme; Juhl-Jensen, Lars; Aagaard, Lise

    2012-05-01

    This literature review included studies that use text-mining techniques in narrative documents stored in electronic patient records (EPRs) to investigate ADRs. We searched PubMed, Embase, Web of Science and International Pharmaceutical Abstracts without restrictions from origin until July 2011. We included empirically based studies on text mining of electronic patient records (EPRs) that focused on detecting ADRs, excluding those that investigated adverse events not related to medicine use. We extracted information on study populations, EPR data sources, frequencies and types of the identified ADRs, medicines associated with ADRs, text-mining algorithms used and their performance. Seven studies, all from the United States, were eligible for inclusion in the review. Studies were published from 2001, the majority between 2009 and 2010. Text-mining techniques varied over time from simple free text searching of outpatient visit notes and inpatient discharge summaries to more advanced techniques involving natural language processing (NLP) of inpatient discharge summaries. Performance appeared to increase with the use of NLP, although many ADRs were still missed. Due to differences in study design and populations, various types of ADRs were identified and thus we could not make comparisons across studies. The review underscores the feasibility and potential of text mining to investigate narrative documents in EPRs for ADRs. However, more empirical studies are needed to evaluate whether text mining of EPRs can be used systematically to collect new information about ADRs.
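    The simple free-text searching that the earliest reviewed studies used can be sketched as sentence-level co-occurrence matching of drug names and ADR terms. The two mini-lexicons below are hypothetical stand-ins; real systems use controlled vocabularies (e.g. MedDRA for ADR terms) and NLP rather than this naive matching:

    ```python
    import re

    # Hypothetical mini-lexicons for illustration only.
    ADR_TERMS = {"rash", "nausea", "anaphylaxis", "dizziness"}
    DRUGS = {"amoxicillin", "warfarin"}

    def find_candidate_adrs(note):
        """Return (drug, adr_term) pairs that co-occur within one sentence
        of a free-text clinical note."""
        pairs = set()
        for sentence in re.split(r"[.!?]", note.lower()):
            words = set(re.findall(r"[a-z]+", sentence))
            for drug in DRUGS & words:
                for term in ADR_TERMS & words:
                    pairs.add((drug, term))
        return pairs
    ```

    The review's observation that NLP improves on this kind of matching is unsurprising: naive co-occurrence misses negations ("no rash"), misspellings and terms outside the lexicon.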

  18. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry and coulometry have made significant contributions to the analysis of minerals such as clays, sulfides, oxides, and oxysalts. The discussion is organized by both the type of EC technique used and the kind of mineral analyzed. Furthermore, the use of minerals as electrode modification materials for EC analysis is also summarized. Accordingly, research gaps and future development trends in these areas are discussed.

  19. Using Rasch Analysis to Identify Uncharacteristic Responses to Undergraduate Assessments

    Science.gov (United States)

    Edwards, Antony; Alcock, Lara

    2010-01-01

    Rasch Analysis is a statistical technique that is commonly used to analyse both test data and Likert survey data, to construct and evaluate question item banks, and to evaluate change in longitudinal studies. In this article, we introduce the dichotomous Rasch model, briefly discussing its assumptions. Then, using data collected in an…
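    The dichotomous Rasch model referred to above gives the probability of a correct response as P = exp(theta - b) / (1 + exp(theta - b)), where theta is person ability and b is item difficulty; an uncharacteristic response is one with a large standardised residual. A minimal sketch (the |z| > 2 cut-off is a common convention, not taken from this article):

    ```python
    import math

    def rasch_p(theta, b):
        """Dichotomous Rasch model: probability that a person of ability theta
        answers an item of difficulty b correctly."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def std_residual(x, theta, b):
        """Standardised residual of an observed response x in {0, 1};
        |z| greater than about 2 flags an uncharacteristic response."""
        p = rasch_p(theta, b)
        return (x - p) / math.sqrt(p * (1.0 - p))
    ```

    For example, a wrong answer from a high-ability person on an easy item produces a large negative residual and would be flagged.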

  20. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface is of great importance for convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and the monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysis of road texture characteristics and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  1. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface is of great importance for convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and the monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysis of road texture characteristics and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  2. Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers

    Science.gov (United States)

    1975-03-26

    Technical Report RF-75-2: Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers. Augustine E. Magistro, Nuclear ... through 1975. Augustine E. Magistro has participated in root cause analysis task teams, including as team member and Blue Ribbon panel reviewer, team ...

  3. Integrating complementary medicine literacy education into Australian medical curricula: Student-identified techniques and strategies for implementation.

    Science.gov (United States)

    Templeman, Kate; Robinson, Anske; McKenna, Lisa

    2015-11-01

    Formal medical education about complementary medicine (CM) that comprises medicinal products/treatments is required due to possible CM interactions with conventional medicines; however, few guidelines exist on design and implementation of such education. This paper reports findings of a constructivist grounded theory method study that identified key strategies for integrating CM literacy education into medical curricula. Analysis of data from interviews with 30 medical students showed that students supported a longitudinal integrative and pluralistic approach to medicine. Awareness of common patient use, evidence, and information relevant to future clinical practice were identified as focus points needed for CM literacy education. Students advocated for interactive case-based, experiential and dialogical didactic techniques that are multiprofessional and student-centred. Suggested strategies provide key elements of CM literacy within research, field-based practice, and didactic teaching over the entirety of the curriculum. CM educational strategies should address CM knowledge deficits and ultimately respond to patients' needs.

  4. Multispectral and Photoplethysmography Optical Imaging Techniques Identify Important Tissue Characteristics in an Animal Model of Tangential Burn Excision.

    Science.gov (United States)

    Thatcher, Jeffrey E; Li, Weizhi; Rodriguez-Vaqueiro, Yolanda; Squiers, John J; Mo, Weirong; Lu, Yang; Plant, Kevin D; Sellke, Eric; King, Darlene R; Fan, Wensheng; Martinez-Lorenzo, Jose A; DiMaio, J Michael

    2016-01-01

    Burn excision, a difficult technique owing to the training required to identify the extent and depth of injury, will benefit from a tool that can cue the surgeon as to where and how much to resect. We explored two rapid and noninvasive optical imaging techniques for their ability to distinguish burn tissue from the viable wound bed in an animal model of tangential burn excision. Photoplethysmography (PPG) imaging and multispectral imaging (MSI) were used to image the initial, intermediate, and final stages of burn excision of a deep partial-thickness burn. PPG imaging maps blood flow in the skin's microcirculation, and MSI collects the tissue reflectance spectrum at visible and infrared wavelengths of light to classify tissue based on a reference library. A porcine deep partial-thickness burn model was generated, and serial tangential excision was accomplished with an electric dermatome set to 1.0 mm depth. Excised eschar was stained with hematoxylin and eosin to determine the extent of burn remaining at each excision depth. We confirmed that the PPG imaging device showed significantly less blood flow where burn tissue was present, and that the MSI method could delineate burn tissue from the viable wound bed. These results were confirmed independently by histological analysis. We found that these devices can identify the proper depth of excision and that their images could cue a surgeon as to the readiness of the wound bed for grafting. These image outputs are expected to facilitate clinical judgment in the operating room.
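    Classifying a pixel against a reference library, as MSI does here, can be sketched with a nearest-reference-spectrum rule. The spectral-angle metric and the two reference spectra below are illustrative assumptions, not the device's actual classifier or calibration:

    ```python
    import math

    def spectral_angle(a, b):
        """Angle between two reflectance spectra; smaller means more similar."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

    def classify_pixel(spectrum, library):
        """Label a pixel with the class whose library spectrum is closest.
        `library` maps class name -> reference spectrum (hypothetical values)."""
        return min(library, key=lambda c: spectral_angle(spectrum, library[c]))
    ```

    Applying this per pixel yields a burn/viable map of the wound bed analogous to the classification images described.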

  5. BIOELECTRICAL IMPEDANCE VECTOR ANALYSIS IDENTIFIES SARCOPENIA IN NURSING HOME RESIDENTS

    Science.gov (United States)

    Loss of muscle mass and water shifts between body compartments are contributing factors to frailty in the elderly. The body composition changes are especially pronounced in institutionalized elderly. We investigated the ability of single-frequency bioelectrical impedance analysis (BIA) to identify b...

  6. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human error, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  7. Identifying Potential Areas for Future Urban Development Using Gis-Based Multi Criteria Evaluation Technique

    Directory of Open Access Journals (Sweden)

    Mohammed Khalid Sabbar

    2016-01-01

    Full Text Available Malaysia, like other Asian countries, has experienced rapid urbanization due to economic development, industrialization, massive migration, and natural population growth. This expansion, particularly where unplanned, has impacted negatively on farming activities and created huge pressure on arable agricultural areas. Thus, the identification of potential sites for future urban development has become an important issue in ensuring sustainable development. The aim of this paper is therefore to use a GIS-based multi-criteria evaluation technique to identify potential areas for urban development in Balik Pulau, Penang. The study quantified the spatial and temporal dynamics of land use/cover changes and identified potential areas for future development. The results indicated that large proportions of agricultural areas had been converted to built-up areas. Urban areas increased from 1793.2 ha in 1992 to 3235.4 ha in 2002 and reached 3987.8 ha in 2010. On the other hand, agricultural land decreased from 6171.3 ha (53.8%) in 1992 to 3883 ha (35. %) in 2010. The study then produced a map showing potential sites for future urban development. The findings also indicated that built-up areas will continue to encroach on flat, available agricultural land, which will be diminished if no restrictions are imposed. Thus, the information obtained from this study is useful for planners and decision makers in controlling agricultural areas and guiding new development properly.
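    Per map cell, a GIS-based multi-criteria evaluation of this kind reduces to a weighted overlay of normalised criterion scores. The criteria and weights below are hypothetical, chosen only to show the mechanics:

    ```python
    def mce_score(criteria, weights):
        """Weighted linear combination of normalised (0-1) criterion scores."""
        return sum(weights[k] * criteria[k] for k in weights)

    # Hypothetical urban-suitability criteria (higher score = more suitable).
    WEIGHTS = {"flatness": 0.3, "road_access": 0.3, "non_agricultural": 0.4}

    def rank_sites(sites):
        """Rank candidate cells/sites for urban development by suitability."""
        return sorted(sites, key=lambda s: mce_score(sites[s], WEIGHTS),
                      reverse=True)
    ```

    In a real workflow each criterion layer is derived in the GIS (slope from a DEM, road distance by buffering, land use from classified imagery) before the overlay; a high weight on `non_agricultural` is one way to steer development away from arable land, as the paper recommends.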

  8. UPLC: a preeminent technique in pharmaceutical analysis.

    Science.gov (United States)

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

    The pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor drugs. In this context, the development of rapid chromatographic methods is crucial for analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design and optimizing systems, data processors and various controls of chromatographic techniques. Blended together, these resulted in the outstanding performance of ultra-high performance liquid chromatography (UPLC), which retains the principles of HPLC. UPLC shows a dramatic enhancement in speed, resolution and sensitivity of analysis by using particle sizes of less than 2 µm and operating the system at higher pressure, which allows the mobile phase to run at greater linear velocities than in HPLC. This technique is considered a new focal point in the field of liquid chromatography. This review focuses on the basic principle and instrumentation of UPLC and its advantages over HPLC; furthermore, this article emphasizes various pharmaceutical applications of this technique.
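    The efficiency gain from sub-2 µm particles is usually explained with a van Deemter-type plate-height curve, H(u) = A + B/u + C·u, in which the A term scales with particle diameter and the C term with its square (the abstract does not invoke this explicitly). A sketch with purely illustrative coefficients, not fitted to any real column:

    ```python
    def plate_height(u, dp, a=2.0, b=10.0, c=0.05):
        """Van Deemter-type plate height H(u) = A + B/u + C*u, with the
        A term proportional to particle diameter dp and the C term to dp**2.
        Coefficients a, b, c are illustrative only."""
        return a * dp + b / u + c * dp * dp * u
    ```

    Because the C term shrinks with dp squared, small particles keep the plate height low even at high linear velocity, which is exactly the speed-without-loss-of-resolution trade that UPLC exploits (at the cost of much higher back-pressure).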

  9. A Comparative Analysis of Biomarker Selection Techniques

    Directory of Open Access Journals (Sweden)

    Nicoletta Dessì

    2013-01-01

    Full Text Available Feature selection has become an essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different sets of biomarkers, that is, different groups of genes highly correlated to a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i) measuring the similarity/dissimilarity of selected gene sets; (ii) evaluating the implications of these differences in terms of both predictive performance and stability of selected gene sets. As a case study, we considered three benchmarks derived from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representative of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight into the pattern of agreement of biomarker discovery techniques.
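    A simple set-overlap measure such as the Jaccard index is one way to quantify the similarity of selected gene sets across techniques or resamplings; the paper does not specify its exact measure, so this is an illustrative stand-in:

    ```python
    from itertools import combinations

    def jaccard(a, b):
        """Jaccard similarity between two selected gene sets."""
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if (a or b) else 1.0

    def mean_pairwise_jaccard(selections):
        """Mean pairwise Jaccard across the gene sets chosen by several
        selection techniques (or across resamplings, as a crude stability
        measure)."""
        pairs = list(combinations(selections, 2))
        return sum(jaccard(a, b) for a, b in pairs) / len(pairs)
    ```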

  10. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
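    The contrast-evolution extraction described above can be sketched frame by frame; the normalisation shown is one common choice and may differ from the paper's exact definition, and the temperature values in the check are made up:

    ```python
    def normalized_contrast(anomaly_T, sound_T, ambient_T):
        """Frame-by-frame normalised contrast,
        (T_anomaly - T_sound) / (T_sound - T_ambient),
        from per-frame temperatures over the anomaly and a sound region."""
        return [(ta - ts) / (ts - ambient_T)
                for ta, ts in zip(anomaly_T, sound_T)]

    def peak_contrast(contrast):
        """Simple measurement features: peak contrast and its frame index."""
        peak = max(contrast, key=abs)
        return peak, contrast.index(peak)
    ```

    Features like the peak value and its time are the kind of thermal measurement features that are then compared against calibrated flat-bottom-hole simulations to estimate anomaly depth and width.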

  11. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Full Text Available Based on the movement process of throwing, and in order to further improve the throwing technique in our country, this paper first identifies the main factors influencing the shot distance by combining the equations of motion with geometrical analysis. It then gives the equation for the force that a throwing athlete must bear during the throwing movement, and derives the velocity relationships between the joints during throwing and release from a kinetic analysis of the throwing athlete's arm. The momentum relationships between the athlete's joints are obtained by means of rotational-inertia analysis, and a constrained particle dynamics equation is then established from the Lagrange equation. The result shows that the momentum of the throw depends on the momentum of the athlete's wrist joint at release.
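    The geometrical analysis of shot distance rests on the standard projectile-range relation with release height (air resistance neglected), which makes the main influence factors, release speed, release angle and release height, explicit:

    ```python
    import math

    def shot_range(v, theta_deg, h, g=9.81):
        """Horizontal flight distance of a shot released at speed v (m/s),
        angle theta_deg (degrees) and height h (m) above the ground:
        R = (v*cos(t)/g) * (v*sin(t) + sqrt((v*sin(t))**2 + 2*g*h))."""
        t = math.radians(theta_deg)
        vx, vy = v * math.cos(t), v * math.sin(t)
        return vx * (vy + math.sqrt(vy * vy + 2.0 * g * h)) / g
    ```

    One consequence worth noting: with a raised release point the optimal release angle drops below 45 degrees, which is why release speed, not angle, dominates coaching priorities.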

  12. Latent cluster analysis of ALS phenotypes identifies prognostically differing groups.

    Directory of Open Access Journals (Sweden)

    Jeban Ganesalingam

    Full Text Available BACKGROUND: Amyotrophic lateral sclerosis (ALS) is a degenerative disease predominantly affecting motor neurons and manifesting as several different phenotypes. Whether these phenotypes correspond to different underlying disease processes is unknown. We used latent cluster analysis to identify groupings of clinical variables in an objective and unbiased way to improve phenotyping for clinical and research purposes. METHODS: Latent class cluster analysis was applied to a large database consisting of 1467 records of people with ALS, using discrete variables which can be readily determined at the first clinic appointment. The model was tested for clinical relevance by survival analysis of the phenotypic groupings using the Kaplan-Meier method. RESULTS: The best model generated five distinct phenotypic classes that strongly predicted survival (p<0.0001). Eight variables were used for the latent class analysis, but a good estimate of the classification could be obtained using just two variables: site of first symptoms (bulbar or limb) and time from symptom onset to diagnosis (p<0.00001). CONCLUSION: The five phenotypic classes identified using latent cluster analysis can predict prognosis. They could be used to stratify patients recruited into clinical trials and to generate more homogeneous disease groups for genetic, proteomic and risk factor research.
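    The Kaplan-Meier estimator used to compare survival across the five classes can be sketched in a few lines of pure Python (the follow-up times in the check are made-up toy data):

    ```python
    def kaplan_meier(times, events):
        """Kaplan-Meier survival estimator.

        times:  follow-up time per patient.
        events: 1 = death observed, 0 = censored.
        Returns a list of (t, S(t)) pairs at each observed event time."""
        data = sorted(zip(times, events))
        n = len(data)
        s, out, i = 1.0, [], 0
        while i < n:
            t = data[i][0]
            ties = [e for tt, e in data if tt == t]
            d = sum(ties)            # deaths at time t
            at_risk = n - i          # patients still under observation
            if d:
                s *= 1.0 - d / at_risk
                out.append((t, s))
            i += len(ties)
        return out
    ```

    Running this once per phenotypic class and comparing the resulting curves (e.g. with a log-rank test) is the kind of survival analysis the paper reports.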

  13. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed results comparable to other studies. • This technique may be used for screening as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Early identification of a dysplastic configuration might therefore be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns, from the 36th to 61st gestational week, who underwent routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist, adding only minimal time to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea, and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  14. Comparative Analysis of Hand Gesture Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Arpana K. Patel

    2015-03-01

    Full Text Available During the past few years, human hand gesture interaction with computing devices has continued to be an active area of research. This paper provides a survey of hand gesture recognition, which comprises three stages: pre-processing, feature extraction or matching, and classification or recognition. Each stage can be implemented with different methods and techniques. This paper gives a short description of the methods used for hand gesture recognition in existing systems, together with a comparative analysis of the benefits and drawbacks of each method.

  15. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    Full Text Available We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows us to perform full clustering and classification of the measured mass spectra; in this paper we use the tool for classification purposes. The presentation describes calibration experiments on 19 minerals on Ag and Au substrates using positive-mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen's κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
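
    The random-projection idea can be sketched as follows. This is an illustrative toy, not the COSIMA pipeline: the "spectra" are synthetic vectors with a class-specific peak, the projection basis is random Gaussian, and classification is nearest neighbour in the projected space:

```python
import random

random.seed(1)

D, K = 200, 8  # spectrum length, projected dimension

# Fixed random projection basis shared by all spectra.
basis = [[random.gauss(0, 1) for _ in range(D)] for _ in range(K)]

def project(vec):
    """Project a high-dimensional spectrum onto the K random directions."""
    return [sum(v * b for v, b in zip(vec, row)) for row in basis]

def spectrum(peak):
    """Toy 'mass spectrum': low-level noise plus a peak at a class channel."""
    s = [random.random() * 0.1 for _ in range(D)]
    s[peak] += 5.0
    return s

# Reference library: 20 projected spectra per 'mineral'.
train = ([(project(spectrum(30)), "mineral_A") for _ in range(20)]
         + [(project(spectrum(120)), "mineral_B") for _ in range(20)])

def classify(projected):
    """Nearest neighbour in the K-dimensional projected space."""
    return min(train,
               key=lambda p: sum((a - b) ** 2
                                 for a, b in zip(p[0], projected)))[1]

print(classify(project(spectrum(30))), classify(project(spectrum(120))))
```

    The point of the projection is that distances between spectra are approximately preserved in far fewer dimensions, so clustering and classification become cheap.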

  16. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transients and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; optimally weighted cross-correlations are used for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
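
    The optimal (matched) filtering mentioned for binary inspirals amounts to sliding a known waveform template over noisy data and looking for a correlation peak. A minimal time-domain sketch, with a toy chirp-like template and made-up noise levels (real searches use template banks and noise-weighted frequency-domain inner products):

```python
import math
import random

random.seed(0)

def matched_filter(signal, template):
    """Slide the template along the signal and return the offset with the
    largest correlation, plus the correlation value there."""
    n, m = len(signal), len(template)

    def corr(k):
        return sum(signal[k + i] * template[i] for i in range(m))

    best = max(range(n - m + 1), key=corr)
    return best, corr(best)

# Toy 'inspiral-like' template: a sinusoid sweeping upward in frequency.
template = [math.sin(0.01 * i * i) for i in range(100)]

# Bury the template in Gaussian noise at offset 300.
signal = [random.gauss(0, 0.5) for _ in range(600)]
for i, value in enumerate(template):
    signal[300 + i] += value

offset, peak = matched_filter(signal, template)
print(offset, round(peak, 1))
```

    Even though the template is invisible by eye in the noisy series, the correlation peak recovers its location; this is the core step behind inspiral searches.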

  17. Application of Electromigration Techniques in Environmental Analysis

    Science.gov (United States)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentrations of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made in widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, and in improving the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is placed on pre-capillary and on-capillary chromatography- and electrophoresis-based concentration of analytes and on detection improvement.

  18. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analyses methodologies, but none as comprehensive as the current work.
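
    The simplest class of compared methods, one-at-a-time perturbation, can be sketched as below. The model and parameter values are invented for illustration and are not Hamby's tritium dosimetry model: each parameter is raised 10% in turn and parameters are ranked by the relative change in model output:

```python
def dose(params):
    """Toy dose model (illustrative only): dose rises with release rate and
    uptake, and falls with dilution by wind and mixing height."""
    return (params["release_rate"] * params["uptake"] ** 2
            / (params["wind_speed"] * params["mixing_height"]))

base = {"release_rate": 1.0e6, "uptake": 0.5,
        "wind_speed": 3.0, "mixing_height": 800.0}

def oat_sensitivity(model, base, delta=0.10):
    """One-at-a-time sensitivity: relative output change for a +10% change
    in each input, all others held at their base values."""
    y0 = model(base)
    scores = {}
    for name in base:
        perturbed = dict(base)
        perturbed[name] *= 1.0 + delta
        scores[name] = abs(model(perturbed) - y0) / y0
    return sorted(scores.items(), key=lambda kv: -kv[1])

for name, s in oat_sensitivity(dose, base):
    print(f"{name:14s} {s:.3f}")
```

    In this toy, `uptake` ranks first because it enters the model squared; real comparative studies like this one weigh such simple rankings against variance-based and regression-based methods.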

  19. Rice Transcriptome Analysis to Identify Possible Herbicide Quinclorac Detoxification Genes

    Directory of Open Access Journals (Sweden)

    Wenying eXu

    2015-09-01

    Full Text Available Quinclorac is a highly selective auxin-type herbicide that is widely used for effective control of barnyard grass in paddy rice fields, improving the world's rice yield. A mode of action for quinclorac has been proposed, and hormone interactions affect quinclorac signaling. Because of its widespread use, quinclorac may be transported outside rice fields with drainage waters, leading to soil and water pollution and environmental health problems. In this study, we used the 57K Affymetrix rice whole-genome array to identify quinclorac-responsive genes and to study the molecular mechanisms of action and detoxification of quinclorac in rice plants. Overall, 637 probe sets were identified with differential expression levels after either 6 or 24 h of quinclorac treatment. Auxin-related genes such as GH3 and OsIAAs responded to quinclorac treatment. Gene Ontology analysis showed that detoxification-related gene families were significantly enriched, including cytochrome P450, GST, UGT, and ABC and drug transporter genes. Moreover, real-time RT-PCR analysis showed that top candidate P450 families such as CYP81, CYP709C and CYP72A genes were universally induced by different herbicides, and some Arabidopsis genes of the same P450 families were up-regulated under quinclorac treatment. We conducted a rice whole-genome GeneChip analysis and report the first global identification of quinclorac-responsive genes. This work may provide potential markers for detoxification of quinclorac and biomonitors of environmental chemical pollution.

  20. Identifying Sources of Difference in Reliability in Content Analysis

    Directory of Open Access Journals (Sweden)

    Elizabeth Murphy

    2005-07-01

    Full Text Available This paper reports on a case study which identifies and illustrates sources of difference in agreement in relation to reliability in a context of quantitative content analysis of a transcript of an online asynchronous discussion (OAD). Transcripts of 10 students in a month-long online asynchronous discussion were coded by two coders using an instrument with two categories, five processes, and 19 indicators of Problem Formulation and Resolution (PFR). Sources of difference were identified in relation to coders, tasks, and students. Reliability values were calculated at the levels of categories, processes, and indicators. At the most detailed level of coding, on the basis of the indicator, findings revealed that the overall level of reliability between coders was .591 when measured with Cohen's kappa. The difference between tasks at the same level ranged from .349 to .664, and the difference between participants ranged from .390 to .907. Implications for training and research are discussed.
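
    Cohen's kappa, the reliability statistic used above, corrects raw percent agreement for the agreement two coders would reach by chance. A minimal sketch with hypothetical codings (not the study's transcripts; the category labels are invented):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: inter-coder agreement corrected for chance."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: probability both coders independently pick the
    # same code, given each coder's marginal code frequencies.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical indicator codings of 10 discussion units by two coders.
a = ["PF", "PF", "RES", "RES", "PF", "RES", "PF", "RES", "RES", "PF"]
b = ["PF", "RES", "RES", "RES", "PF", "RES", "PF", "PF", "RES", "PF"]
print(round(cohens_kappa(a, b), 3))
```

    Here 8/10 raw agreement shrinks to kappa = 0.6 once the 50% chance agreement is removed, which is why kappa values like the .591 reported above are lower than percent agreement.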

  1. Proteogenomic Analysis Identifies a Novel Human SHANK3 Isoform

    Directory of Open Access Journals (Sweden)

    Fahad Benthani

    2015-05-01

    Full Text Available Mutations of the SHANK3 gene have been associated with autism spectrum disorder. Individuals harboring different SHANK3 mutations display considerable heterogeneity in their cognitive impairment, likely due to the high SHANK3 transcriptional diversity. In this study, we report a novel interaction between the Mutated in colorectal cancer (MCC) protein and a newly identified SHANK3 protein isoform in human colon cancer cells and mouse brain tissue. Hence, our proteogenomic analysis identifies a new human long isoform of the key synaptic protein SHANK3 that was not predicted by the human reference genome. Taken together, our findings describe a potential new role for MCC in neurons, a new human SHANK3 long isoform and, importantly, highlight the use of proteomic data towards the re-annotation of GC-rich genomic regions.

  2. A COMPARISION OF VARIOUS EDGE DETECTION TECHNIQUES IN MOTION PICTURE FOR IDENTIFYING A SHARK FISH

    Directory of Open Access Journals (Sweden)

    Shrivakshan Gopal Thiruvangadan

    2013-01-01

    Full Text Available This study detects moving objects in motion pictures, identifying sharks in video by removing the image background. The main method for separating moving objects from the background is foreground detection. Many existing techniques ignore the fact that background images consist of different image objects whose conditions may change over time. In this study, a motion picture identification procedure is proposed for real-time video frames by comparing three key classes of motion detection methods: background removal (subtraction), temporal differencing, and optical flow. A structured hierarchical background procedure is proposed based on segmenting background image objects: the background is divided into several regions by a Support Vector Machine (SVM), and a structured hierarchical model is then built from a region procedure and a pixel model procedure. In the region model, the image object is extracted from the histogram of specific regions, similar to a Gaussian-mixture model. In the pixel model, histograms of gradient samples of the pixels in each region are built based on the concurrent occurrence of object variations. A silhouette detection procedure is also proposed and used. The experimental results are cross-validated against a video database, from static to dynamic scenes, to illustrate efficiency in comparison with established motion detection methods, chiefly temporal differencing and optical flow. Based on the outputs, a motion detection procedure for real-time video frames can be created which is cost effective, shows a good rate of accuracy and reliability, is simple and of low complexity, and is well adapted to several
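
    Of the three compared method classes, temporal differencing is the simplest to sketch: flag the pixels whose intensity changes between consecutive frames. The frames below are tiny made-up greyscale arrays, not video data from the study:

```python
def temporal_difference(prev, curr, threshold=25):
    """Temporal differencing: mark pixels whose greyscale intensity changed
    by more than `threshold` between two consecutive frames."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prev_row, curr_row)]
            for prev_row, curr_row in zip(prev, curr)]

# Two tiny 4x6 'frames': a bright object (value 200) moves one pixel right
# across a dark background (value 10).
frame1 = [[10] * 6 for _ in range(4)]
frame2 = [[10] * 6 for _ in range(4)]
frame1[1][1] = frame1[1][2] = 200
frame2[1][2] = frame2[1][3] = 200

mask = temporal_difference(frame1, frame2)
print(mask)
```

    Note the characteristic weakness the study's comparison would expose: the overlapping pixel of the moving object produces no difference, so temporal differencing recovers only the leading and trailing edges of motion.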

  3. Efficient Isothermal Titration Calorimetry Technique Identifies Direct Interaction of Small Molecule Inhibitors with the Target Protein.

    Science.gov (United States)

    Gal, Maayan; Bloch, Itai; Shechter, Nelia; Romanenko, Olga; Shir, Ofer M

    2016-01-01

    Protein-protein interactions (PPI) play a critical role in regulating many cellular processes. Finding novel PPI inhibitors that interfere with specific binding of two proteins is considered a great challenge, mainly due to the complexity involved in characterizing multi-molecular systems and limited understanding of the physical principles governing PPIs. Here we show that the combination of virtual screening techniques, which are capable of filtering a large library of potential small molecule inhibitors, and a unique secondary screening by isothermal titration calorimetry, a label-free method capable of observing direct interactions, is an efficient tool for finding such an inhibitor. In this study we applied this strategy in a search for a small molecule capable of interfering with the interaction of the tumor-suppressor p53 and the E3-ligase MDM2. We virtually screened a library of 15 million small molecules that were filtered to a final set of 80 virtual hits. Our in vitro experimental assay, designed to validate the activity of mixtures of compounds by isothermal titration calorimetry, was used to identify an active molecule against MDM2. At the end of the process the small molecule (4S,7R)-4-(4-chlorophenyl)-5-hydroxy-2,7-dimethyl-N-(6-methylpyridin-2-yl)-4,6,7,8-tetrahydroquinoline-3-carboxamide was found to bind MDM2 with a dissociation constant of ~2 µM. Following the identification of this single bioactive compound, spectroscopic measurements were used to further characterize the interaction of the small molecule with the target protein. 2D NMR spectroscopy was used to map the binding region of the small molecule, and fluorescence polarization measurement confirmed that it indeed competes with p53.
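
    For a simple 1:1 binding model with the ligand in excess, a dissociation constant like the reported ~2 µM translates directly into target occupancy via f = [L]/(Kd + [L]). The sketch below uses that closed form with arbitrary concentrations; actual ITC analysis fits heats of injection to a binding model rather than this expression:

```python
def fraction_bound(ligand_uM, kd_uM=2.0):
    """Equilibrium fraction of target occupied for simple 1:1 binding with
    ligand in excess: f = [L] / (Kd + [L])."""
    return ligand_uM / (kd_uM + ligand_uM)

# With Kd ~ 2 uM (as reported for the hit), occupancy at a few arbitrary
# ligand concentrations: half-saturation occurs at [L] = Kd.
for conc in (0.5, 2.0, 20.0):
    print(conc, round(fraction_bound(conc), 2))
```

    The same relation explains why a ~2 µM binder can outcompete p53 in the fluorescence polarization assay once its concentration is pushed well above Kd.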

  4. A Technique for Tracking the Reading Rate to Identify the E-Book Reading Behaviors and Comprehension Outcomes of Elementary School Students

    Science.gov (United States)

    Huang, Yueh-Min; Liang, Tsung-Ho

    2015-01-01

    Tracking individual reading behaviors is a difficult task, as is carrying out real-time recording and analysis throughout the reading process, but these aims are worth pursuing. In this study, the reading rate is adopted as an indicator to identify different reading behaviors and comprehension outcomes. A reading rate tracking technique is thus…

  5. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  6. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for the discovery, monitoring, and improvement of processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.

  7. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  8. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  9. Cluster analysis of clinical data identifies fibromyalgia subgroups.

    Directory of Open Access Journals (Sweden)

    Elisa Docampo

    Full Text Available INTRODUCTION: Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. MATERIAL AND METHODS: 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. RESULTS: Variables clustered into three independent dimensions: "symptomatology", "comorbidities" and "clinical scales". Only the two first dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. CONCLUSIONS: We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment.
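
    The final step above, grouping patients by their composite indexes, can be sketched with a plain k-means on two dimensions. The data below are invented (symptomatology, comorbidity) scores mimicking the three reported subgroups, and k-means here stands in for whatever clustering procedure the authors actually used:

```python
import random

random.seed(2)

def farthest_init(points, k):
    """Deterministic farthest-point seeding to avoid degenerate starts."""
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points, key=lambda p: min(
            (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centroids)))
    return centroids

def kmeans(points, k, iters=25):
    """Plain k-means on 2-D points; returns (centroids, labels)."""
    centroids = farthest_init(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = [min(range(k),
                      key=lambda j: (x - centroids[j][0]) ** 2
                                    + (y - centroids[j][1]) ** 2)
                  for x, y in points]
        # Move each centroid to the mean of its members.
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return centroids, labels

# Hypothetical (symptomatology, comorbidity) composite indexes mimicking
# the three reported subgroups: low/low, high/high and high/low.
cases = ([(random.gauss(-1, 0.15), random.gauss(-1, 0.15)) for _ in range(30)]
         + [(random.gauss(1, 0.15), random.gauss(1, 0.15)) for _ in range(30)]
         + [(random.gauss(1, 0.15), random.gauss(-1, 0.15)) for _ in range(30)])

centroids, labels = kmeans(cases, 3)
print(sorted(tuple(round(v, 1) for v in c) for c in centroids))
```

    With well-separated composite indexes the three recovered clusters match the low/low, high/high, and high/low pattern reported for the FM subgroups.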

  10. Analytical techniques in pharmaceutical analysis: A review

    Directory of Open Access Journals (Sweden)

    Masoom Raza Siddiqui

    2017-02-01

    Full Text Available The development of pharmaceuticals brought a revolution in human health. Pharmaceuticals serve their intent only if they are free from impurities and are administered in an appropriate amount. To make drugs serve their purpose, various chemical and instrumental methods for their estimation have been developed at regular intervals. Pharmaceuticals may develop impurities at various stages of development, transportation, and storage, which makes them risky to administer, so impurities must be detected and quantitated; analytical instrumentation and methods play an important role here. This review highlights the role of analytical instrumentation and analytical methods in assessing the quality of drugs, covering a variety of analytical techniques such as titrimetric, chromatographic, spectroscopic, electrophoretic, and electrochemical methods and their application in the analysis of pharmaceuticals.

  11. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  12. Identifying coordinative structure using principal component analysis based on coherence derived from linear systems analysis.

    Science.gov (United States)

    Wang, Xinguang; O'Dwyer, Nicholas; Halaki, Mark; Smith, Richard

    2013-01-01

    Principal component analysis is a powerful and popular technique for capturing redundancy in muscle activity and kinematic patterns. A primary limitation of the correlations or covariances between signals on which this analysis is based is that they do not account for dynamic relations between signals, yet such relations, such as that between neural drive and muscle tension, are widespread in the sensorimotor system. Low correlations may thus be obtained and signals may appear independent despite a dynamic linear relation between them. To address this limitation, linear systems analysis can be used to calculate the matrix of overall coherences between signals, which measures the strength of the relation between signals taking dynamic relations into account. Using ankle, knee, and hip sagittal-plane angles from 6 healthy subjects, the first component accounted for ~50% of total variance in the data set with conventional correlation matrices, while with overall coherence matrices the first component accounted for >95% of total variance. The results demonstrate that the dimensionality of the coordinative structure can be overestimated using conventional correlation, whereas a more parsimonious structure is identified with overall coherence.
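
    The core point, that instantaneous correlation misses dynamic linear relations, is easy to demonstrate: two signals related by a pure phase lag have near-zero Pearson correlation at lag zero but correlate almost perfectly at the right lag. The sketch below uses synthetic sinusoids as a stand-in; it is not the authors' overall-coherence computation, which works in the frequency domain:

```python
import math

def pearson(x, y):
    """Plain Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

# Two joint-angle-like signals related by a pure 90-degree phase lag
# (50 samples at this sampling step).
t = [0.01 * math.pi * i for i in range(2000)]
x = [math.sin(v) for v in t]
y = [math.sin(v - math.pi / 2) for v in t]

r0 = pearson(x, y)  # instantaneous (zero-lag) correlation: near zero
best = max(range(1, 200), key=lambda k: pearson(x[:-k], y[k:]))
r_best = pearson(x[:-best], y[best:])
print(round(r0, 3), best, round(r_best, 3))
```

    A PCA built on zero-lag correlations would treat `x` and `y` as nearly independent, inflating the apparent dimensionality, which is exactly the overestimation the abstract describes.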

  13. Analysis of an Image Secret Sharing Scheme to Identify Cheaters

    Directory of Open Access Journals (Sweden)

    Jung-San Lee

    2010-09-01

    Full Text Available Secret image sharing mechanisms have been widely applied to the military, e-commerce, and communications fields. Zhao et al. recently introduced the concept of cheater detection into image sharing schemes. This functionality enables the image owner and authorized members to identify any cheater during reconstruction of the secret image. Here, we provide an analysis of Zhao et al.'s method: an authorized participant is able to restore the secret image by him/herself. This contradicts the requirement of secret image sharing schemes. Although the authorized participant must resort to an exhaustive search to mount this attack, simulation results show that it can be completed within a reasonable time period.

  14. Identifying avian sources of faecal contamination using sterol analysis.

    Science.gov (United States)

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols which are present in faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation by targeted PCR specific markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
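
    The two avian ratios quoted above translate directly into a screening function. The thresholds are from the abstract; the sample concentrations below are invented for illustration, and in practice the check applies only after human and herbivore mammal ratios have been excluded:

```python
def avian_source(ethylcholestanol, ethylcoprostanol, ethylepicoprostanol,
                 cholestanol, coprostanol, epicoprostanol):
    """Apply the study's two avian sterol ratios; both must pass.

    avian ratio 1: 24-ethylcholestanol / (24-ethylcholestanol
                   + 24-ethylcoprostanol + 24-ethylepicoprostanol) >= 0.4
    avian ratio 2: cholestanol / (cholestanol + coprostanol
                   + epicoprostanol) >= 0.5
    """
    ratio1 = ethylcholestanol / (ethylcholestanol + ethylcoprostanol
                                 + ethylepicoprostanol)
    ratio2 = cholestanol / (cholestanol + coprostanol + epicoprostanol)
    return ratio1 >= 0.4 and ratio2 >= 0.5

# Hypothetical sterol concentrations (e.g. ng/g) for two water samples:
# one gull-like profile and one herbivore-like profile.
print(avian_source(50, 40, 10, 60, 30, 10))
print(avian_source(20, 70, 10, 20, 70, 10))
```

    When the function returns True, the abstract suggests confirming with targeted PCR markers before attributing the pollution to birds.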

  15. Cluster Analysis of Clinical Data Identifies Fibromyalgia Subgroups

    Science.gov (United States)

    Docampo, Elisa; Collado, Antonio; Escaramís, Geòrgia; Carbonell, Jordi; Rivera, Javier; Vidal, Javier; Alegre, José

    2013-01-01

    Introduction Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. Material and Methods 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. Results Variables clustered into three independent dimensions: “symptomatology”, “comorbidities” and “clinical scales”. Only the two first dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. Conclusions We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment. PMID:24098674

  16. Archetypal TRMM Radar Profiles Identified Through Cluster Analysis

    Science.gov (United States)

    Boccippio, Dennis J.

    2003-01-01

    It is widely held that identifiable 'convective regimes' exist in nature, although precise definitions of these are elusive. Examples include land/ocean distinctions, break/monsoon behavior, seasonal differences in the Amazon (SON vs DJF), etc. These regimes are often described by differences in the realized local convective spectra, and measured by various metrics of convective intensity, depth, areal coverage and rainfall amount. Objective regime identification may be valuable in several ways: regimes may serve as natural 'branch points' in satellite retrieval algorithms or data assimilation efforts; one example might be objective identification of regions that 'should' share a similar Z-R relationship. Similarly, objectively defined regimes may provide guidance on optimal siting of ground validation efforts. Objectively defined regimes could also serve as natural (rather than arbitrary geographic) domain 'controls' in studies of convective response to environmental forcing. Quantification of convective vertical structure has traditionally involved parametric study of prescribed quantities thought to be important to convective dynamics: maximum radar reflectivity, cloud top height, 30-35 dBZ echo top height, rain rate, etc. Individually, these parameters are somewhat deficient, as their interpretation is often nonunique (the same metric value may signify different physics in different storm realizations). Individual metrics also fail to capture the coherence and interrelationships between vertical levels available in full 3-D radar datasets. An alternative approach is discovery of natural partitions of vertical structure in a globally representative dataset, or 'archetypal' reflectivity profiles. In this study, this is accomplished through cluster analysis of a very large sample (on the order of 10^7) of TRMM-PR reflectivity columns. Once achieved, the rain-conditional and unconditional 'mix' of archetypal profile types in a given location and/or season provides a description
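Once archetypal profiles are available, computing the 'mix' of profile types at a location reduces to nearest-archetype assignment. The sketch below assumes profiles are equal-length lists of reflectivity values (dBZ per vertical level); the archetypes themselves would come from a prior cluster analysis like the one described:

```python
def nearest_archetype(profile, archetypes):
    """Index of the archetypal reflectivity profile closest (Euclidean)
    to a given vertical profile."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(archetypes)), key=lambda i: d2(profile, archetypes[i]))

def regime_mix(profiles, archetypes):
    """Fraction of columns assigned to each archetype: the 'mix' that
    describes a location's or season's convective regime."""
    counts = [0] * len(archetypes)
    for p in profiles:
        counts[nearest_archetype(p, archetypes)] += 1
    n = len(profiles)
    return [c / n for c in counts]
```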

  17. Network Analysis Identifies Disease-Specific Pathways for Parkinson's Disease.

    Science.gov (United States)

    Monti, Chiara; Colugnat, Ilaria; Lopiano, Leonardo; Chiò, Adriano; Alberio, Tiziana

    2016-12-21

    Neurodegenerative diseases are characterized by the progressive loss of specific neurons in selected regions of the central nervous system. The main clinical manifestation (movement disorders, cognitive impairment, and/or psychiatric disturbances) depends on the neuron population being primarily affected. Parkinson's disease is a common movement disorder, whose etiology remains mostly unknown. Progressive loss of dopaminergic neurons in the substantia nigra causes an impairment of the motor control. Some of the pathogenetic mechanisms causing the progressive deterioration of these neurons are not specific for Parkinson's disease but are shared by other neurodegenerative diseases, like Alzheimer's disease and amyotrophic lateral sclerosis. Here, we performed a meta-analysis of the literature of all the quantitative proteomic investigations of neuronal alterations in different models of Parkinson's disease, Alzheimer's disease, and amyotrophic lateral sclerosis to distinguish between general and Parkinson's disease-specific pattern of neurodegeneration. Then, we merged proteomics data with genetics information from the DisGeNET database. The comparison of gene and protein information allowed us to identify 25 proteins involved uniquely in Parkinson's disease and we verified the alteration of one of them, i.e., transaldolase 1 (TALDO1), in the substantia nigra of 5 patients. By using open-source bioinformatics tools, we identified the biological processes specifically affected in Parkinson's disease, i.e., proteolysis, mitochondrion organization, and mitophagy. Eventually, we highlighted four cellular component complexes mostly involved in the pathogenesis: the proteasome complex, the protein phosphatase 2A, the chaperonins CCT complex, and the complex III of the respiratory chain.

  18. Social network analysis in identifying influential webloggers: A preliminary study

    Science.gov (United States)

    Hasmuni, Noraini; Sulaiman, Nor Intan Saniah; Zaibidi, Nerda Zura

    2014-12-01

    In recent years, second-generation internet-based services such as weblogs have become an effective communication tool for publishing information on the Web. Weblogs have unique characteristics that deserve users' attention. Some webloggers see weblogs as an appropriate medium to initiate and expand a business. These webloggers, also known as direct profit-oriented webloggers (DPOWs), communicate and share knowledge with each other through social interaction. However, survivability is the main issue among DPOWs, and frequent communication with influential webloggers is one way to survive as a DPOW. This paper aims to understand the network structure and identify influential webloggers within the network. Proper understanding of the network structure can assist us in knowing how information is exchanged among members and enhance survivability among DPOWs. Thirty DPOWs were involved in this study. Degree centrality and betweenness centrality measurements in Social Network Analysis (SNA) were used to examine the strength of relations and identify influential webloggers within the network. Webloggers with the highest values of these measurements are considered the most influential webloggers in the network.
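The two centrality measures named above are standard SNA quantities. A minimal stdlib sketch on an undirected adjacency dict is shown below (libraries such as NetworkX provide the same measures ready-made); the betweenness routine is Brandes' algorithm for unweighted graphs:

```python
from collections import deque

def degree_centrality(adj):
    """Degree centrality: degree / (n - 1), on an undirected adjacency dict."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def betweenness_centrality(adj):
    """Brandes' algorithm for unweighted betweenness on an undirected graph."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}
        dist = {v: -1 for v in adj}
        sigma[s], dist[s] = 1, 0
        q = deque([s])
        while q:                       # BFS, counting shortest paths
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                   # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: b / 2 for v, b in bc.items()}  # undirected: each pair counted twice
```

The most influential weblogger would then be the node maximizing either measure.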

  19. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the influence of a layout's configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  20. Using text-mining techniques in electronic patient records to identify ADRs from medicine use

    DEFF Research Database (Denmark)

    Warrer, Pernille; Hansen, Ebba Holme; Jensen, Lars Juhl

    2012-01-01

    This literature review included studies that use text-mining techniques in narrative documents stored in electronic patient records (EPRs) to investigate ADRs. We searched PubMed, Embase, Web of Science and International Pharmaceutical Abstracts without restrictions from origin until July 2011. We......, medicines associated with ADRs, text-mining algorithms used and their performance. Seven studies, all from the United States, were eligible for inclusion in the review. Studies were published from 2001, the majority between 2009 and 2010. Text-mining techniques varied over time from simple free text...... searching of outpatient visit notes and inpatient discharge summaries to more advanced techniques involving natural language processing (NLP) of inpatient discharge summaries. Performance appeared to increase with the use of NLP, although many ADRs were still missed. Due to differences in study design...

  1. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to determine the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models, and the possible improvements over these models through 3D modeling, are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found; these can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have recently been developed, but not for floodplains. Hence, it is suggested that a 3D floodplain model should be developed, considering all the hydrological and high-resolution topographic parameters discussed in this review, to enhance understanding of the causes and effects of flooding.

  2. Function Analysis and Decomposition using Function Analysis Systems Technique

    Energy Technology Data Exchange (ETDEWEB)

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high-demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Miles' function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high-level, or objective, function into secondary and lower-level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.
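The how/why logic of a FAST model can be illustrated with a small function tree: moving from the objective function toward lower-level functions answers "how?", and moving back answers "why?". The example functions below are invented for illustration, not drawn from Bytheway's original examples:

```python
# A FAST model as a dict mapping each function to the lower-level
# functions that accomplish it.  All function names are hypothetical.
fast_model = {
    "transmit torque": ["engage gears", "align shafts"],
    "engage gears": ["mesh teeth"],
    "align shafts": [],
    "mesh teeth": [],
}

def how(fn):
    """Lower-level functions answering 'How is this function accomplished?'"""
    return fast_model.get(fn, [])

def why(fn):
    """Higher-level functions answering 'Why is this function performed?'"""
    return [parent for parent, kids in fast_model.items() if fn in kids]
```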

  4. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Full Text Available Interpreting the signals generated by the brain is one of the main tasks in brain signal processing. Among neurological disorders, epilepsy is one of the most prevalent, and an automated artificial-intelligence detection technique is essential because of the irregular and unpredictable occurrence of epileptic seizures. We propose an improved fuzzy firefly algorithm, which enhances the classification of the brain signal efficiently with minimal iterations. An important clustering technique based on fuzzy logic is Fuzzy C-means. Features obtained from multichannel EEG signals were combined in both the feature domain and the spatial domain by means of fuzzy algorithms. The firefly algorithm is applied to optimize the Fuzzy C-means membership function for a more precise segmentation process, and convergence criteria are set for efficient clustering. On the whole, the proposed technique yields more accurate results, which gives it an edge over other techniques. The results of the proposed algorithm are compared with those of other algorithms, such as the fuzzy C-means algorithm and the PSO algorithm.
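The baseline Fuzzy C-means updates that the proposed method optimizes can be sketched as follows. This is a plain 1-D FCM alternating the membership and centre updates, not the firefly-optimized variant described in the abstract, and the initialization is an arbitrary choice of the sketch:

```python
def fuzzy_c_means(xs, c=2, m=2.0, iters=100, eps=1e-9):
    """Plain Fuzzy C-means on 1-D data with fuzzifier m.

    Alternates the standard membership update
        u[i][k] = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
    and the fuzzy-mean centre update.
    """
    centers = [min(xs), max(xs)] if c == 2 else list(xs[:c])
    u = [[0.0] * len(xs) for _ in range(c)]
    for _ in range(iters):
        for k, x in enumerate(xs):               # membership update
            d = [abs(x - ck) + eps for ck in centers]
            for i in range(c):
                u[i][k] = 1.0 / sum((d[i] / dj) ** (2 / (m - 1)) for dj in d)
        centers = [                              # centre update (fuzzy mean)
            sum(u[i][k] ** m * x for k, x in enumerate(xs))
            / sum(u[i][k] ** m for k in range(len(xs)))
            for i in range(c)
        ]
    return centers, u
```

In the paper's setting, the firefly algorithm would search over these memberships/centres instead of (or in addition to) the fixed-point iteration shown here.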

  5. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  6. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  7. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security issue. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  8. Global secretome analysis identifies novel mediators of bone metastasis

    Institute of Scientific and Technical Information of China (English)

    Mario Andres Blanco; Gary LeRoy; Zia Khan; Maša Alečković; Barry M Zee; Benjamin A Garcia; Yibin Kang

    2012-01-01

    Bone is one of the most common sites of distant metastasis of solid tumors. Secreted proteins are known to influence pathological interactions between metastatic cancer cells and the bone stroma. To comprehensively profile secreted proteins associated with bone metastasis, we used quantitative and non-quantitative mass spectrometry to globally analyze the secretomes of nine cell lines of varying bone metastatic ability from multiple species and cancer types. By comparing the secretomes of parental cells and their bone metastatic derivatives, we identified the secreted proteins that were uniquely associated with bone metastasis in these cell lines. We then incorporated bioinformatic analyses of large clinical metastasis datasets to obtain a list of candidate novel bone metastasis proteins of several functional classes that were strongly associated with both clinical and experimental bone metastasis. Functional validation of selected proteins indicated that in vivo bone metastasis can be promoted by high expression of (1) the salivary cystatins CST1, CST2, and CST4; (2) the plasminogen activators PLAT and PLAU; or (3) the collagen functionality proteins PLOD2 and COL6A1. Overall, our study has uncovered several new secreted mediators of bone metastasis and therefore demonstrated that secretome analysis is a powerful method for identification of novel biomarkers and candidate therapeutic targets.

  9. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product’s environmental impacts through the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and the results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach which can reflect the relationship between the design parameters and a product’s environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product’s environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
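A minimal one-at-a-time sensitivity sketch in the spirit of the proposed approach is shown below. The model and parameter names are invented placeholders, not the paper's LCA model of the PCB case study; `model` stands in for whatever function maps design parameters to an environmental-impact score:

```python
def one_at_a_time_sensitivity(model, params, delta=0.1):
    """Rank design parameters by the relative change in the model output
    when each is perturbed by +/-delta (10% by default) in turn.

    `model` is any callable mapping a parameter dict to a single
    environmental-impact score.  Returns {name: sensitivity}, sorted
    from most to least sensitive.
    """
    base = model(params)
    sens = {}
    for name, value in params.items():
        up = dict(params, **{name: value * (1 + delta)})
        down = dict(params, **{name: value * (1 - delta)})
        # central-difference estimate of relative sensitivity
        sens[name] = abs(model(up) - model(down)) / (2 * delta * abs(base))
    return dict(sorted(sens.items(), key=lambda kv: -kv[1]))
```

The top-ranked parameter is the one a designer "without much environmental knowledge" would consider first.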

  10. ION COMPOSITION ELUCIDATION (ICE): A HIGH RESOLUTION MASS SPECTROMETRIC TECHNIQUE FOR IDENTIFYING COMPOUNDS IN COMPLEX MIXTURES

    Science.gov (United States)

    When tentatively identifying compounds in complex mixtures using mass spectral libraries, multiple matches or no plausible matches due to a high level of chemical noise or interferences can occur. Worse yet, most analytes are not in the libraries. In each case, Ion Composition El...

  11. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature of, and anonymity abuses in, cyberspace, it is difficult to trace criminal identities in cybercrime investigations. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author’s writing style. The technique of IG seeded GA based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author-level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real-world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable in terms of the number of authors than author-group-level based methods.
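The character n-gram representation can be made concrete with a far simpler nearest-profile baseline than the paper's IGAE ensemble; everything below is a generic illustration, not the paper's method:

```python
from collections import Counter
import math

def char_ngrams(text, n=3):
    """Character n-gram counts: a simple writing-style representation."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    """Cosine similarity between two n-gram count vectors."""
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    return dot / (math.sqrt(sum(v * v for v in a.values()))
                  * math.sqrt(sum(v * v for v in b.values())))

def attribute(anonymous_text, author_samples, n=3):
    """Attribute an anonymous text to the author whose sample profile is
    most similar -- a nearest-profile baseline, not an ensemble classifier."""
    probe = char_ngrams(anonymous_text, n)
    return max(author_samples,
               key=lambda a: cosine(probe, char_ngrams(author_samples[a], n)))
```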

  12. Real-time analysis application for identifying bursty local areas related to emergency topics.

    Science.gov (United States)

    Sakai, Tatsuhiro; Tamura, Keiichi

    2015-01-01

    Since social media started getting more attention from users on the Internet, it has become one of the most important information sources in the world. With the increasing popularity of social media, data posted on social media sites are rapidly becoming collective intelligence, a term used to refer to new media that is displacing traditional media. In this paper, we focus on geotagged tweets on the Twitter site. These geotagged tweets are referred to as georeferenced documents because they include not only a short text message, but also the document's posting time and location. Many researchers have been tackling the development of new data mining techniques for georeferenced documents to identify and analyze emergency topics, such as natural disasters, weather, diseases, and other incidents. In particular, the utilization of geotagged tweets to identify and analyze natural disasters has recently received much attention from administrative agencies because some case studies have achieved compelling results. In this paper, we propose a novel real-time analysis application for identifying bursty local areas related to emergency topics. The aim of our new application is to provide a platform that can identify and analyze the localities of emergency topics. The proposed application is composed of three core computational intelligence techniques: the Naive Bayes classifier, spatiotemporal clustering, and burst detection. Moreover, we have implemented two types of application interface: a Web application interface and an Android application interface. To evaluate the proposed application, we implemented a real-time weather observation system embedding the proposed application, using actual geotagged tweets crawled from the Twitter site. The weather observation system successfully detected bursty local areas related to observed emergency weather topics.
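As an illustration of the burst-detection component, the sketch below flags time steps whose tweet count exceeds a trailing-window baseline. This is a simple stand-in rule; the paper's exact burst-detection method is not specified in the abstract, and the window length and threshold are arbitrary choices:

```python
import statistics

def detect_bursts(counts, window=24, k=3.0):
    """Flag time steps whose count exceeds the trailing-window mean by
    more than k standard deviations (a minimal burst-detection stand-in)."""
    bursts = []
    for t in range(window, len(counts)):
        hist = counts[t - window:t]
        mu = statistics.fmean(hist)
        sd = statistics.pstdev(hist)
        # floor sd to avoid flagging tiny wobbles in near-constant traffic
        if counts[t] > mu + k * max(sd, 1.0):
            bursts.append(t)
    return bursts
```

In the application described above, such a detector would run per local area on the geotagged tweet stream produced by the classifier and clustering stages.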

  14. Identifying desertification risk areas using fuzzy membership and geospatial technique – A case study, Kota District, Rajasthan

    Indian Academy of Sciences (India)

    Arunima Dasgupta; K L N Sastry; P S Dhinwa; V S Rathore; M S Nathawat

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research intends to identify the areas under risk of desertification, along with their severity in terms of degradation in natural parameters. An integrated model with fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, including five specific natural parameters, namely slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a certain class was derived using the normal probability density function of that class. Thus, if a single class of a single parameter has mean μ and standard deviation σ, the values falling beyond μ + 2σ and μ − 2σ do not represent that class, but a transitional zone between two subsequent classes. These are the most important areas in terms of degradation, as they have the lowest probability of being in a certain class, hence the highest probability of being extended into the next class or narrowed down into the previous one. Consequently, these are the values which can be most easily altered under exogenic influences, and hence are identified as risk areas. The overall desertification risk is derived by incorporating the different risk severities of each parameter using a fuzzy rule-based inference system in a GIS environment. Multicriteria-based geo-statistics are applied to locate the areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures are accelerating land deterioration, coupled with natural erosive forces. The four major sources of desertification in Kota are gully and ravine erosion, inappropriate mining practices, growing urbanization and random deforestation.
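The membership and risk-zone rule described above can be sketched directly from the normal density, with μ and σ the mean and standard deviation of a parameter class; the function names are illustrative:

```python
import math

def class_membership(x, mu, sigma):
    """Membership of a value in a class modelled as Normal(mu, sigma),
    via the normal probability density function as in the abstract."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def is_risk_zone(x, mu, sigma):
    """Values beyond mu +/- 2*sigma fall in the transitional zone between
    subsequent classes and are flagged as desertification-risk values."""
    return abs(x - mu) > 2 * sigma
```

In the full model, such per-parameter flags for slope, soil pH, soil depth, soil texture and NDVI would be combined by the fuzzy rule-based inference system in the GIS environment.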

  15. Application of an improved cDNA competition technique to identify prostate cancer-associated gene.

    Science.gov (United States)

    Rinaldy, A R; Steiner, M S

    1999-11-01

    A technique to improve cDNA library screening was developed by using mixed probes derived from two closely related cDNA populations of high-metastatic MAT-LyLu and low-metastatic AT-1 Dunning R3227 rat prostate cancer sublines. The technique required the generation of a cDNA library from each subline followed by polymerase chain reaction (PCR) amplification of the cDNA insert population. The PCR products derived from the first library were radiolabeled and mixed with an excess amount of PCR products from the second library. The mixture and an excess amount of both the lambda and pBluescript DNA were used as a probe to screen the first cDNA library. This mixed probe (designated the competition probe) differentially cross-hybridized with the plaque lift of the screened first cDNA library. Weak radioactive signals indicated the cross-hybridization of cDNA sequences common to the competition probe mixture and the first cDNA library, whereas strong signals implied unhybridized unique or abundant cDNA sequences in the first cDNA library. The reproducibility of this technique was confirmed by showing that the full-length cDNA clones were associated with the phenotype of the screened first cell line. The isolated clones were characterized as rat nucleolar protein, rat mitochondrial genes coding for 16S and 12S rRNAs, and rat tRNAs specific for valine and phenylalanine. This result is consistent with the fact that the first cell line, MAT-LyLu, is metabolically more active than are AT-1 cells because of higher gene dosage or amplification of nucleolar and mitochondrial RNA and its associated genes. Another clone which had a strong signal represented a novel gene associated with the MAT-LyLu cancer phenotype.

  16. Trials to identify irradiated chestnut (Castanea bungena) with different analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chung, H.-W. E-mail: chunghw@kfda.go.kr; Delincee, Henry; Han, S.-B.; Hong, J.-H.; Kim, H.-Y.; Kim, M.-C.; Byun, M.-W.; Kwon, J.-H

    2004-10-01

    Photostimulated luminescence (PSL) measurement, DNA comet assay, electron spin resonance (ESR) spectroscopy and thermoluminescence (TL) measurement were applied to identify irradiated chestnut. Samples were irradiated with ⁶⁰Co γ-rays at 0-0.5 kGy. The PSL photon counts for irradiated chestnuts were too low to be distinguished from those of the non-irradiated sample. There was no difference in DNA comets between non-irradiated and irradiated chestnuts. ESR spectroscopy did not show any radiation-induced specific signals but a symmetric singlet. However, using TL, the shape of the glow curve (Glow 1) made it possible to identify the irradiated chestnuts. In addition, the TL glow ratio (Glow 1/Glow 2) obtained by normalization was less than 0.01 for the non-irradiated sample and ≥ 0.10 for the irradiated ones.

  17. BaTMAn: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
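    The core idea, merging spatial elements as long as they remain statistically consistent within their errors, can be caricatured in one dimension. This is an illustrative sketch only, not BaTMAn's actual algorithm (the real code works on multi-images and arbitrary tessellations):

```python
import numpy as np

def merge_consistent(values, errors, k=1.0):
    """Iteratively merge adjacent elements whose means are statistically
    consistent (difference smaller than k times the combined error),
    a 1-D caricature of BaTMAn-style tessellation."""
    segs = list(zip(values, errors))
    merged = True
    while merged:
        merged = False
        out, i = [], 0
        while i < len(segs):
            if i + 1 < len(segs):
                (v1, e1), (v2, e2) = segs[i], segs[i + 1]
                if abs(v1 - v2) < k * np.hypot(e1, e2):
                    # inverse-variance weighted merge of the two elements
                    w1, w2 = 1 / e1**2, 1 / e2**2
                    out.append(((w1 * v1 + w2 * v2) / (w1 + w2),
                                1 / np.sqrt(w1 + w2)))
                    i += 2
                    merged = True
                    continue
            out.append(segs[i])
            i += 1
        segs = out
    return segs

# two flat plateaus with small noise collapse into two segments
segs = merge_consistent([1.0, 1.1, 0.9, 5.0, 5.2], [0.2] * 5)
print(len(segs))  # 2
```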

  18. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.

  19. Image processing techniques for identifying Mycobacterium tuberculosis in Ziehl-Neelsen stains.

    Science.gov (United States)

    Sadaphal, P; Rao, J; Comstock, G W; Beg, M F

    2008-05-01

    Worldwide, laboratory technicians tediously read sputum smears for tuberculosis (TB) diagnosis. We demonstrate proof of principle of an innovative computational algorithm that successfully recognizes Ziehl-Neelsen (ZN) stained acid-fast bacilli (AFB) in digital images. Automated, multi-stage, color-based Bayesian segmentation identified possible 'TB objects', removed artifacts by shape comparison and color-labeled objects as 'definite', 'possible' or 'non-TB', bypassing photomicrographic calibration. Superimposed AFB clusters, extreme stain variation and low depth of field were challenges. Our novel method facilitates electronic diagnosis of TB, permitting wider application in developing countries where fluorescent microscopy is currently inaccessible and unaffordable. We plan refinement and validation in the future.
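    The color-based Bayesian segmentation step can be illustrated with a toy maximum-likelihood pixel classifier. The RGB training values below are invented stand-ins for ZN-stain colors, not data from the study:

```python
import numpy as np

def fit_gaussian(pixels):
    """Per-channel Gaussian model of an RGB class from training pixels."""
    pixels = np.asarray(pixels, dtype=float)
    return pixels.mean(axis=0), pixels.std(axis=0) + 1e-6

def log_likelihood(pixel, mu, sigma):
    """Log of a diagonal-covariance Gaussian density at `pixel`."""
    z = (np.asarray(pixel, dtype=float) - mu) / sigma
    return float(-0.5 * np.sum(z**2) - np.sum(np.log(sigma)))

# hypothetical training pixels: magenta-ish bacilli vs. pale background
bacilli_mu, bacilli_sd = fit_gaussian([[180, 40, 160], [170, 50, 150], [190, 45, 170]])
backgr_mu, backgr_sd = fit_gaussian([[230, 225, 220], [240, 235, 230], [235, 230, 225]])

def classify(pixel):
    """Maximum-likelihood class label (equal priors assumed)."""
    lb = log_likelihood(pixel, bacilli_mu, bacilli_sd)
    lg = log_likelihood(pixel, backgr_mu, backgr_sd)
    return "possible TB object" if lb > lg else "background"

print(classify([185, 42, 165]))  # magenta pixel
```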

  20. Combining digital watermarking and fingerprinting techniques to identify copyrights for color images.

    Science.gov (United States)

    Hsieh, Shang-Lin; Chen, Chun-Che; Shen, Wen-Shan

    2014-01-01

    This paper presents a copyright identification scheme for color images that takes advantage of the complementary nature of watermarking and fingerprinting. It utilizes an authentication logo and the extracted features of the host image to generate a fingerprint, which is then stored in a database and also embedded in the host image to produce a watermarked image. When a dispute over the copyright of a suspect image occurs, the image is first processed by watermarking. If the watermark can be retrieved from the suspect image, the copyright can then be confirmed; otherwise, the watermark then serves as the fingerprint and is processed by fingerprinting. If a match in the fingerprint database is found, then the suspect image will be considered a duplicated one. Because the proposed scheme utilizes both watermarking and fingerprinting, it is more robust than those that only adopt watermarking, and it can also obtain the preliminary result more quickly than those that only utilize fingerprinting. The experimental results show that when the watermarked image suffers slight attacks, watermarking alone is enough to identify the copyright. The results also show that when the watermarked image suffers heavy attacks that render watermarking incompetent, fingerprinting can successfully identify the copyright, hence demonstrating the effectiveness of the proposed scheme.
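    The dispute-resolution flow of the scheme, watermark first and fingerprint as fallback, reduces to a simple decision function. This is a toy sketch with string stand-ins for the actual watermark retrieval and fingerprint matching:

```python
def resolve_copyright(extracted_mark, auth_logo, fingerprint_db):
    """Dispute-resolution flow of the scheme above (toy sketch).
    If the retrieved watermark matches the authentication logo, the
    copyright is confirmed; otherwise the retrieved bits serve as a
    fingerprint and are looked up in the database."""
    if extracted_mark == auth_logo:
        return "copyright confirmed by watermark"
    # watermark damaged by heavy attacks: fall back to fingerprinting
    if extracted_mark in fingerprint_db:
        return "duplicate identified by fingerprint"
    return "no match"

print(resolve_copyright("LOGO", "LOGO", set()))     # watermark survives a slight attack
print(resolve_copyright("L0G?", "LOGO", {"L0G?"}))  # heavy attack, fingerprint match
```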

  1. The use of deconvolution techniques to identify the fundamental mixing characteristics of urban drainage structures.

    Science.gov (United States)

    Stovin, V R; Guymer, I; Chappell, M J; Hattersley, J G

    2010-01-01

    Mixing and dispersion processes affect the timing and concentration of contaminants transported within urban drainage systems. Hence, methods of characterising the mixing effects of specific hydraulic structures are of interest to drainage network modellers. Previous research, focusing on surcharged manholes, utilised the first-order Advection-Dispersion Equation (ADE) and Aggregated Dead Zone (ADZ) models to characterise dispersion. However, although systematic variations in travel time as a function of discharge and surcharge depth have been identified, the first order ADE and ADZ models do not provide particularly good fits to observed manhole data, which means that the derived parameter values are not independent of the upstream temporal concentration profile. An alternative, more robust, approach utilises the system's Cumulative Residence Time Distribution (CRTD), and the solute transport characteristics of a surcharged manhole have been shown to be characterised by just two dimensionless CRTDs, one for pre- and the other for post-threshold surcharge depths. Although CRTDs corresponding to instantaneous upstream injections can easily be generated using Computational Fluid Dynamics (CFD) models, the identification of CRTD characteristics from non-instantaneous and noisy laboratory data sets has been hampered by practical difficulties. This paper shows how a deconvolution approach derived from systems theory may be applied to identify the CRTDs associated with urban drainage structures.
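    The deconvolution idea, recovering a residence-time distribution from non-instantaneous upstream and downstream traces, can be sketched as a linear least-squares problem. This is a basic unregularised illustration, not the systems-theoretic method of the paper:

```python
import numpy as np

def deconvolve_rtd(upstream, downstream, n):
    """Estimate a system's residence-time distribution h (length n) from
    upstream/downstream solute traces by solving the discrete convolution
    downstream = upstream * h in a least-squares sense."""
    upstream = np.asarray(upstream, float)
    downstream = np.asarray(downstream, float)
    m = len(downstream)
    # build the convolution (Toeplitz-like) matrix A with A[i, j] = upstream[i - j]
    A = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            if 0 <= i - j < len(upstream):
                A[i, j] = upstream[i - j]
    h, *_ = np.linalg.lstsq(A, downstream, rcond=None)
    return h

# toy check: recover a known kernel from its own convolution
u = [0.0, 1.0, 2.0, 1.0, 0.0]
d = np.convolve(u, [0.5, 0.3, 0.2])
print(np.round(deconvolve_rtd(u, d, 3), 3))  # ≈ [0.5, 0.3, 0.2]
```

The cumulative sum of the recovered distribution then gives the CRTD discussed above.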

  2. Combining Digital Watermarking and Fingerprinting Techniques to Identify Copyrights for Color Images

    Directory of Open Access Journals (Sweden)

    Shang-Lin Hsieh

    2014-01-01

    Full Text Available This paper presents a copyright identification scheme for color images that takes advantage of the complementary nature of watermarking and fingerprinting. It utilizes an authentication logo and the extracted features of the host image to generate a fingerprint, which is then stored in a database and also embedded in the host image to produce a watermarked image. When a dispute over the copyright of a suspect image occurs, the image is first processed by watermarking. If the watermark can be retrieved from the suspect image, the copyright can then be confirmed; otherwise, the watermark then serves as the fingerprint and is processed by fingerprinting. If a match in the fingerprint database is found, then the suspect image will be considered a duplicated one. Because the proposed scheme utilizes both watermarking and fingerprinting, it is more robust than those that only adopt watermarking, and it can also obtain the preliminary result more quickly than those that only utilize fingerprinting. The experimental results show that when the watermarked image suffers slight attacks, watermarking alone is enough to identify the copyright. The results also show that when the watermarked image suffers heavy attacks that render watermarking incompetent, fingerprinting can successfully identify the copyright, hence demonstrating the effectiveness of the proposed scheme.

  3. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Field (MRF) modeling, watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by the MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is then used. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about possible region boundaries for the next step (MRF), which gives an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
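    A minimal stand-in for the DIS map, per-pixel gradient strength from forward differences, might look like this (illustrative only; the paper's DIS computation is more involved):

```python
import numpy as np

def dis_map(img):
    """Difference-in-strength style map: per-pixel gradient magnitude
    from simple forward differences (a minimal stand-in for the DIS map;
    real implementations use proper edge operators)."""
    img = np.asarray(img, float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    return np.hypot(gx, gy)

# a vertical step edge: the map responds only along the boundary column
img = np.zeros((4, 6))
img[:, 3:] = 10.0
print(dis_map(img)[0])  # strongest response at column 2
```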

  4. Perceived Effectiveness of Identified Methods and Techniques Teachers Adopt in Prose Literature Lessons in some Secondary Schools in Owerri

    Directory of Open Access Journals (Sweden)

    F. O. Ezeokoli

    2016-07-01

    Full Text Available The study determined the methods adopted by teachers in prose literature-in-English classrooms, the activities of teachers and students, and teachers' perceived effectiveness of the techniques used. It also examined the objectives of teaching prose literature that teachers should address and the extent to which teachers believe in student-identified difficulties of studying prose literature. The study adopted the descriptive survey research design. A purposive sampling technique was used to select 85 schools in the Owerri metropolis and, in each school, all literature teachers of senior secondary I and II were involved. In all, 246 literature teachers participated, out of which 15 were purposively selected for observation. The two instruments were: Teachers' Questionnaire (r = 0.87) and Classroom Observation Schedule (r = 0.73). Data were analysed using frequency counts and percentages. Results revealed that teachers adopted lecture (28.4%), reading (10.9%) and discussion (7.3%) methods. Teachers' activities during the lesson include giving background information, summarizing, dictating notes, reading aloud, explaining and asking questions. The adopted techniques include questioning, oral reading, silent reading and discussion. Teachers perceived questioning as the most effective technique, followed by debating and summarizing. Teachers identified the development of students' critical faculties, analytical skills, literary appreciation and language skills to be of utmost concern. It was concluded that the methods adopted by teachers are not diverse enough to cater for the needs and backgrounds of students. Keywords: Methods, Techniques, Perceived Effectiveness, Objectives, Literature-in-English

  5. Identifying Ecosystem Services of Rivers and Streams Through Content Analysis

    Science.gov (United States)

    While much ecosystem services research focuses on analysis such as mapping and/or valuation, fewer research efforts are directed toward in-depth understanding of the specific ecological quantities people value. Ecosystem service monitoring and analysis efforts and communications ...

  6. Toxigenic Vibrio cholerae identified in estuaries of Tanzania using PCR techniques.

    Science.gov (United States)

    Dalusi, Lucy; Lyimo, Thomas J; Lugomela, Charles; Hosea, Ken M M; Sjöling, Sara

    2015-03-01

    The current study assessed the occurrence of the Vibrio cholerae serogroups O1 and O139 in environmental samples along salinity gradients in three selected estuaries of Tanzania both through culture independent methods and by cultured bacteria. Occurrence of V. cholerae was determined by PCR targeting the V. cholerae outer membrane protein gene ompW. Furthermore, the presence of toxigenic strains and serogroups O1 and O139 was determined using multiplex PCR with specific primers targeting the cholera toxin gene subunit A, ctxA, and serotype specific primers, O1-rfb and O139-rfb, respectively. Results showed that V. cholerae occurred in approximately 10% (n = 185) of both the environmental samples and isolated bacteria. Eight of the bacteria isolates (n = 43) were confirmed as serogroup O1 while one belonged to serogroup O139, the first reported identification of this epidemic strain in East African coastal waters. All samples identified as serogroup O1 or O139 and a number of non-O1/O139 strains were ctxA positive. This study provides in situ evidence of the presence of pathogenic V. cholerae O1 and O139 and a number of V. cholerae non-O1/O139 that carry the cholera toxin gene in estuaries along the coast of Tanzania.

  7. A computational technique to identify the optimal stiffness matrix for a discrete nuclear fuel assembly model

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu, E-mail: nkpark@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Joo, E-mail: kyoungjoo@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Hong, E-mail: kyounghong@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Suh, Jung-Min, E-mail: jmsuh@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)

    2013-02-15

    Highlights: ► An identification method of the optimal stiffness matrix for a fuel assembly structure is discussed. ► The least squares optimization method is introduced, and a closed form solution of the problem is derived. ► The method can be expanded to the system with the limited number of modes. ► Identification error due to the perturbed mode shape matrix is analyzed. ► Verification examples show that the proposed procedure leads to a reliable solution. -- Abstract: A reactor core structural model which is used to evaluate the structural integrity of the core contains nuclear fuel assembly models. Since the reactor core consists of many nuclear fuel assemblies, the use of a refined fuel assembly model leads to a considerable amount of computing time for performing nonlinear analyses such as the prediction of seismic induced vibration behaviors. The computational time could be reduced by replacing the detailed fuel assembly model with a simplified model that has fewer degrees of freedom, but the dynamic characteristics of the detailed model must be maintained in the simplified model. Such a model based on an optimal design method is proposed in this paper. That is, when a mass matrix and a mode shape matrix are given, the optimal stiffness matrix of a discrete fuel assembly model can be estimated by applying the least squares minimization method. The verification of the method is completed by comparing test results and simulation results. This paper shows that the simplified model's dynamic behaviors are quite similar to experimental results and that the suggested method is suitable for identifying reliable mathematical model for fuel assemblies.
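    The least-squares identification step can be sketched with NumPy: given a mass matrix, mode shapes and natural frequencies, solve K·Φ = M·Φ·Λ for the stiffness matrix. This is an unconstrained toy version; the paper derives a closed-form solution with additional structure:

```python
import numpy as np

def estimate_stiffness(M, Phi, omega2):
    """Least-squares stiffness estimate from a known mass matrix M,
    mode-shape matrix Phi (columns = modes) and squared natural
    frequencies omega2, using K @ Phi = M @ Phi @ diag(omega2)."""
    F = M @ Phi @ np.diag(omega2)
    # solve Phi.T @ K.T = F.T for K.T in the least-squares sense
    K_T, *_ = np.linalg.lstsq(Phi.T, F.T, rcond=None)
    return K_T.T

# toy 2-DOF check: recover K from its own eigen-decomposition
M = np.diag([2.0, 1.0])
K_true = np.array([[300.0, -100.0], [-100.0, 100.0]])
omega2, Phi = np.linalg.eig(np.linalg.inv(M) @ K_true)
K_est = estimate_stiffness(M, Phi, omega2)
print(np.allclose(K_est, K_true))  # True
```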

  8. Classification Techniques for Multivariate Data Analysis.

    Science.gov (United States)

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern...the determinantal equation: |B − λW| = 0 (42). The solutions λ_i are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non...Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor

  9. Investigation into the use of the CUSUM technique in identifying changes in mean air pollution levels following introduction of a traffic management scheme

    Science.gov (United States)

    Barratt, Benjamin; Atkinson, Richard; Ross Anderson, H.; Beevers, Sean; Kelly, Frank; Mudway, Ian; Wilkinson, Paul

    There is an increasing need for statistical techniques to identify and quantify the effects of traffic management schemes on ambient pollution levels. Cumulative sum (CUSUM) charts have been used extensively in industrial process control to detect deviations in production parameters from pre-determined values. This study investigates the use of the CUSUM procedure to identify change in ambient air pollution levels following the introduction of a traffic management scheme at a specific location in Central London. The CUSUM methods of Lucas first compute the standardised deviations of time series observations from the desired process mean. These are accumulated over time to compute the CUSUM at each time point. Data for the analysis were taken from a kerbside monitoring site on Marylebone Road, a six lane trunk route in Central London. In August 2001 the lane adjacent to the monitoring site was designated as a permanent bus lane. The CUSUM analysis clearly identifies a sustained decrease in carbon monoxide concentrations beginning in 2002. However, seasonality and other factors precluded precise characterisation of the timing of the change. When the analysis was repeated using a reference mean that extrapolated the pre-intervention trend in carbon monoxide concentrations, the CUSUM chart no longer identified a sustained decrease. CUSUM appears to offer a simple and rapid method for identifying sustained changes in pollution levels, but the range of confounding influences on carbon monoxide concentrations, most notably underlying trends, seasonality and independent interventions, complicate its interpretation. Its application in assessing the presence or timing of a stepped change in pollution or similar environmental time series data is recommended in its basic form only where the predicted change is large by comparison with other independent influences. 
The authors believe that further development of the technique beyond this initial study is worthwhile in order to
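    In its basic form, the CUSUM of standardised deviations from a reference mean is a one-liner; the carbon monoxide values below are invented for illustration:

```python
import numpy as np

def cusum(series, target_mean, sigma):
    """One-sided CUSUM of standardised deviations from a reference mean,
    as used above to flag a sustained decrease in pollutant levels
    (basic form, no slack parameter)."""
    z = (np.asarray(series, float) - target_mean) / sigma
    return np.cumsum(z)

# hypothetical CO series: the mean shifts down halfway through
series = [1.0, 1.1, 0.9, 1.0, 0.6, 0.5, 0.55, 0.6]
c = cusum(series, target_mean=1.0, sigma=0.1)
print(c[-1] < c[3])  # True: the cumulative sum drifts steadily downward
```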

  10. A Novel Technique for Identifying Patients with ICU Needs Using Hemodynamic Features

    Directory of Open Access Journals (Sweden)

    A. Jalali

    2012-01-01

    Full Text Available Identification of patients requiring intensive care is a critical issue in clinical treatment. The objective of this study is to develop a novel methodology using hemodynamic features for distinguishing such patients requiring intensive care from a group of healthy subjects. In this study, based on the hemodynamic features, subjects are divided into three groups: healthy, risky and patient. For each of the healthy and patient subjects, the evaluated features are based on the analysis of existing differences between the hemodynamic variables blood pressure and heart rate. Further, four criteria from the hemodynamic variables are introduced: circle criterion, estimation error criterion, Poincaré plot deviation, and autonomic response delay criterion. For each of these criteria, three fuzzy membership functions are defined to distinguish patients from healthy subjects. Furthermore, based on the evaluated criteria, a scoring method is developed. In this scoring method, the membership degree of each subject is evaluated for the three classifying groups. Then, for each subject, the cumulative sum of the membership degrees of all four criteria is calculated. Finally, a given subject is classified into the group which has the largest cumulative sum. In summary, the scoring method results in 86% sensitivity, 94.8% positive predictive accuracy and 82.2% total accuracy.

  11. Trends and Techniques in Visual Gaze Analysis

    CERN Document Server

    Stellmach, Sophie; Dachselt, Raimund; Lindley, Craig A

    2010-01-01

    Visualizing gaze data is an effective way to quickly interpret eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project.

  12. Analysis of Gopher Tortoise Population Estimation Techniques

    Science.gov (United States)

    2005-10-01

    terrestrial reptile that was once found throughout the southeastern United States from North Carolina into Texas. However, due to numerous factors...et al. 2000, Waddle 2000). Solar energy is used for thermoregulation and egg incubation. Also, tortoises are grazers (Garner and Landers 1981...Evaluation and review of field techniques used to study and manage gopher tortoises.” Pages 205-215 in Management of amphibians, reptiles, and small mammals

  13. Use of Photogrammetry and Biomechanical Gait analysis to Identify Individuals

    DEFF Research Database (Denmark)

    Larsen, Peter Kastmand; Simonsen, Erik Bruun; Lynnerup, Niels

    Photogrammetry and recognition of gait patterns are valuable tools to help identify perpetrators based on surveillance recordings. We have found that stature but only few other measures have a satisfying reproducibility for use in forensics. Several gait variables with high recognition rates were...

  14. Similarity transformation approach to identifiability analysis of nonlinear compartmental models.

    Science.gov (United States)

    Vajda, S; Godfrey, K R; Rabitz, H

    1989-04-01

    Through use of the local state isomorphism theorem instead of the algebraic equivalence theorem of linear systems theory, the similarity transformation approach is extended to nonlinear models, resulting in finitely verifiable sufficient and necessary conditions for global and local identifiability. The approach requires testing of certain controllability and observability conditions, but in many practical examples these conditions prove very easy to verify. In principle the method also involves nonlinear state variable transformations, but in all of the examples presented in the paper the transformations turn out to be linear. The method is applied to an unidentifiable nonlinear model and a locally identifiable nonlinear model, and these are the first nonlinear models other than bilinear models where the reason for lack of global identifiability is nontrivial. The method is also applied to two models with Michaelis-Menten elimination kinetics, both of considerable importance in pharmacokinetics, and for both of which the complicated nature of the algebraic equations arising from the Taylor series approach has hitherto defeated attempts to establish identifiability results for specific input functions.

  15. Identifying failure mechanisms in LDMOS transistors by analytical stability analysis

    NARCIS (Netherlands)

    Ferrara, A.; Steeneken, P.G.; Boksteen, B.K.; Heringa, A.; Scholten, A.J.; Schmitz, J.; Hueting, R.J.E.

    2014-01-01

    In this work, analytical stability equations are derived and combined with a physics-based model of an LDMOS transistor in order to identify the primary cause of failure in different operating and bias conditions. It is found that there is a gradual boundary between an electrical failure region at h

  16. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional analysis methods based on statistics fail to help processing unstructured texts and the society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  17. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary-format original message content. To obtain the information they contain, the captured packets must be decoded according to the TCP/IP protocol stack specifications, restoring the protocol format and content at each protocol layer: the actual data transferred, as well as the application tier.
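    Layer-by-layer restoration of protocol format and content can be illustrated by decoding a fixed IPv4 header from raw bytes. This is a minimal sketch; Snort's actual decoder handles many more protocols and options:

```python
import struct

def parse_ipv4_header(raw):
    """Decode the fixed 20-byte IPv4 header from a captured packet
    (options, TCP/UDP layers and payload decoding are omitted)."""
    ver_ihl, tos, total_len, ident, flags_frag, ttl, proto, csum, src, dst = \
        struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": ver_ihl >> 4,
        "header_len": (ver_ihl & 0x0F) * 4,  # in bytes
        "total_len": total_len,
        "ttl": ttl,
        "protocol": proto,                   # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# hand-built example header: a TCP packet from 192.168.0.1 to 10.0.0.2
hdr = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                  bytes([192, 168, 0, 1]), bytes([10, 0, 0, 2]))
print(parse_ipv4_header(hdr)["src"])  # 192.168.0.1
```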

  18. Identifying news clusters using Q-analysis and modularity

    OpenAIRE

    2013-01-01

    With online publication and social media taking the main role in dissemination of news, and with the decline of traditional printed media, it has become necessary to devise ways to automatically extract meaningful information from the plethora of sources available and to make that information readily available to interested parties. In this paper we present a method of automated analysis of the underlying structure of online newspapers based on Q-analysis and modularity. We show how the combi...

  19. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis.

    Science.gov (United States)

    Hawkes, Corinna

    2009-07-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in terms of food availability, prices, and marketing. Along with established forms of supply chain analysis, the method is informed by a historical overview of how food supply chains have changed over time. The method posits that the actors and actions in the chain are affected by organizational, financial, technological, and policy incentives and disincentives, which can in turn be levered for change. It presents a preliminary example of the supply of Coca-Cola beverages into school vending machines and identifies further potential applications. These include fruit and vegetable supply chains, local food chains, supply chains for health-promoting versions of food products, and identifying financial incentives in supply chains for healthier eating.

  20. A multiway analysis for identifying high integrity bovine BACs

    OpenAIRE

    McEwan John C; Brauning Rudiger; McWilliam Sean; Barris Wesley; Ratnakumar Abhirami; Snelling Warren M; Dalrymple Brian P

    2009-01-01

    Abstract Background In large genomics projects involving many different types of analyses of bacterial artificial chromosomes (BACs), such as fingerprinting, end sequencing (BES) and full BAC sequencing there are many opportunities for the identities of BACs to become confused. However, by comparing the results from the different analyses, inconsistencies can be identified and a set of high integrity BACs preferred for future research can be defined. Results The location of each bovine BAC in...

  1. Identifiability analysis of the CSTR river water quality model.

    Science.gov (United States)

    Chen, J; Deng, Y

    2006-01-01

    Conceptual river water quality models are widely known to lack identifiability. The causes for that can be due to model structure errors, observational errors and less frequent samplings. Although significant efforts have been directed towards better identification of river water quality models, it is not clear whether a given model is structurally identifiable. Information is also limited regarding the contribution of different unidentifiability sources. Taking the widely applied CSTR river water quality model as an example, this paper presents a theoretical proof that the CSTR model is indeed structurally identifiable. Its uncertainty is thus dominantly from observational errors and less frequent samplings. Given the current monitoring accuracy and sampling frequency, the unidentifiability from sampling frequency is found to be more significant than that from observational errors. It is also noted that there is a crucial sampling frequency between 0.1 and 1 day, over which the simulated river system could be represented by different illusions and the model application could be far less reliable.

  2. Uncertainty Analysis Technique for OMEGA Dante Measurements

    Energy Technology Data Exchange (ETDEWEB)

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18-channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums) at X-ray energies between 50 eV and 10 keV. It is one of the main diagnostics installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one-sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
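    The Monte-Carlo parameter variation can be sketched generically: perturb the channel voltages with Gaussian errors, re-run an unfold, and take the spread of the resulting fluxes as the error bar. The `unfold` callable below is a toy stand-in for the real algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_flux_error(voltages, cal_sigma, unfold, n_trials=1000):
    """Monte-Carlo parameter variation: perturb each channel voltage by
    its one-sigma calibration/unfold uncertainty, re-run the unfold, and
    take the spread of the resulting fluxes as the error bar."""
    voltages = np.asarray(voltages, float)
    fluxes = []
    for _ in range(n_trials):
        trial = voltages * rng.normal(1.0, cal_sigma, size=voltages.shape)
        fluxes.append(unfold(trial))
    fluxes = np.asarray(fluxes)
    return fluxes.mean(), fluxes.std()

# toy unfold: flux proportional to the sum of channel voltages
mean, err = monte_carlo_flux_error([1.0, 2.0, 3.0], cal_sigma=0.05,
                                   unfold=lambda v: v.sum())
print(round(mean, 1), err > 0)  # ≈ 6.0, with a nonzero error bar
```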

  3. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  4. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. The modified algorithm is based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms. To apply the algorithm, a network representation transformation is made first.
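
    The abstract does not reproduce the modified two-tree Dijkstra/PFS algorithm itself, but the core idea of a strongest association path, where path strength is the product of link strengths, reduces to ordinary Dijkstra on negative-log weights. The network below is hypothetical:

```python
import heapq
import math

def strongest_path(graph, src, dst):
    """Return (strength, path) of the strongest association path.

    `graph` maps node -> {neighbor: strength}, strengths in (0, 1].
    Path strength is the product of link strengths; maximizing the
    product equals minimizing the sum of -log(strength), so plain
    Dijkstra applies on the transformed weights.
    """
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            break
        for v, s in graph.get(u, {}).items():
            nd = d - math.log(s)
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return 0.0, []
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    path.reverse()
    return math.exp(-dist[dst]), path

# Hypothetical money-laundering network: entities and association strengths.
net = {
    "A": {"B": 0.9, "C": 0.4},
    "B": {"D": 0.8},
    "C": {"D": 0.9},
}
strength, path = strongest_path(net, "A", "D")  # A-B-D (0.72) beats A-C-D (0.36)
```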

  5. Pressure transient analysis for long homogeneous reservoirs using TDS technique

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Freddy Humberto [Universidad Surcolombiana, Av. Pastrana - Cra. 1, Neiva, Huila (Colombia); Hernandez, Yuly Andrea [Hocol S.A., Cra. 7 No 114-43, Floor 16, Bogota (Colombia); Hernandez, Claudia Marcela [Weatherford, Cra. 7 No 81-90, Neiva, Huila (Colombia)

    2007-08-15

    A significant number of well pressure tests are conducted in long, narrow reservoirs with closed and open extreme boundaries. It is desirable not only to appropriately identify these types of systems but also to develop an adequate and practical interpretation technique to determine their parameters and size, when possible. An accurate understanding of how the reservoir produces and of the magnitude of producible reserves can lead to competent decisions and adequate reservoir management. So far, studies on the identification and determination of parameters for such systems have relied on conventional techniques (semilog analysis) and on semilog and log-log type-curve matching of pressure versus time. Type-curve matching is basically a trial-and-error procedure which may provide inaccurate results, and the limited number of available type curves is a further drawback. In this paper, a detailed analysis of pressure derivative behavior for a vertical well in linear reservoirs with open and closed extreme boundaries is presented for the case of constant-rate production. Each flow regime was studied independently, especially the linear flow regime, since it is the most characteristic 'fingerprint' of these systems. We found that when the well is located at one of the extremes of the reservoir, a single linear flow regime develops once radial flow and/or wellbore storage effects have ended. When the well is located at a given distance from both extreme boundaries, the pressure derivative permits the identification of two linear flows toward the well; this has been called the 'dual-linear flow regime'. It is characterized by an increase of the intercept of the 1/2-slope line from π^0.5 to π, with a consequent transition between these two straight lines. The identification of intersection points, lines, and characteristic slopes allows us to develop an interpretation technique without employing type-curve matching. This technique uses

  6. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  7. Using Factor Analysis to Identify Topic Preferences Within MBA Courses

    Directory of Open Access Journals (Sweden)

    Earl Chrysler

    2003-02-01

    Full Text Available This study demonstrates the role of a principal components factor analysis in conducting a gap analysis of the desired characteristics of business alumni. Typically, gap analyses merely compare the emphases that should be given to areas of inquiry with perceptions of actual emphases; as a result, the focus is upon depth of coverage. A neglected area in need of investigation is the breadth of topic dimensions and the differences between the normative (should offer) and the descriptive (actually offer). The implications of factor structures, as well as traditional gap analyses, are developed and discussed in the context of outcomes assessment.

  8. Comparison of Hydrogen Sulfide Analysis Techniques

    Science.gov (United States)

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  9. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain operational modal identification scheme and a frequency-domain scheme that work from output-only data by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
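
    The key fact used above, that output correlations of a randomly excited structure oscillate like impulse responses, can be demonstrated numerically. The single-degree-of-freedom system, the integrator, and the frequency estimator below are illustrative simplifications, not the authors' identification scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an SDOF oscillator (5 Hz natural frequency, 2% damping) under
# white-noise loading using a simple semi-implicit Euler recursion.
fs, n = 200.0, 20000
wn, zeta, dt = 2 * np.pi * 5.0, 0.02, 1.0 / fs
x = np.zeros(n)
v = np.zeros(n)
f = rng.standard_normal(n)
for k in range(n - 1):
    a = f[k] - 2 * zeta * wn * v[k] - wn**2 * x[k]
    v[k + 1] = v[k] + a * dt
    x[k + 1] = x[k] + v[k + 1] * dt

# Unbiased auto-correlation of the output: for white-noise input this has
# the same damped-oscillatory form as the impulse response, so standard
# time-domain modal fitting can be applied to it.
lags = 400
r = np.array([np.dot(x[:n - k], x[k:]) / (n - k) for k in range(lags)])

# The correlation oscillates at (approximately) the damped natural
# frequency; estimate it from the spectral peak of the correlation.
freq = np.fft.rfftfreq(lags, dt)
spec = np.abs(np.fft.rfft(r * np.hanning(lags)))
f_est = freq[1 + int(np.argmax(spec[1:]))]   # skip the DC bin
```

    In a real output-only scheme the fitted quantity would be the full set of cross-correlations between sensor pairs, not a single auto-correlation.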

  10. Use of fuzzy techniques for analysis of dynamic loads in power systems

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Presents the use of fuzzy techniques for the analysis of dynamic load characteristics of power systems, applied to identify the voltage stability (collapse) of a weak bus, and concludes from the consistent results obtained that this is a useful tool for the analysis of the load characteristics of sophisticated power systems and their components.

  11. Identifying key parameters to differentiate groundwater flow systems using multifactorial analysis

    Science.gov (United States)

    Menció, Anna; Folch, Albert; Mas-Pla, Josep

    2012-11-01

    Summary: Multivariate techniques are useful in hydrogeological studies to reduce the complexity of large-scale data sets and provide more understandable insight into the system hydrology. In this study, principal component analysis (PCA) has been used as an exploratory method to identify the key parameters that define distinct flow systems in the Selva basin (NE Spain). In this statistical analysis, all the information obtained in hydrogeological studies (that is, hydrochemical and isotopic data, but also potentiometric data) is used. Additionally, cluster analysis, based on the PCA results, allows the associations between samples to be identified and thus corroborates the occurrence of different groundwater fluxes. PCA and cluster analysis reveal that two main groundwater flow systems exist in the Selva basin, each with distinct hydrochemical, isotopic, and potentiometric features. Regional groundwater fluxes are associated with high F- contents and confined aquifer layers, while local fluxes are linked to nitrate-polluted unconfined aquifers with different recharge rates. In agreement with previous hydrogeological studies, these statistical methods stand as valid screening tools to highlight the fingerprint variables that can be used as indicators to facilitate further, more arduous analytical approaches and a feasible interpretation of the whole data set.
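
    The PCA-plus-clustering screening can be sketched with plain numpy. The hydrochemical table below is synthetic (the Selva basin data are not reproduced here), and a one-dimensional split on PC1 stands in for the full cluster analysis:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical standardized table: rows = wells, columns = variables
# (e.g. F-, NO3-, Cl-, hydraulic head). Two synthetic flow systems.
regional = rng.normal([2.0, -1.0, 0.5, -1.5], 0.3, size=(15, 4))
local = rng.normal([-1.0, 2.0, -0.5, 1.0], 0.3, size=(15, 4))
X = np.vstack([regional, local])

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T               # sample coordinates on the PCs
explained = S**2 / np.sum(S**2)  # fraction of variance per PC

# A one-dimensional split on PC1 separates the two flow systems here,
# a crude stand-in for the cluster-analysis step.
groups = scores[:, 0] > 0
```

    The loadings in `Vt` identify which variables drive each component, i.e. the "fingerprint" parameters the abstract refers to.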

  12. Identifying Effective Psychological Treatments of Insomnia: A Meta-Analysis.

    Science.gov (United States)

    Murtagh, Douglas R. R.; Greenwood, Kenneth M.

    1995-01-01

    Clarified efficacy of psychological treatments for insomnia through a meta-analysis of 66 outcome studies representing 139 treatment groups. Psychological treatments produced considerable enhancement of both sleep patterns and the subjective experience of sleep. Participants who were clinically referred and who did not regularly use sedatives…

  13. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    Science.gov (United States)

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    A comparative analysis was carried out of the effectiveness of three techniques for the identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (16S rRNA sequencing) and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated under various pathologies from the urogenital tract and upper respiratory ways. The corynebacteria were identified using the bacteriological technique, 16S rRNA sequencing and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of the results of species identification was observed in 26 (51%) of the strains of Corynebacterium non diphtheriae when all three analysis techniques were used; in 43 (84.3%) strains when the bacteriological technique was compared with 16S rRNA sequencing; and in 29 (57%) when mass-spectrometric analysis was compared with 16S rRNA sequencing. The bacteriological technique is effective for the identification of Corynebacterium diphtheriae. For the precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique of analysis is to be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of its databases to identify a larger spectrum of representatives of the genus Corynebacterium.

  14. Asaia bogorensis peritonitis identified by 16S ribosomal RNA sequence analysis in a patient receiving peritoneal dialysis.

    Science.gov (United States)

    Snyder, Richard W; Ruhe, Jorg; Kobrin, Sidney; Wasserstein, Alan; Doline, Christa; Nachamkin, Irving; Lipschutz, Joshua H

    2004-08-01

    Here the authors report a case of refractory peritonitis leading to multiple hospitalizations and the loss of peritoneal dialysis access in a patient on automated peritoneal dialysis, caused by Asaia bogorensis, a bacterium not previously described as a human pathogen. This organism was identified by sequence analysis of the 16S ribosomal RNA gene. Unusual microbial agents may cause peritonitis, and molecular microbiological techniques are important tools for identifying these agents.

  15. Identifying energy saving opportunities in buildings by the analysis of time series data

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Vasco [ENERGAIA- Energy Management Agency of Gaia, Vila Nova de Gaia (Portugal); Fleming, Paul; Ajiboye, Paul [De Montfort Univ., Leicester (United Kingdom). Inst. of Energy and Sustainable Development

    2003-07-01

    This paper describes how the analysis of time series energy data can be used to identify energy saving opportunities in buildings. Using readily available historic records of energy consumption, particularly quarter-hourly and half-hourly electricity time-series data, and with basic knowledge of the building, potential energy saving opportunities can be highlighted. A review of energy monitoring and targeting (M&T) procedures and of techniques for analysing time series data has been carried out. Electricity time series data from the UK have been analysed, covering the demand for power in 8 office buildings. Three different analytical approaches have been applied: simple visualisation and interpretation of energy use patterns, contour mapping, and the recurrent cumulative sum deviation (CUSUM). This paper concludes that such approaches to analysing electrical power demand could increase the cost-effectiveness and reliability of energy audits and surveys.
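
    Of the three approaches, CUSUM is the most mechanical, so a short sketch may help: it accumulates deviations of actual consumption from an expected baseline, and a persistent excess load appears as a sudden change of slope. The baseline profile and the fault below are invented for illustration:

```python
import numpy as np

# Half-hourly electricity demand (kW) for a hypothetical office building
# over 14 days: a flat 40 kW expected baseline, with a 5 kW excess load
# appearing from day 8 (e.g. plant left running overnight).
expected = np.full(48 * 14, 40.0)
actual = expected.copy()
actual[48 * 7:] += 5.0

# CUSUM: cumulative sum of (actual - expected). Random scatter averages
# out, while a persistent drift accumulates as a steady slope.
cusum = np.cumsum(actual - expected)

# The change point is where the CUSUM slope first turns positive.
change = int(np.argmax(np.diff(cusum) > 0)) + 1
```

    In practice the expected profile would come from a regression of consumption against degree-days or occupancy rather than a flat line, but the slope-change reading of the CUSUM chart is the same.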

  16. Book Review: Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects

    Directory of Open Access Journals (Sweden)

    Thomas Nash

    2013-06-01

    Full Text Available Shavers, B. (2013). Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Waltham, MA: Elsevier, 290 pages, ISBN-978-1-59749-985-9, US$51.56. Includes bibliographical references and index. Reviewed by Detective Corporal Thomas Nash (tnash@bpdvt.org), Burlington Vermont Police Department, Internet Crimes against Children Task Force; Adjunct Instructor, Champlain College, Burlington VT. In this must-read for any aspiring novice cybercrime investigator as well as the seasoned professional computer guru alike, Brett Shavers takes the reader into the ever-changing and dynamic world of cybercrime investigation. Shavers, an experienced criminal investigator, lays out the details and intricacies of a computer-related crime investigation in a clear and concise manner in his new easy-to-read publication, Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Shavers takes the reader from start to finish through each step of the investigative process in well-organized and easy-to-follow sections, with real case file examples, to reach the ultimate goal of any investigation: identifying the suspect and proving their guilt in the crime. Do not be fooled by the title. This excellent, easily accessible reference is beneficial to both criminal and civil investigations and should be in every investigator's library regardless of their respective criminal or civil investigative responsibilities. (see PDF for full review)

  17. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. 
If inferences are to be made concerning food texture from acoustical measures of mastication

  18. A comparison of wavelet analysis techniques in digital holograms

    Science.gov (United States)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis, and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques: mean filtering, median filtering, and Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.
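
    As a rough illustration of the wavelet denoising idea (a single-level Haar transform with soft thresholding, far simpler than the stationary and multi-level discrete transforms compared in the study, and on a 1-D toy signal rather than a hologram):

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (len(x) even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt (perfect reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, thresh):
    """Wavelet shrinkage: soft-threshold the detail band only."""
    a, d = haar_dwt(x)
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    return haar_idwt(a, d)

# Noisy piecewise-constant signal, a crude stand-in for speckled data:
# edges survive thresholding because they produce few large detail
# coefficients, while noise spreads into many small ones.
clean = np.repeat([0.0, 1.0, 0.5, 2.0], 64)
noisy = clean + 0.1 * rng.standard_normal(clean.size)
out = denoise(noisy, thresh=0.2)
```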

  19. An analysis technique for microstrip antennas

    Science.gov (United States)

    Agrawal, P. K.; Bailey, M. C.

    1977-01-01

    The paper presents a combined numerical and empirical approach to the analysis of microstrip antennas over a wide range of frequencies. The method involves representing the antenna by a fine wire grid immersed in a dielectric medium and then using Richmond's reaction formulation (1974) to evaluate the piecewise sinusoidal currents on the grid segments. The calculated results are then modified to account for the finite dielectric discontinuity. The method is applied to round and square microstrip antennas.

  20. Temperature-based Instanton Analysis: Identifying Vulnerability in Transmission Networks

    Energy Technology Data Exchange (ETDEWEB)

    Kersulis, Jonas [Univ. of Michigan, Ann Arbor, MI (United States); Hiskens, Ian [Univ. of Michigan, Ann Arbor, MI (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bienstock, Daniel [Columbia Univ., New York, NY (United States)

    2015-04-08

    A time-coupled instanton method for characterizing transmission network vulnerability to wind generation fluctuation is presented. To extend prior instanton work to multiple-time-step analysis, line constraints are specified in terms of temperature rather than current. An optimization formulation is developed to express the minimum wind forecast deviation such that at least one line is driven to its thermal limit. Results are shown for an IEEE RTS-96 system with several wind farms.

  1. Compartmental analysis of dynamic nuclear medicine data: models and identifiability

    Science.gov (United States)

    Delbary, Fabrice; Garbarino, Sara; Vivaldi, Valentina

    2016-12-01

    Compartmental models based on tracer mass balance are extensively used in clinical and pre-clinical nuclear medicine in order to obtain quantitative information on tracer metabolism in biological tissue. This paper is the first in a series of two that deal with the problem of tracer coefficient estimation via compartmental modelling in an inverse problem framework. Specifically, here we discuss the identifiability problem for a general n-dimensional compartmental system and provide uniqueness results for two-compartment and three-compartment models. The second paper will utilize this framework to show how nonlinear regularization schemes can be applied to obtain numerical estimates of the tracer coefficients in the case of nuclear medicine data corresponding to brain, liver and kidney physiology.

  2. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  3. Messina: a novel analysis tool to identify biologically relevant molecules in disease.

    Directory of Open Access Journals (Sweden)

    Mark Pinese

    Full Text Available BACKGROUND: Morphologically similar cancers display heterogeneous patterns of molecular aberrations and follow substantially different clinical courses. This diversity has become the basis for the definition of molecular phenotypes, with significant implications for therapy. Microarray or proteomic expression profiling is conventionally employed to identify disease-associated genes; however, traditional approaches for the analysis of profiling experiments may miss molecular aberrations which define biologically relevant subtypes. METHODOLOGY/PRINCIPAL FINDINGS: Here we present Messina, a method that can identify those genes that only sometimes show aberrant expression in cancer. We demonstrate with simulated data that Messina is highly sensitive and specific when used to identify genes which are aberrantly expressed in only a proportion of cancers, and compare Messina to contemporary analysis techniques. We illustrate Messina by using it to detect the aberrant expression of a gene that may play an important role in pancreatic cancer. CONCLUSIONS/SIGNIFICANCE: Messina allows the detection of genes with profiles typical of markers of molecular subtype, and complements existing methods to assist the identification of such markers. Messina is applicable to any global expression profiling data, and to allow its easy application has been packaged into a freely-available stand-alone software package.

  4. New techniques for emulsion analysis in a hybrid experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodama, K. (Aichi University of Education, Kariya 448 (Japan)); Ushida, N. (Aichi University of Education, Kariya 448 (Japan)); Mokhtarani, A. (University of California (Davis), Davis, CA 95616 (United States)); Paolone, V.S. (University of California (Davis), Davis, CA 95616 (United States)); Volk, J.T. (University of California (Davis), Davis, CA 95616 (United States)); Wilcox, J.O. (University of California (Davis), Davis, CA 95616 (United States)); Yager, P.M. (University of California (Davis), Davis, CA 95616 (United States)); Edelstein, R.M. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Freyberger, A.P. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Gibaut, D.B. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Lipton, R.J. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Nichols, W.R. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Potter, D.M. (Carnegie-Mellon Univers

    1994-08-01

    A new method, called graphic scanning, was developed by the Nagoya University Group for emulsion analysis in a hybrid experiment. This method enhances both speed and reliability of emulsion analysis. Details of the application of this technique to the analysis of Fermilab experiment E653 are described. ((orig.))

  5. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available The information provided by an organization's application systems is vital for decision making. For this reason, the quality of the data provided by a Data Warehouse (DW) is really important if an organization is to produce the best solutions and move forward. DWs are complex systems that have to deliver highly-aggregated, high-quality data from heterogeneous sources to decision makers, and they involve a great deal of integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques perform comparisons between target values and the current values obtained from the systems. A prototype using PHP was developed to support the Base Analysis Technique. A sample schema from an Oracle database was then used to study the differences between applying and not applying the framework. The prototype was demonstrated to the selected organizations to identify whether it would help to reduce DQ problems, and questionnaires were given to respondents. Results: The results show that users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: The framework suggested here needs to be implemented in a real situation to obtain more accurate results.
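
    A minimal sketch of the metadata-analysis idea, in Python rather than the paper's PHP: column rules declared in the metadata (the targets) are compared against the values actually present (the current values). All column names and rules below are invented:

```python
# Declared targets: per-column type, range, and nullability constraints,
# standing in for constraints held in warehouse metadata.
metadata = {
    "age":    {"type": int,   "min": 0,   "max": 120, "nullable": False},
    "email":  {"type": str,   "nullable": False},
    "salary": {"type": float, "min": 0.0, "nullable": True},
}

# Current values as loaded into the warehouse.
rows = [
    {"age": 34, "email": "a@x.com", "salary": 5200.0},
    {"age": -3, "email": "b@x.com", "salary": None},   # range violation
    {"age": 51, "email": None,      "salary": 0.0},    # null violation
]

def check_quality(rows, metadata):
    """Count metadata-rule violations per column."""
    violations = {col: 0 for col in metadata}
    for row in rows:
        for col, rule in metadata.items():
            value = row.get(col)
            if value is None:
                if not rule.get("nullable", True):
                    violations[col] += 1
                continue
            if not isinstance(value, rule["type"]):
                violations[col] += 1
            elif "min" in rule and value < rule["min"]:
                violations[col] += 1
            elif "max" in rule and value > rule["max"]:
                violations[col] += 1
    return violations

report = check_quality(rows, metadata)
```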

  6. Meconium microbiome analysis identifies bacteria correlated with premature birth.

    Directory of Open Access Journals (Sweden)

    Alexandria N Ardissone

    Full Text Available Preterm birth is the second leading cause of death in children under the age of five years worldwide, but the etiology of many cases remains enigmatic. The dogma that the fetus resides in a sterile environment is being challenged by recent findings, and the question has arisen whether microbes that colonize the fetus may be related to preterm birth. It has been posited that meconium reflects the in-utero microbial environment. In this study, correlations between fetal intestinal bacteria from meconium and gestational age were examined in order to suggest underlying mechanisms that may contribute to preterm birth. Meconium from 52 infants ranging in gestational age from 23 to 41 weeks was collected, the DNA extracted, and 16S rRNA analysis performed. Resulting taxa of microbes were correlated to clinical variables and also compared to previous studies of amniotic fluid and other human microbiome niches. Increased detection of bacterial 16S rRNA in meconium of infants of <33 weeks gestational age was observed. Approximately 61.1% of reads sequenced were classified to genera that have been reported in amniotic fluid. Gestational age had the largest influence on microbial community structure (R = 0.161; p = 0.029), while mode of delivery (C-section versus vaginal delivery) had an effect as well (R = 0.100; p = 0.044). Enterobacter, Enterococcus, Lactobacillus, Photorhabdus, and Tannerella were negatively correlated with gestational age and have been reported to incite inflammatory responses, suggesting a causative role in premature birth. This provides the first evidence to support the hypothesis that the fetal intestinal microbiome derived from swallowed amniotic fluid may be involved in the inflammatory response that leads to premature birth.

  7. Cepstrum Analysis: An Advanced Technique in Vibration Analysis of Defects in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    M. Satyam

    1994-01-01

    Full Text Available Conventional frequency analysis of machinery vibration is not adequate to accurately find defects in gears, bearings, and blades where sidebands and harmonics are present, and such an approach is also dependent on the transmission path. Cepstrum analysis, on the other hand, accurately identifies harmonic and sideband families and is a better technique for fault diagnosis in gears, bearings, and turbine blades of ships and submarines. The cepstrum represents the global power content of a whole family of harmonics and sidebands, even when more than one family of sidebands is present at the same time. It is also insensitive to transmission path effects, since source and transmission path effects are additive and can be separated in the cepstrum. The concept, the underlying theory, and the measurement and analysis involved in using the technique are briefly outlined. Two cases are presented to demonstrate the advantage of the cepstrum technique over spectrum analysis: an LP compressor was chosen to study the transmission path effects, and a marine gearbox having two sets of sideband families was studied to diagnose the problematic sideband and its severity.
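
    The way a whole family of spectral lines collapses to one cepstral peak is easy to show numerically. The synthetic echo signal below (a delayed, attenuated copy of a random waveform, not a real gearbox record) superimposes uniformly spaced ripples on the log spectrum, mimicking a sideband family:

```python
import numpy as np

rng = np.random.default_rng(4)

def real_cepstrum(x):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    spectrum = np.abs(np.fft.fft(x))
    return np.real(np.fft.ifft(np.log(spectrum + 1e-12)))

# Signal plus echo: x[k] + 0.6 * x[k - delay].
n, delay = 4096, 100
base = rng.standard_normal(n)
signal = base.copy()
signal[delay:] += 0.6 * base[:-delay]

cep = real_cepstrum(signal)

# The entire ripple family collapses to a single cepstral peak at the
# echo delay (the "quefrency"), which is what makes the cepstrum useful
# for gear and bearing sideband diagnosis. Low quefrencies are skipped
# because the smooth spectral envelope dominates there.
quefrency = int(np.argmax(np.abs(cep[50:n // 2]))) + 50
```

    The same mechanism applies to amplitude modulation in a defective gear mesh: evenly spaced sidebands in the spectrum become one peak at the modulation period in the cepstrum.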

  8. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  9. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis.

    Science.gov (United States)

    Gaudio, P.; Malizia, A.; Gelfusa, M.; Martinelli, E.; Di Natale, C.; Poggi, L. A.; Bellecci, C.

    2017-01-01

    Nowadays, Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and widespread vehicles of contamination in urban and industrial areas. The academic world, together with industry and the military, is working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present, the most common commercial sensors are based on "point detection" technology, but it is clear that such instruments cannot satisfy the needs of smart cities. The new challenge is to develop stand-off systems that continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser system development and has built two demonstrators based on DIAL (Differential Absorption of Light) technology that could be able to identify chemical agents in the atmosphere. In this work, the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the absorption database for further atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. Preliminary results on the absorption coefficients of some chemical compounds are shown, together with the preliminary PCA analysis.

  10. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  11. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by matching its strengths to opportunities, minimizing risks and eliminating weaknesses.

  12. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems level.
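    The core PoDA hypothesis — that for a disease-linked pathway, samples lie closer to their own class than to the other — can be illustrated with a toy nearest-neighbour statistic. This is a simplified stand-in for the published PoDA distance measure, and the genotype matrix and allele frequencies below are synthetic:

```python
import numpy as np

def within_class_similarity(G, y):
    """Toy version of the PoDA idea: for a genotype matrix G (samples x SNPs,
    coded 0/1/2) restricted to one pathway's SNPs, measure whether each sample's
    nearest neighbour (by L1 genotype distance) shares its case/control label y.
    Illustrative only, not the published PoDA statistic."""
    D = np.abs(G[:, None, :] - G[None, :, :]).sum(axis=2).astype(float)
    np.fill_diagonal(D, np.inf)              # ignore self-distances
    nearest = D.argmin(axis=1)               # index of each sample's nearest neighbour
    return (y[nearest] == y).mean()          # fraction of label-matching neighbours

rng = np.random.default_rng(1)
# Hypothetical pathway with 20 SNPs: cases are enriched for risk alleles.
controls = rng.binomial(2, 0.2, size=(30, 20))
cases = rng.binomial(2, 0.5, size=(30, 20))
G = np.vstack([controls, cases])
y = np.array([0] * 30 + [1] * 30)
score = within_class_similarity(G, y)
print(score > 0.5)   # a disease-linked pathway yields above-chance similarity
```

    Scanning many pathways with such a statistic, and assessing significance by permuting labels, is the general shape of the approach the abstract describes.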

  13. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté; its execution depends on highly refined technique during the performer's rotation. Performing this element requires not only good physical condition but also the dancer's command of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The analysis used stereoscopic imaging together with theoretical analysis.

  14. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 2. Robustness of Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.; Kleijnen, J.P.C.

    1999-03-24

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
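    The first three procedures in the sequence above — linear correlation, rank correlation, and trends in central tendency — can be sketched with plain numpy on a synthetic Monte Carlo sample. The input-output model is invented, and the variability, Kruskal-Wallis and chi-square steps are omitted for brevity:

```python
import numpy as np

def rank(a):
    """Rank transform (no tie handling; adequate for continuous samples)."""
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(1, len(a) + 1)
    return r

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 200)                        # sampled input variable
y = np.exp(8 * x) + rng.normal(0, 0.5, 200)       # monotonic but strongly non-linear output

pearson = np.corrcoef(x, y)[0, 1]                 # (i) linear relationship
spearman = np.corrcoef(rank(x), rank(y))[0, 1]    # (ii) monotonic relationship
# (iii) trend in central tendency: means of y over bins of increasing x
bins = np.array_split(np.argsort(x), 5)
bin_means = [y[i].mean() for i in bins]

print(spearman > pearson)                         # rank correlation exposes the monotonic trend
print(bin_means == sorted(bin_means))             # central tendency rises with x
```

    This mirrors the paper's point: a monotonic, non-linear effect that a linear correlation understates is still flagged by the rank-based and binned-means procedures.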

  15. Analysis of Maize Crop Leaf using Multivariate Image Analysis for Identifying Soil Deficiency

    Directory of Open Access Journals (Sweden)

    S. Sridevy

    2014-11-01

    Image processing for identifying soil nutrient deficiency has become an active area of research. Changes in leaf colour are used to analyse and identify deficiencies of soil nutrients such as Nitrogen (N), Phosphorus (P) and Potassium (K) by digital colour image analysis. This research study focuses on image analysis of the maize crop leaf using multivariate image analysis. In the proposed approach, the input RGB image is first converted to HSV, because RGB is ideal for colour generation but HSV is better suited to colour perception. Green pixels are then masked and removed using a specific threshold value after histogram equalization; this masking is done through a customized filter that exclusively removes the green colour of the leaf, so that only the deficient part of the leaf remains under consideration. A histogram is then generated for the deficient part of the leaf. Next, a Multivariate Image Analysis approach using Independent Component Analysis (ICA) extracts a reference eigenspace from a matrix built by unfolding colour data from the deficient part. Test images are also unfolded and projected onto the reference eigenspace, and the result is a score matrix used to compute nutrient deficiency based on the T2 statistic. In addition, a multi-resolution scaling-down scheme is used to speed up the process. Finally, based on the training samples, the soil deficiency is identified from the colour of the maize crop leaf.
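    A drastically simplified sketch of this pipeline: green pixels are masked with a channel-ratio threshold, and a Mahalanobis/T2-style score against a reference built from healthy-leaf colours stands in for the paper's ICA eigenspace projection. All image values, thresholds and region choices below are invented for illustration:

```python
import numpy as np

def deficiency_scores(img, ref_pixels, green_thresh=1.2):
    """Illustrative sketch of the pipeline, with a Mahalanobis/T2 distance in the
    reference colour space standing in for the paper's ICA eigenspace projection:
    mask out healthy green pixels, then score the rest against healthy colours."""
    h, w, _ = img.shape
    X = img.reshape(-1, 3).astype(float)
    # Mask green pixels: green channel dominates both red and blue.
    green = (X[:, 1] > green_thresh * X[:, 0]) & (X[:, 1] > green_thresh * X[:, 2])
    mu = ref_pixels.mean(axis=0)                 # reference colour statistics
    cov = np.cov(ref_pixels.T) + 1e-6 * np.eye(3)
    inv = np.linalg.inv(cov)
    d = X - mu
    t2 = np.einsum('ij,jk,ik->i', d, inv, d)     # T2-style score per pixel
    t2[green] = 0.0                              # healthy green pixels are ignored
    return t2.reshape(h, w)

rng = np.random.default_rng(3)
# Hypothetical 32x32 leaf image: mostly green, with a yellowish deficiency patch.
img = np.zeros((32, 32, 3))
img[..., 0] = 40 + rng.normal(0, 5, (32, 32))
img[..., 1] = 120 + rng.normal(0, 5, (32, 32))   # dominant green channel
img[..., 2] = 30 + rng.normal(0, 5, (32, 32))
img[8:16, 8:16] = [150, 140, 40] + rng.normal(0, 5, (8, 8, 3))  # yellow patch
healthy = img[20:, 20:].reshape(-1, 3)           # reference: a healthy region
scores = deficiency_scores(img, healthy)
print(scores[10, 10] > scores[25, 25])           # patch scores higher than healthy leaf
```

    Thresholding the resulting score map would then localize the deficient region, as the histogram/T2 step does in the paper.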

  16. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Sanjeev V Thomas; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper application of study design and data analysis may render insufficient and improper results and conclusion. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains...

  17. Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

    Directory of Open Access Journals (Sweden)

    Smita Nirkhi

    2013-06-01

    Authorship identification techniques are used to identify the most likely author of online messages from a group of potential suspects and to find evidence supporting the conclusion. Cybercriminals misuse online communication to send blackmail or spam emails and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. It is a highly interdisciplinary area, drawing on machine learning, information retrieval, and natural language processing. This paper presents a study of recent techniques and automated approaches to attributing authorship of online messages. The focus of this review is to summarize the existing authorship identification techniques used in the literature to identify authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.

  18. Methylation Linear Discriminant Analysis (MLDA) for identifying differentially methylated CpG islands

    Directory of Open Access Journals (Sweden)

    Vass J Keith

    2008-08-01

    Background: Hypermethylation of promoter CpG islands is strongly correlated with transcriptional gene silencing and epigenetic maintenance of the silenced state. As well as its role in tumor development, CpG island methylation contributes to the acquisition of resistance to chemotherapy. Differential Methylation Hybridisation (DMH) is one technique used for genome-wide DNA methylation analysis. The study of such microarray data sets should ideally account for the specific biological features of DNA methylation and the non-symmetrical distribution of the ratios of unmethylated and methylated sequences hybridised on the array. We have therefore developed a novel algorithm tailored to this type of data, Methylation Linear Discriminant Analysis (MLDA). Results: MLDA was programmed in R (version 2.7.0) and the package is available at CRAN. This approach utilizes linear regression models of non-normalised hybridisation data to define methylation status. Log-transformed signal intensities of unmethylated controls on the microarray are used as a reference. The signal intensities of DNA samples digested with methylation-sensitive restriction enzymes and mock-digested are then transformed to the likelihood of a locus being methylated using this reference. We tested the ability of MLDA to identify loci differentially methylated, as analysed by DMH, between cisplatin-sensitive and -resistant ovarian cancer cell lines. MLDA identified 115 differentially methylated loci, and 23 out of 26 of these loci have been independently validated by Methylation-Specific PCR and/or bisulphite pyrosequencing. Conclusion: MLDA has advantages for analyzing methylation data from CpG island microarrays: there is a clear rationale for the definition of methylation status, it uses DMH data without between-group normalisation, and it is less influenced by cross-hybridisation of loci.
The MLDA algorithm successfully identified differentially methylated loci between two classes of

  19. Memory Forensics: Review of Acquisition and Analysis Techniques

    Science.gov (United States)

    2013-11-01

    UNCLASSIFIED. Memory Forensics: Review of Acquisition and Analysis Techniques. Grant Osborne, Cyber and Electronic Warfare Division, Defence Science and Technology Organisation, DSTO–GD–0770. ABSTRACT: This document presents an overview of the most common memory forensics techniques used in the... types of digital evidence investigated include images, text, video and audio files [1]. To date, digital forensic investigations have focused on the...

  20. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on, and a brief review of, other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  1. Analysis On Classification Techniques In Mammographic Mass Data Set

    OpenAIRE

    K.K.Kavitha; Dr.A.Kangaiammal

    2015-01-01

    Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining the group with which each data instance is associated. They can handle a wide variety of data, so that large amounts of data can be involved in processing. This paper presents an analysis of various data mining classification techniques such a...

  2. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds such as polyphenols (including flavonoids), pigments, vitamins and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins and other contaminants, including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where the composition of food and food products is analysed, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  3. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  4. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
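    The age equation underlying the first sentence can be stated concretely. The numbers below are illustrative, not values from the reported Australian sites:

```python
# Luminescence age equation: age = accumulated (palaeo)dose / annual dose rate.
# Illustrative numbers only; real analyses propagate uncertainties on both terms,
# which is where the nuclear analyses of minor and trace elements enter.
palaeodose_gray = 25.0          # equivalent dose measured by luminescence (Gy)
dose_rate_gray_per_ka = 2.5     # annual dose from U, Th, K and cosmic rays (Gy/ka)
age_ka = palaeodose_gray / dose_rate_gray_per_ka
print(age_ka)                   # → 10.0 (thousand years)
```

    The nuclear techniques discussed in the record feed the denominator: elemental concentrations of U, Th and K determine the annual dose rate.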

  5. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting product and the mold together, considering thermal contraction of the casting and thermal expansion of the mold. An analysis that considers contact between the casting and the mold enables precise prediction of the stress distribution and of defects such as hot tearing. However, it is difficult to generate an FEM mesh for the interface of the casting and the mold, and the mesh for the mold domain consumes considerable computational time and memory. Consequently, we propose the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to model the contact between the casting and the mold. In general, the volume of the mold is much bigger than that of the casting part, so the proposed technique greatly decreases the number of mesh elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.

  6. Parallelization of events generation for data analysis techniques

    CERN Document Server

    Lazzaro, A

    2010-01-01

    With the startup of the LHC experiments at CERN, the involved community is now focusing on the analysis of the collected data. The complexity of the data analyses will be a key factor for finding possible new phenomena. For this reason many data analysis tools have been developed in the last several years, implementing several data analysis techniques. The goal of these techniques is to discriminate events of interest and measure parameters on a given input sample of events, which are themselves defined by several variables. Particularly important is also the possibility of repeating the determination of the parameters by applying the procedure to several simulated samples, generated using Monte Carlo techniques and the knowledge of the probability density functions of the input variables; this procedure achieves a better estimation of the results. Depending on the number of variables, the complexity of their probability density functions, the number of events, and the number of samples to g...

  7. Identifying and Prioritizing Effective Factors in Classifying a Private Bank's Customers by the Delphi Technique and the Analytic Hierarchy Process (AHP)

    Directory of Open Access Journals (Sweden)

    S. Khayatmoghadam

    2013-05-01

    The development of the banking industry and the presence of different financial institutions have increased competition for attracting customers and their capital: there are about 28 banks and many credit and financial institutions, of which 6 banks are public and 22 are private. Among them, public banks are in a more favourable situation than private banks thanks to governmental relations and support, geographical expansion and a longer history. Lacking these advantages, private banks try to attract customers through scientific approaches. Therefore, in this study we review banking customers from a different viewpoint. We initially obtained the desired indicators from a banking viewpoint for two groups of customers, resources and uses, using experts and the Delphi technique. On this basis, indicators such as account workflow, account average and absence of returned cheques were determined for the resources section, and the amount of facilities received, the amount of warranties received, etc., for the uses section. Then, using the Analytic Hierarchy Process (AHP) and expert opinions, processed with the Expert Choice 11 software, these criteria were prioritized and the weight of each indicator was determined. It should be noted that the statistical population of bank experts associated with this study included both branch and staff personnel. The results can be used as input for customer grouping in the implementation of CRM techniques.

  8. Comparative analysis of methods for identifying recurrent copy number alterations in cancer.

    Directory of Open Access Journals (Sweden)

    Xiguo Yuan

    Recurrent copy number alterations (CNAs) play an important role in cancer genesis. While a number of computational methods have been proposed for identifying such CNAs, their relative merits remain largely unknown in practice, since very few efforts have been focused on comparative analysis of the methods. To facilitate studies of recurrent CNA identification in the cancer genome, it is imperative to conduct a comprehensive comparison of the performance and limitations of existing methods. In this paper, six representative methods proposed in the latest six years are compared. These include one-stage and two-stage approaches, working with raw intensity ratio data and discretized data respectively. They are based on various techniques such as kernel regression, correlation matrix diagonal segmentation, semi-parametric permutation and cyclic permutation schemes. We explore multiple criteria, including type I error rate, detection power, Receiver Operating Characteristic (ROC) curves and the area under the curve (AUC), and computational complexity, to evaluate performance of the methods under multiple simulation scenarios. We also characterize their abilities in applications to two real datasets, obtained from lung adenocarcinoma and glioblastoma cancers. This comparison study reveals general characteristics of the existing methods for identifying recurrent CNAs and provides new insights into their strengths and weaknesses. It is believed to be helpful in accelerating the development of novel and improved methods.
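    Of the evaluation criteria listed, the ROC/AUC computation is easy to make concrete. Below is a numpy-only sketch using the rank-sum (Mann-Whitney) identity for AUC, applied to synthetic detection scores; the score distributions are invented and do not come from the compared methods:

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC via the rank-sum identity: the probability that a randomly chosen
    positive example scores higher than a randomly chosen negative one."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # Compare every positive with every negative; ties count half.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(4)
# Hypothetical detection scores for simulated regions with/without a recurrent CNA.
labels = np.array([1] * 50 + [0] * 50)
scores = np.where(labels == 1, rng.normal(1.0, 1.0, 100), rng.normal(0.0, 1.0, 100))
auc = roc_auc(scores, labels)
print(0.5 < auc <= 1.0)   # an informative detector scores above chance
```

    Sweeping a threshold over the same scores yields the full ROC curve; the AUC summarizes it in one number, which is why the comparison study uses both.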

  9. Analysing Java Identifier Names

    OpenAIRE

    Butler, Simon

    2016-01-01

    Identifier names are the principal means of recording and communicating ideas in source code and are a significant source of information for software developers and maintainers, and the tools that support their work. This research aims to increase understanding of identifier name content types - words, abbreviations, etc. - and phrasal structures - noun phrases, verb phrases, etc. - by improving techniques for the analysis of identifier names. The techniques and knowledge acquired can be appl...

  10. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production

    Directory of Open Access Journals (Sweden)

    Claudia Conesa

    2015-09-01

    Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode connected to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical data treatment then enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their combinations. The obtained prediction models are robust, reliable and statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for enzymatic hydrolysis monitoring.

  11. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production.

    Science.gov (United States)

    Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás

    2015-09-11

    Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode connected to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical data treatment then enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their combinations. The obtained prediction models are robust, reliable and statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for enzymatic hydrolysis monitoring.

  12. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production

    Science.gov (United States)

    Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás

    2015-01-01

    Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode connected to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical data treatment then enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their combinations. The obtained prediction models are robust, reliable and statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for enzymatic hydrolysis monitoring. PMID:26378537
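    The ANN regression step can be illustrated with a minimal one-hidden-layer network trained by gradient descent on synthetic impedance spectra. Everything below — the spectral model, frequencies and network size — is invented for illustration and is not the authors' calibration model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical calibration data: each row is a (made-up) impedance response at
# 8 frequencies; the target is the fermentable-sugar concentration in g/L.
conc = rng.uniform(0, 100, 200)                      # true concentrations (g/L)
freq_response = np.linspace(0.2, 1.0, 8)             # invented per-frequency sensitivity
X = np.outer(np.sqrt(conc) / 10.0, freq_response) + rng.normal(0, 0.01, (200, 8))
y = conc / 100.0                                     # target scaled to [0, 1]

# Minimal one-hidden-layer network (tanh), trained with batch gradient descent.
W1 = rng.normal(0, 0.5, (8, 6)); b1 = np.zeros(6)
W2 = rng.normal(0, 0.5, 6);      b2 = 0.0
lr = 0.5
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)                         # hidden activations
    pred = H @ W2 + b2                               # network output
    err = pred - y                                   # d(MSE)/d(pred), up to a constant
    gW2 = H.T @ err / len(y); gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H ** 2)            # backpropagate through tanh
    gW1 = X.T @ dH / len(y); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

rmse_gL = np.sqrt(((pred - y) ** 2).mean()) * 100.0  # error back in g/L
print(rmse_gL < 10.0)   # the toy model recovers concentration within a few g/L
```

    The published work instead validates its models with a classification-rate criterion (CCR%) on real spectra; the sketch only shows the shape of the learning step.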

  13. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2017-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques to explain instrumental effects. The topics are relevant for engineers, scientists, and astroscientists working in the fields of geophysics, chemistry, and the physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  14. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.

  15. Optimization Techniques for Analysis of Biological and Social Networks

    Science.gov (United States)

    2012-03-28

    ...in a systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational... ...analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine... ...exact solutions are presented. In [3], we introduce the variable objective search framework for combinatorial optimization. The method utilizes...

  16. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    Science.gov (United States)

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  17. VIBRATION ANALYSIS ON A COMPOSITE BEAM TO IDENTIFY DAMAGE AND DAMAGE SEVERITY USING FINITE ELEMENT METHOD

    Directory of Open Access Journals (Sweden)

    E.V.V.Ramanamurthy

    2011-07-01

    The objective of this paper is to develop a damage detection method for a composite cantilever beam with an edge crack, studied using the finite element method. A number of analytical, numerical and experimental techniques are available for the study of damage identification in beams. Studies were carried out for three different types of analysis on a composite cantilever beam with an edge crack as the damage. The material used in this analysis is a glass-epoxy composite. The finite element formulation was carried out in the analysis section of the ANSYS package. The types of vibration analysis studied on the composite beam are modal, harmonic and transient analysis. The crack is modelled by replacing the cantilever beam with two intact beams, with the crack as an additional boundary condition. Damage algorithms are used to identify and locate the damage, and the damage index method is used to find its severity. The results obtained from the modal analysis were compared with the transient analysis results. Vibration-based damage detection methods rest on the fact that damage-induced changes in physical properties (stiffness, mass and damping) manifest themselves as changes in the structural modal parameters (natural frequencies, mode shapes and modal damping). The task is then to monitor selected indicators derived from the modal parameters to distinguish between undamaged and damaged states. However, the quantitative changes of global modal parameters are not sufficiently sensitive to local damage. The proposed approach, on the other hand, interprets the dynamic changes caused by damage in a different way. Although the basis for vibration-based damage detection appears intuitive, the implementation in real structures may encounter many significant challenges. The most fundamental issue is the fact that damage typically is a local phenomenon and may not dramatically influence the global dynamic response of a
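    The sensitivity of natural frequencies to stiffness loss — the basis of the modal approach above — can be shown with the closed-form cantilever formula. The beam properties are illustrative glass-epoxy-like values, and a uniform bending-stiffness reduction is a crude stand-in for the local edge-crack model used in the paper:

```python
import numpy as np

# Cantilever natural frequencies: f_n = (beta_n*L)^2 / (2*pi*L^2) * sqrt(EI/(rho*A)).
# Illustrative properties; a crack is modelled crudely as a uniform bending-stiffness
# reduction, which lowers every natural frequency by the same fraction.
beta_L = np.array([1.8751, 4.6941, 7.8548])     # first three cantilever mode roots
L, E, I, rho, A = 0.4, 35e9, 2.25e-10, 1900.0, 3e-4

def freqs(EI):
    """Natural frequencies (Hz) of a uniform cantilever with bending stiffness EI."""
    return beta_L ** 2 / (2 * np.pi * L ** 2) * np.sqrt(EI / (rho * A))

f_intact = freqs(E * I)
f_damaged = freqs(0.8 * E * I)                  # 20% stiffness loss from the "crack"
shift_pct = 100 * (f_intact - f_damaged) / f_intact
print(np.allclose(shift_pct, 100 * (1 - np.sqrt(0.8))))  # ~10.6% drop in every mode
```

    A real crack is local, so the modes shift by different amounts depending on where the crack sits relative to each mode shape; that mode-dependent pattern is what damage-index methods exploit to locate the damage.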

  18. Developing techniques for cause-responsibility analysis of occupational accidents.

    Science.gov (United States)

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and role of the groups involved in work-related accidents. This study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel to determine the responsible groups and their responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining a detailed list of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  19. Design, data analysis and sampling techniques for clinical research.

    Science.gov (United States)

    Suresh, Karthik; Thomas, Sanjeev V; Suresh, Geetha

    2011-10-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inferences from their data. Improper application of study design and data analysis may yield insufficient and improper results and conclusions. Converting a medical problem into a statistical hypothesis with an appropriate methodological and logical design, and then back-translating the statistical results into relevant medical knowledge, is a real challenge. This article explains various sampling methods that can be appropriately used in medical research, with different scenarios and challenges.
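As a minimal illustration of one sampling method such an article covers, stratified random sampling can be sketched as follows; the patient records and strata are hypothetical:

```python
import random

def stratified_sample(population, strata_key, frac, seed=0):
    """Draw the same fraction from every stratum so that subgroups
    (e.g. disease severity bands) stay proportionally represented."""
    rng = random.Random(seed)
    strata = {}
    for item in population:
        strata.setdefault(strata_key(item), []).append(item)
    sample = []
    for members in strata.values():
        k = max(1, round(frac * len(members)))
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical patient records: (patient_id, severity).
patients = [(i, "mild") for i in range(60)] + [(i, "severe") for i in range(60, 100)]
sample = stratified_sample(patients, strata_key=lambda p: p[1], frac=0.1)
print(len(sample))  # → 10 (6 mild + 4 severe)
```

A simple random sample of the same size could by chance under-represent the severe group; stratification removes that risk by construction.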

  20. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoietin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Metabolic engineering is a multidisciplinary approach, which involves ... Improvement of a given process requires analysis of the underlying mechanisms, at best, at the molecular level. To reveal these mechanisms a number of different techniques may be applied: (1) detailed physiological studies, (2) metabolic flux analysis (MFA), (3) metabolic control analysis (MCA), (4) thermodynamic ...
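A minimal sketch of the metabolic flux analysis (MFA) mentioned above: at steady state the stoichiometric balance S·v = 0 lets unmeasured fluxes be computed from measured ones. The three-reaction network here is invented purely for illustration:

```python
import numpy as np

# Toy network:  A -v1-> B,  B -v2-> C,  B -v3-> D.
# One internal metabolite (B), so one steady-state mass balance:
#   v1 - v2 - v3 = 0
S = np.array([[1.0, -1.0, -1.0]])  # stoichiometric row for B

# v1 and v2 are assumed measured; solve the balance for the unknown v3.
v1, v2 = 10.0, 6.0
v3 = v1 - v2
v = np.array([v1, v2, v3])

assert np.allclose(S @ v, 0.0)  # steady-state balance holds
print(v3)  # → 4.0
```

Real MFA systems have many metabolites and reactions, so the balance is solved (often in a least-squares sense) for a whole flux vector, but the principle is this one balance equation repeated per metabolite.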

  1. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described. The vision instruments for food analysis, as well as the datasets of the food items used in this thesis, are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm ... (SSPCA) and DCT-based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods, together with some other state-of-the-art statistical and mathematical analysis techniques, are applied to datasets of different food items: meat, dairy, fruits ...

  2. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed Naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as mean, t-Test, ANOVA and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  3. A new chromosome fluorescence banding technique combining DAPI staining with image analysis in plants.

    Science.gov (United States)

    Liu, Jing Yu; She, Chao Wen; Hu, Zhong Li; Xiong, Zhi Yong; Liu, Li Hua; Song, Yun Chun

    2004-08-01

    In this study, a new chromosome fluorescence banding technique was developed in plants. The technique combined 4',6-diamidino-2-phenylindole (DAPI) staining with software analysis, including three-dimensional imaging after deconvolution. Clear multiple and adjacent DAPI bands like G-bands were obtained by this technique in the tested species, including Hordeum vulgare L., Oryza officinalis Wall & Watt, Triticum aestivum L., Lilium brownii Brown, and Vicia faba L. During mitotic metaphase, the numbers of bands for the haploid genomes of these species were about 185, 141, 309, 456 and 194, respectively. Reproducibility analysis demonstrated that banding patterns within a species were stable at the same mitotic stage and could be used for identifying specific chromosomes and chromosome regions. The band number fluctuated: the earlier the mitotic stage, the greater the number of bands. The technique enables genes to be mapped onto specific band regions of the chromosomes by only one fluorescence in situ hybridisation (FISH) step, with no chemical banding treatments. In this study, the 45S and 5S rDNAs of some tested species were located on specific band regions of specific chromosomes, and they were all positioned at the interbands with the new technique. Because no chemical banding treatment was used, the banding patterns displayed by the technique should reflect the natural conformational features of chromatin. Thus it could be expected that this technique should be suitable for all eukaryotes and would have widespread utility in chromosomal structure analysis and physical mapping of genes.

  4. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noises is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.
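The correlation-based alignment that this error analysis concerns can be sketched in 1D: the shift between a reference and a noisy copy is estimated from the peak of their FFT-computed cross-correlation. The profile and noise level below are illustrative, not from the paper:

```python
import numpy as np

def estimate_shift(reference, image):
    """Estimate the circular shift of `image` relative to `reference`
    from the peak of their cross-correlation, computed via FFT."""
    cc = np.fft.ifft(np.conj(np.fft.fft(reference)) * np.fft.fft(image)).real
    shift = int(np.argmax(cc))
    if shift > len(reference) // 2:   # wrap large indices to negative shifts
        shift -= len(reference)
    return shift

rng = np.random.default_rng(0)
x = np.arange(128)
reference = np.exp(-0.5 * ((x - 64) / 3.0) ** 2)   # clean "particle" profile
image = np.roll(reference, 7) + 0.01 * rng.standard_normal(128)

print(estimate_shift(reference, image))  # close to the true shift of 7
```

With stronger (and especially colored) noise, the correlation peak is biased — exactly the systematic alignment error the abstract analyzes.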

  5. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

    Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast though, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors...... domain techniques, the Frequency Domain Decomposition (FDD) and the Frequency Domain Polyreference (FDPR). The response of a two degree-of-freedom (2DOF) system is numerically established with specified modal parameters subjected to white noise loading. The system identification is evaluated with well...

  6. Analysis On Classification Techniques In Mammographic Mass Data Set

    Directory of Open Access Journals (Sweden)

    Mrs. K. K. Kavitha

    2015-07-01

    Full Text Available Data mining, the extraction of hidden information from large databases, aims to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques determine the group with which each data instance is associated. They can deal with a wide variety of data, so that large amounts of data can be involved in processing. This paper presents an analysis of various data mining classification techniques, such as Decision Tree Induction, Naïve Bayes, and k-Nearest Neighbour (KNN) classifiers, on a mammographic mass dataset.
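A minimal sketch of the k-Nearest Neighbour classifier named above, using made-up two-feature data standing in for mammographic mass attributes:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by majority vote among its k nearest
    training points (Euclidean distance)."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dists)[:k]]
        values, counts = np.unique(nearest, return_counts=True)
        preds.append(values[np.argmax(counts)])
    return np.array(preds)

# Toy two-class data loosely standing in for benign/malignant features.
X_train = np.array([[1.0, 1.2], [0.9, 1.0], [1.1, 0.8],   # class 0
                    [3.0, 3.1], [3.2, 2.9], [2.8, 3.3]])  # class 1
y_train = np.array([0, 0, 0, 1, 1, 1])
X_test = np.array([[1.0, 1.0], [3.0, 3.0]])

print(knn_predict(X_train, y_train, X_test, k=3))  # → [0 1]
```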

  7. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    Science.gov (United States)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
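The traditional Modal Assurance Criterion (MAC) comparison mentioned above can be sketched as follows; the mode shapes are toy vectors, with the "new" model's first two modes swapped to show how tracking recovers the pairing:

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape matrices
    (columns are modes); returns an (n_a, n_b) matrix of values in [0, 1]."""
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0),
                   np.sum(phi_b * phi_b, axis=0))
    return num / den

def track_modes(phi_ref, phi_new):
    """Pair each reference mode with the new mode of highest MAC."""
    m = mac(phi_ref, phi_new)
    return np.argmax(m, axis=1), m

# Toy example: the "new" model has its first two modes reordered and
# slightly perturbed, as might happen between two mesh fidelities.
phi_ref = np.eye(4)[:, :3]
phi_new = phi_ref[:, [1, 0, 2]] + 0.05
pairing, m = track_modes(phi_ref, phi_new)
print(pairing)  # → [1 0 2]
```

In the full technique described above, this MAC pairing is supplemented by cross-orthogonality and strain/kinetic energy comparisons when MAC alone is ambiguous.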

  8. Golden glazes analysis by PIGE and PIXE techniques

    Science.gov (United States)

    Fonseca, M.; Luís, H.; Franco, N.; Reis, M. A.; Chaves, P. C.; Taborda, A.; Cruz, J.; Galaviz, D.; Fernandes, N.; Vieira, P.; Ribeiro, J. P.; Jesus, A. P.

    2011-12-01

    We present the analysis performed on the chemical composition of two golden glazes available in the market, using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis of thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  9. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes), complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
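The probabilistic idea can be sketched with a drastically simplified solar-array power model (illustrative numbers only — not ISS values and not the SPACE model): uncertain inputs are sampled and propagated by Monte Carlo, so the analysis yields a spread rather than a single deterministic number:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples

# Simplified array power model with uncertain inputs (assumed values).
area = 100.0                                   # m^2, fixed
solar_flux = rng.normal(1361.0, 5.0, n)        # W/m^2
efficiency = rng.normal(0.14, 0.005, n)        # cell efficiency
degradation = rng.uniform(0.85, 0.95, n)       # lifetime degradation factor
distribution_loss = rng.normal(0.90, 0.01, n)  # harness/converter losses

power = area * solar_flux * efficiency * degradation * distribution_loss

# Instead of one deterministic number, report the spread.
print(f"mean            = {power.mean():.0f} W")
print(f"5th percentile  = {np.percentile(power, 5):.0f} W")
print(f"95th percentile = {np.percentile(power, 95):.0f} W")
```

The percentile band is what a deterministic single-point analysis cannot provide; sensitivity of the output spread to each input can then be ranked by correlating inputs with `power`.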

  10. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  11. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
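Of the condensation techniques the book compares, static (Guyan) condensation is the simplest; a sketch on a 3-DOF spring-mass chain (values illustrative):

```python
import numpy as np

def guyan_reduce(K, M, master, slave):
    """Static (Guyan) condensation: eliminate the slave DOFs assuming no
    inertia forces act on them, keeping only the master DOFs."""
    Ksm = K[np.ix_(slave, master)]
    Kss = K[np.ix_(slave, slave)]
    # Slave motion follows the masters statically: u_s = -Kss^{-1} Ksm u_m
    T = np.vstack([np.eye(len(master)), -np.linalg.solve(Kss, Ksm)])
    order = np.concatenate([master, slave])
    Kr = T.T @ K[np.ix_(order, order)] @ T
    Mr = T.T @ M[np.ix_(order, order)] @ T
    return Kr, Mr

# 3-DOF spring-mass chain (k = 1000 each), condensing out the middle DOF.
k = 1000.0
K = np.array([[ 2*k, -k,   0.0],
              [-k,   2*k, -k ],
              [ 0.0, -k,   k ]])
M = np.eye(3)
Kr, Mr = guyan_reduce(K, M, master=[0, 2], slave=[1])
print(Kr)  # → [[1500. -500.] [-500.  500.]]
```

The reduction is exact for static loads on the masters; the book's dynamic, SEREP and iterative variants improve the approximation of the inertia terms that Guyan condensation neglects.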

  12. Large areas elemental mapping by ion beam analysis techniques

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of samples. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion-beam analysis with the advantage of automatic rastering.

  13. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  14. RCAUSE – A ROOT CAUSE ANALYSIS MODEL TO IDENTIFY THE ROOT CAUSES OF SOFTWARE REENGINEERING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Er. Anand Rajavat

    2011-01-01

    Full Text Available Organizations that wish to modernize their legacy systems must adopt a financially viable evolution strategy to satisfy the needs of the modern business environment. There are various options available to modernize a legacy system into a more contemporary system. Over the last few years, legacy system reengineering has emerged as a popular system modernization technique. Reengineering generally focuses on increasing the productivity and quality of the system. However, many of these efforts are often less than successful because they concentrate only on the symptoms of software reengineering risk without targeting the root causes of those risks. A subjective assessment (diagnosis) of software reengineering risk from the different domains of a legacy system is required to identify the root causes of those risks. The goal of this paper is to highlight the root causes of software reengineering risk. We propose a root cause analysis model, RCause, that classifies the root causes of software reengineering risk into three distinct but connected areas of interest: the system domain, the managerial domain, and the technical domain.

  15. Systems analysis of quantitative shRNA-library screens identifies regulators of cell adhesion

    Directory of Open Access Journals (Sweden)

    Huang XiaoDong

    2008-06-01

    Full Text Available Abstract Background High throughput screens with RNA interference technology enable loss-of-function analyses of gene activities in mammalian cells. While the construction of genome-scale shRNA libraries has been successful, results of large-scale screening of those libraries can be difficult to analyze because of the relatively high noise levels and the fact that not all shRNAs in a library are equally effective in silencing gene expression. Results We have screened a library consisting of 43,828 shRNAs directed against 8,500 human genes for functions that are necessary in cell detachment induced by a constitutively activated c-Abl tyrosine kinase. To deal with the issues of noise and uncertainty of knockdown efficiencies, we employed an analytical strategy that combines quantitative data analysis with biological knowledge, i.e. Gene Ontology and pathway information, to increase the power of the RNAi screening technique. Using this strategy we found 16 candidate genes to be involved in Abl-induced disruption of cell adhesion, and verified that the knockdown of IL6ST is associated with enhanced cell attachment. Conclusion Our results suggest that the power of genome-wide quantitative shRNA screens can be significantly increased when analyzed using a systems biology-based approach to identify functional gene networks.
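One way to make a screen robust to unequal shRNA efficacies, in the spirit of the aggregation described above (the scoring rule and z-scores here are hypothetical, not the paper's exact method), is to require support from more than one hairpin per gene:

```python
import numpy as np

# Hypothetical per-shRNA z-scores from a detachment screen; each gene is
# covered by several hairpins of unequal knockdown efficiency, so one
# outlier hairpin should not drive the gene-level call.
shrna_scores = {
    "GENE_A": [-3.1, -2.8, -0.2, -2.5],  # consistently strong -> hit
    "GENE_B": [-3.5,  0.1,  0.3, -0.1],  # one strong hairpin -> likely off-target
    "GENE_C": [ 0.2, -0.4,  0.1,  0.5],  # inactive
}

def gene_score(scores, top_k=2):
    """Aggregate shRNA z-scores per gene: average the k runner-up values,
    skipping the single best, so a rogue hairpin cannot create a hit."""
    ranked = sorted(scores)                      # most negative = strongest
    return float(np.mean(ranked[1:1 + top_k]))   # drop the single best

hits = {g: gene_score(s) for g, s in shrna_scores.items()}
print({g: round(v, 2) for g, v in hits.items()})
```

In the study itself this statistical step is further combined with Gene Ontology and pathway information to promote genes whose neighbors also score well.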

  16. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to detect the occurrence of any crack growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to the above phenomena. With various filtering/thresholding techniques, it was found that the original signals were filtered out along with the noise. The wavelet transformation technique is found to be more appropriate for analysing AE signals under such situations. It is used to de-noise the AE data, and the de-noised signal is classified to identify a signature based on the type of phenomenon. Defence Science Journal, 2008, 58(4), pp. 559-564. DOI: http://dx.doi.org/10.14429/dsj.58.1677
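A minimal sketch of wavelet de-noising on a synthetic AE-like burst, using a one-level Haar transform with soft thresholding; the paper's wavelet family and threshold choice are not specified here, so everything below is an illustrative assumption:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet de-noising: soft-threshold the detail
    coefficients (where broadband noise lives), keep the approximation,
    then invert the transform."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(s)
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

# A burst-like "AE event" buried in background noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1024)
clean = np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 40 * t)
noisy = clean + 0.2 * rng.standard_normal(t.size)
denoised = haar_denoise(noisy, threshold=0.4)
print(np.std(noisy - clean), np.std(denoised - clean))
```

Practical AE work would use a multi-level decomposition with a smoother wavelet, but the mechanism — threshold the detail coefficients, reconstruct — is the same.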

  17. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction ability depending on their underlying assumptions about the correlation structures in the data. The techniques ... generally focus on two things: obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group ... variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to the choice of method, and the choice of method should thus be driven by the nature ...

  18. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  19. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    Science.gov (United States)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from - 125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to - 170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
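The conductivity calculation mentioned above is a simple product, λ = α·ρ·c_p. The sketch below uses round, merely PTFE-like room-temperature values, not the paper's measured data:

```python
# Thermal conductivity from laser-flash diffusivity, specific heat and
# density: lambda = alpha * rho * cp. Values are illustrative only.
alpha = 0.12e-6   # thermal diffusivity, m^2/s
rho = 2160.0      # density, kg/m^3
cp = 1000.0       # specific heat, J/(kg K)

conductivity = alpha * rho * cp   # W/(m K)
print(f"{conductivity:.3f} W/(m K)")  # → 0.259 W/(m K)
```

This is why the abstract measures all three quantities over the same temperature range: λ(T) inherits the temperature dependence of each factor.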

  20. Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics

    Science.gov (United States)

    Chan, Julia Y. K.; Bauer, Christopher F.

    2014-01-01

    The purpose of this study is to identify academically at-risk students in first-semester general chemistry using affective characteristics via cluster analysis. Through the clustering of six preselected affective variables, three distinct affective groups were identified: low (at-risk), medium, and high. Students in the low affective group…
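The clustering step can be sketched with a minimal k-means on two hypothetical standardized affective scores; the study used six variables, and the data and initialization below are invented:

```python
import numpy as np

def kmeans(X, init, iters=20):
    """Minimal k-means: alternate nearest-centroid assignment and
    centroid update, starting from the given initial centroids."""
    centroids = init.astype(float).copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = np.argmin(d, axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):          # guard against empty clusters
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Hypothetical standardized scores on two affective scales
# (e.g. chemistry self-concept and reversed test anxiety), nine students.
X = np.array([[-1.9, -2.1], [-2.0, -1.8], [-2.2, -2.0],   # "low" group
              [ 0.1,  0.0], [-0.1,  0.2], [ 0.0, -0.2],   # "medium" group
              [ 1.8,  2.0], [ 2.1,  1.9], [ 2.0,  2.2]])  # "high" group

labels, centroids = kmeans(X, init=X[[0, 3, 6]])
print(labels)  # → [0 0 0 1 1 1 2 2 2]
```

Once the clusters are formed, the "low" (at-risk) cluster can be cross-tabulated against course outcomes, which is the study's central analysis.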

  1. Technique of Hadamard transform microscope fluorescence image analysis

    Institute of Scientific and Technical Information of China (English)

    梅二文; 顾文芳; 曾晓斌; 陈观铨; 曾云鹗

    1995-01-01

    The Hadamard transform spatial multiplexed imaging technique is combined with a fluorescence microscope, and an instrument for Hadamard transform microscope fluorescence image analysis is developed. Images acquired by this instrument can provide a great deal of useful information simultaneously, including the three-dimensional Hadamard transform microscope cell fluorescence image, the fluorescence intensity and fluorescence distribution of a cell, the background signal intensity, and the signal/noise ratio, etc.
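The multiplexing idea behind Hadamard transform imaging can be sketched with a ±1 Sylvester Hadamard matrix; real instruments use 0/1 S-matrix masks, and the 8-element "scene" here is illustrative:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# "Scene": intensities of 8 resolution elements.
scene = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])
H = hadamard(8)

# Each measurement sums the elements selected by one Hadamard pattern,
# so every reading carries light from many elements at once.
measurements = H @ scene

# Inverse transform recovers the scene exactly (H^{-1} = H^T / n).
recovered = H.T @ measurements / 8
print(recovered)  # recovers the original scene
```

Measuring multiplexed sums instead of one element at a time yields the Fellgett advantage: detector noise is averaged down across the whole set of patterns.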

  2. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis

    OpenAIRE

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    Introduction An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a comm...

  3. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Full Text Available Calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-ray Fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found to be in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.

  4. Failure Analysis Seminar: Techniques and Teams. Seminar Notes. Volume I.

    Science.gov (United States)

    1981-01-01

    ... and Progress - Evaluate ... FAILURE ANALYSIS STRATEGY, Augustine E. Magistro. Introduction: A primary task of management and systems ... by Augustine Magistro, Picatinny Arsenal, and Lawrence R. Seggel, U.S. Army Missile Command. The report is available from the National Technical ... to emphasize techniques - identification and improvement of your leadership styles ... BIOGRAPHIC SKETCHES: A.E. "Gus" Magistro - Systems Evaluation

  5. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
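The transfer function technique can be sketched as FFT-based deconvolution: the rear-side signal is the front-side flux convolved with the instrument response, so dividing the spectra (with regularization) recovers the flux. The Gaussian pulse, exponential kernel, and noise level below are assumed stand-ins for the real calorimeter response:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 512
t = np.arange(n)

# True energy-flux pulse hitting the front side (to be reconstructed).
flux = np.exp(-0.5 * ((t - 100) / 8.0) ** 2)

# Assumed transfer function: a causal exponential kernel standing in for
# heat diffusion to the rear side (illustrative, not the instrument's).
h = np.exp(-t / 30.0)
h /= h.sum()

# Rear-side "measurement": flux convolved with the kernel, plus noise.
rear = np.fft.irfft(np.fft.rfft(flux) * np.fft.rfft(h), n)
rear += 1e-5 * rng.standard_normal(n)

# Deconvolve by dividing spectra; eps regularizes near-zero components.
H = np.fft.rfft(h)
eps = 1e-3
estimate = np.fft.irfft(np.fft.rfft(rear) * np.conj(H)
                        / (np.abs(H) ** 2 + eps ** 2), n)

print(np.argmax(rear), np.argmax(estimate))  # rear peak is delayed; estimate peaks near 100
```

The regularization term is where the "signal noise and digital processing" issues mentioned in the abstract enter: too small and noise is amplified, too large and the reconstructed flux is smeared.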

  6. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  7. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the amount of data available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  8. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the amount of data available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  9. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs needs to be evaluated also by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. 
Conclusion: Our approach not only enhances the computational performance...

  10. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...... to analyze the uncertainty of model predictions. This allows judging the fitness of the model to the purpose under uncertainty. Hence we recommend uncertainty analysis as a proactive solution when faced with model uncertainty, which is the case for biofuel process development research....

  11. Using Quantitative Data Analysis Techniques for Bankruptcy Risk Estimation for Corporations

    Directory of Open Access Journals (Sweden)

    Ştefan Daniel ARMEANU

    2012-01-01

    Full Text Available The diversification of methods and techniques for the quantification and management of risk has led to the development of many mathematical models, a large part of which focus on measuring bankruptcy risk for businesses. In financial analysis there are many indicators that can be used to assess the risk of bankruptcy of enterprises, but to make an assessment the number of indicators must first be reduced, which can be achieved through principal component, cluster, and discriminant analysis techniques. In this context, the article aims to build a scoring function used to identify bankrupt companies, using a sample of companies listed on the Bucharest Stock Exchange.
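
    The indicator-reduction step mentioned above can be sketched with a minimal principal component analysis in Python; the financial-ratio matrix and the choice of two retained components are invented for illustration, not the article's actual data or model.

    ```python
    import numpy as np

    # Toy matrix of financial indicators (rows: firms, columns: ratios).
    # All values are illustrative assumptions.
    X = np.array([
        [0.12, 1.8, 0.35],
        [0.08, 1.2, 0.50],
        [0.15, 2.1, 0.30],
        [0.02, 0.7, 0.80],
        [0.05, 0.9, 0.65],
    ])

    # Standardize, then reduce the indicators via PCA
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    cov = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]            # largest variance first
    scores = Z @ eigvecs[:, order[:2]]           # keep two components per firm

    explained = eigvals[order] / eigvals.sum()
    print(explained.round(3))                    # variance explained per component
    ```

    The component scores would then feed a cluster or discriminant analysis to build the scoring function.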

  12. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of the electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed, stimulated by the power of computers and microprocessors, which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples, provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. In order to increase the responses and improve the selectivity, solid electrodes are the subject of intense research dedicated to surface modifications. Perm-selectivity, chelation, catalysis, etc. may be considered as appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single-cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the...

  13. Alternative to Ritt's Pseudodivision for finding the input-output equations in algebraic structural identifiability analysis

    CERN Document Server

    Meshkat, Nicolette; DiStefano, Joseph J

    2012-01-01

    Differential algebra approaches to structural identifiability analysis of a dynamic system model in many instances depend heavily upon Ritt's pseudodivision at an early step in the analysis. The pseudodivision algorithm is used to find the characteristic set, of which a subset, the input-output equations, is used for identifiability analysis. A simpler algorithm is proposed for this step, using Gröbner bases, along with a proof of the method that includes a reduced upper bound on derivative requirements. The efficacy of the new algorithm is illustrated with two biosystem model examples.

  14. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
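
    As a toy illustration of uncertain ("soft") evidence on a hypothesis node, the sketch below applies Jeffrey's rule to a hypothetical two-node network; this is a standard treatment of soft evidence, not the paper's specific augmentation technique, and all probabilities are invented for the example.

    ```python
    # Toy 2-node network: hypothesis H -> evidence E, discrete states.
    # Under uncertain evidence, the user supplies a distribution over E's
    # states; the posterior over H mixes the hard-evidence posteriors
    # weighted by that distribution (Jeffrey's rule).

    p_h = {"true": 0.3, "false": 0.7}              # prior P(H), assumed
    p_e_given_h = {                                # CPT P(E | H), assumed
        "true":  {"pos": 0.9, "neg": 0.1},
        "false": {"pos": 0.2, "neg": 0.8},
    }

    def posterior_given_e(e_state):
        """Hard-evidence posterior P(H | E = e_state) via Bayes' rule."""
        joint = {h: p_h[h] * p_e_given_h[h][e_state] for h in p_h}
        z = sum(joint.values())
        return {h: v / z for h, v in joint.items()}

    def posterior_soft(e_dist):
        """Jeffrey's rule: mix hard posteriors by the evidence distribution."""
        post = {h: 0.0 for h in p_h}
        for e_state, w in e_dist.items():
            hard = posterior_given_e(e_state)
            for h in p_h:
                post[h] += w * hard[h]
        return post

    print(posterior_given_e("pos"))                # hard evidence E = pos
    print(posterior_soft({"pos": 0.7, "neg": 0.3}))  # uncertain evidence
    ```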

  15. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Full Text Available Sentiment analysis is concerned with the analysis of emotions and opinions in text; it is also referred to as opinion mining. Sentiment analysis finds and justifies the sentiment of a person with respect to a given source of content. Social media contain huge amounts of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this mass of generated data is very useful for expressing the opinion of the public. Twitter sentiment analysis is tricky compared to broad sentiment analysis because of slang words, misspellings, and repeated characters. The maximum length of each tweet on Twitter is 140 characters, so it is very important to identify the correct sentiment of each word. In our project we propose a highly accurate model of sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of a feature vector and classifiers such as support vector machines and Naïve Bayes, we correctly classify these tweets as positive, negative, or neutral to give the sentiment of each tweet.
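
    The Naïve Bayes side of such a classifier can be sketched in a few lines of pure Python; the toy tweets and labels below are invented, and the paper's actual feature vectors and training data are not reproduced here.

    ```python
    from collections import Counter
    import math

    def train_nb(docs):
        """Train a toy multinomial Naive Bayes classifier.
        docs: list of (text, label) pairs."""
        priors, counts, vocab = Counter(), {}, set()
        for text, label in docs:
            words = text.lower().split()
            priors[label] += 1
            counts.setdefault(label, Counter()).update(words)
            vocab.update(words)
        return priors, counts, vocab

    def classify(text, priors, counts, vocab):
        """Pick the label with the highest log-posterior, with Laplace smoothing."""
        total = sum(priors.values())
        best, best_lp = None, float("-inf")
        for label in priors:
            lp = math.log(priors[label] / total)
            denom = sum(counts[label].values()) + len(vocab)
            for w in text.lower().split():
                lp += math.log((counts[label][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

    tweets = [
        ("loved the movie great acting", "positive"),
        ("amazing plot must watch", "positive"),
        ("terrible movie waste of time", "negative"),
        ("boring plot bad acting", "negative"),
    ]
    model = train_nb(tweets)
    print(classify("great plot loved it", *model))  # → positive
    ```

    A real system would add the preprocessing the abstract alludes to (handling slang, misspellings, and repeated characters) before tokenization.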

  16. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail, and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As part of this study, LEA results were verified by: (a) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; and (b) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
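
    The RSS combination and covariance construction described above can be sketched as follows; the deviation values are invented placeholders, not actual SVDS/LEA output.

    ```python
    import numpy as np

    # Each row: deviations of two trajectory states (e.g. altitude, velocity)
    # at MECO for one 3-sigma error source simulated alone (assumed values).
    deviations = np.array([
        [120.0,  0.8],   # engine thrust uncertainty
        [ 45.0, -0.3],   # drag coefficient uncertainty
        [ 30.0,  0.1],   # initial mass uncertainty
    ])

    # Root-sum-square combination of the individual dispersions
    rss = np.sqrt((deviations ** 2).sum(axis=0))

    # Covariance matrix of the combined deviations (sum of outer products)
    cov = deviations.T @ deviations

    print(rss)   # combined 3-sigma dispersion per state
    print(cov)   # diagonal entries equal the squared RSS values
    ```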

  17. Arc-length technique for nonlinear finite element analysis

    Institute of Scientific and Technical Information of China (English)

    MEMON Bashir-Ahmed; SU Xiao-zu(苏小卒)

    2004-01-01

    Nonlinear solution of reinforced concrete structures, particularly the complete load-deflection response, requires tracing of the equilibrium path and proper treatment of the limit and bifurcation points. In this regard, ordinary solution techniques lead to instability near the limit points and also have problems in cases of snap-through and snap-back; thus they fail to predict the complete load-displacement response. The arc-length method serves the purpose well in principle, has received wide acceptance in finite element analysis, and has been used extensively. However, modifications to the basic idea are vital to meet the particular needs of the analysis. This paper reviews some of the developments of the method in the last two decades, with particular emphasis on nonlinear finite element analysis of reinforced concrete structures.

  18. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    One of the difficulties that goal-oriented requirements analysis encounters is that the efficiency of goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systematized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integrating goal and problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  19. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  20. Application of thermal analysis techniques in activated carbon production

    Science.gov (United States)

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  1. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on a web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and taking an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery, and pattern analysis. Further, this paper presents experimental work in which web log data is used. We have taken the web log data from the "NASA" web server, which is analyzed with "Web Log Explorer". Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.

  2. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  3. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    Science.gov (United States)

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), for clustering categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344
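
    The indiscernibility relation at the heart of such rough set approaches can be sketched in a few lines; the toy objects and attributes below are invented for illustration, and this is not the MIA algorithm itself.

    ```python
    from collections import defaultdict

    # Hypothetical information system: objects described by categorical attributes
    objects = {
        "o1": {"color": "red",  "shape": "round"},
        "o2": {"color": "red",  "shape": "square"},
        "o3": {"color": "blue", "shape": "round"},
        "o4": {"color": "red",  "shape": "round"},
    }

    def indiscernibility_classes(objs, attrs):
        """Partition objects into equivalence classes: two objects are
        indiscernible w.r.t. attrs if they agree on every attribute in attrs."""
        classes = defaultdict(set)
        for name, vals in objs.items():
            key = tuple(vals[a] for a in attrs)
            classes[key].add(name)
        return sorted(classes.values(), key=sorted)

    print(indiscernibility_classes(objects, ["color"]))
    # two classes: {o1, o2, o4} (red) and {o3} (blue)
    print(indiscernibility_classes(objects, ["color", "shape"]))
    # three classes: adding "shape" refines the partition
    ```

    Rough set clustering methods compare such partitions across candidate attributes to pick the one that best separates the objects.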

  4. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    Science.gov (United States)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, as to how high-frequency and accurate an estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e. up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates; the prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable...
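
    The slotting technique mentioned above can be sketched as follows for a Poisson-sampled test signal: every pair of samples contributes its product to a lag bin ("slot"), yielding an autocorrelation estimate from unevenly sampled data. The slot width, record length, and noise level here are arbitrary illustrative choices.

    ```python
    import numpy as np

    def slotted_autocorrelation(t, u, max_lag, n_slots):
        """Slotting technique: estimate the autocorrelation of a randomly
        sampled signal by binning all sample-pair lags into slots."""
        u = u - u.mean()
        edges = np.linspace(0.0, max_lag, n_slots + 1)
        num = np.zeros(n_slots)
        cnt = np.zeros(n_slots)
        for i in range(len(t)):
            lags = t[i:] - t[i]               # lags to all later samples (and self)
            keep = lags < max_lag
            idx = np.searchsorted(edges, lags[keep], side="right") - 1
            np.add.at(num, idx, u[i] * u[i:][keep])
            np.add.at(cnt, idx, 1)
        rho = np.where(cnt > 0, num / np.maximum(cnt, 1), 0.0)
        return rho / rho[0]                   # normalize by the zero-lag slot

    # Poisson-sampled cosine: the estimate should track cos(2*pi*f*tau)
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 100, 5000))
    u = np.cos(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)
    rho = slotted_autocorrelation(t, u, max_lag=2.0, n_slots=40)
    print(rho[0])   # → 1.0 by construction
    ```

    The turbulence spectrum would then follow from a (windowed) Fourier transform of the slotted autocorrelation; the variability of the high-frequency estimates discussed in the abstract shows up as noise in the sparsely populated slots.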

  5. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout to microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a drop tube furnace (DTF) at 1300°C, 200 milliseconds residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen, and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  6. Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse

    Science.gov (United States)

    Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa

    1994-01-01

    Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) can be successfully employed in urinalysis for drugs of abuse with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence of the data that infrared analysis can offer has been helpful in identifying ambiguous results, particularly, in the case of amphetamines where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been a question. Functional group mapping can further assist the analysis and track drug use versus time.

  7. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  8. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  9. Transcriptome Analysis of Syringa oblata Lindl. Inflorescence Identifies Genes Associated with Pigment Biosynthesis and Scent Metabolism.

    Directory of Open Access Journals (Sweden)

    Jian Zheng

    Full Text Available Syringa oblata Lindl. is a woody ornamental plant with high economic value and characteristics that include early flowering, multiple flower colors, and strong fragrance. Despite a long history of cultivation, the genetics and molecular biology of S. oblata are poorly understood. Transcriptome and expression profiling data are needed to identify genes and to better understand the biological mechanisms of floral pigments and scents in this species. Nine cDNA libraries were obtained from three replicates of three developmental stages: inflorescence with enlarged flower buds not protruded, inflorescence with corolla lobes not displayed, and inflorescence with flowers fully opened and emitting strong fragrance. Using the Illumina RNA-Seq technique, 319,425,972 clean reads were obtained and were assembled into 104,691 final unigenes (average length of 853 bp, 41.75% of which were annotated in the NCBI non-redundant protein database. Among the annotated unigenes, 36,967 were assigned to gene ontology categories and 19,956 were assigned to eukaryotic orthologous groups. Using the Kyoto Encyclopedia of Genes and Genomes pathway database, 12,388 unigenes were sorted into 286 pathways. Based on these transcriptomic data, we obtained a large number of candidate genes that were differentially expressed at different flower stages and that were related to floral pigment biosynthesis and fragrance metabolism. This comprehensive transcriptomic analysis provides fundamental information on the genes and pathways involved in flower secondary metabolism and development in S. oblata, providing a useful database for further research on S. oblata and other plants of the genus Syringa.

  10. Dynamic Range Analysis of the Phase Generated Carrier Demodulation Technique

    Directory of Open Access Journals (Sweden)

    M. J. Plotnikov

    2014-01-01

    Full Text Available The dependence of the dynamic range of the phase generated carrier (PGC) technique on the passbands of low-pass filters is investigated using a simulation model. A nonlinear character of this dependence, which could lead to dynamic range limitations or measurement uncertainty, is presented for the first time. A detailed theoretical analysis is provided to verify the simulation results, and the results are consistent with the performed calculations. A method for calculating low-pass filter passbands according to the required upper limit of the dynamic range is proposed.

  11. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  12. Quality assurance and quantitative error analysis by tracer techniques

    Energy Technology Data Exchange (ETDEWEB)

    Schuetze, N.; Hermann, U.

    1983-12-01

    The locations, types and sources of casting defects have been tested by tracer techniques. Certain sites of moulds were labelled using ¹⁹⁹Au and ²⁴Na sodium carbonate solutions, and a technetium solution produced in the technetium generator on a ⁹⁹Mo/⁹⁹Tc elution column. Evaluations were made by means of activity measurements and autoradiography. The locations and causes of casting defects can be determined by error analysis. The surface defects of castings resulting from the moulding materials and from the blacking can be detected by technetium; the subsurface defects are located by gold.

  13. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Council. The planned...... contents and the requirements for the project prior to its start are described together with the results obtained during the 3 year period of the project. The project was mainly carried out as a Ph.D project by the first author from September 1994 to August 1997 in cooperation with associate professor Rune...

  14. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Mario Jorge Rodrigues Pereira da

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  15. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    Science.gov (United States)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  16. Golden glazes analysis by PIGE and PIXE techniques

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, M., E-mail: mmfonseca@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Luis, H., E-mail: heliofluis@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Franco, N., E-mail: nfranco@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Reis, M.A., E-mail: mareis@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Chaves, P.C., E-mail: cchaves@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Taborda, A., E-mail: galaviz@cii.fc.ul.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Cruz, J., E-mail: jdc@fct.unl.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Galaviz, D., E-mail: ataborda@itn.pt [Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Dept. Fisica, Faculdade de Ciencias, Universidade de Lisboa, Lisboa (Portugal); and others

    2011-12-15

    We present the analysis performed on the chemical composition of two golden glazes available in the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis on thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  17. International publications about sustainability: a review of articles using the technique of qualitative content analysis

    Directory of Open Access Journals (Sweden)

    Cristiane Froehlich

    2014-04-01

    Full Text Available This study aims to identify articles related to sustainability in international publications and to analyze the categories that emerge from these studies through the technique of qualitative content analysis, in order to identify the main approaches, author contributions, theoretical gaps and suggestions for further studies. To apply the technique, about 20 articles were selected from journals with relevant impact factors that address sustainability, and the three basic steps for content analysis listed by Bardin (1977) were followed: (a) pre-analysis; (b) material exploration; and (c) data treatment, inference and interpretation. The main results show that the major theories related to sustainability are the resources and capabilities theory, institutional theory, stakeholder theory, market orientation theory, and supply chain and competitive advantage theory, all of which seek to explain the factors that facilitate or hinder the practice of corporate sustainability. In addition, some suggestions for further research are identified in the analysis of the results presented in this study.

  18. Genome-wide association study meta-analysis identifies seven new rheumatoid arthritis risk loci

    OpenAIRE

    Stahl, Eli A; Raychaudhuri, Soumya; Remmers, Elaine F.; Xie, Gang; Eyre, Stephen; Thomson, Brian P.; Li, Yonghong; Kurreeman, Fina A. S.; Zhernakova, Alexandra; Hinks, Anne; Guiducci, Candace; Chen, Robert; Alfredsson, Lars; Amos, Christopher I.; Ardlie, Kristin G.

    2010-01-01

    To identify novel genetic risk factors for rheumatoid arthritis (RA), we conducted a genome-wide association study (GWAS) meta-analysis of 5,539 autoantibody positive RA cases and 20,169 controls of European descent, followed by replication in an independent set of 6,768 RA cases and 8,806 controls. Of 34 SNPs selected for replication, 7 novel RA risk alleles were identified at genome-wide significance (P

  19. Comparative analysis of Salmonella genomes identifies a metabolic network for escalating growth in the inflamed gut.

    Science.gov (United States)

    Nuccio, Sean-Paul; Bäumler, Andreas J

    2014-03-18

    The Salmonella genus comprises a group of pathogens associated with illnesses ranging from gastroenteritis to typhoid fever. We performed an in silico analysis of comparatively reannotated Salmonella genomes to identify genomic signatures indicative of disease potential. By removing numerous annotation inconsistencies and inaccuracies, the process of reannotation identified a network of 469 genes involved in central anaerobic metabolism, which was intact in genomes of gastrointestinal pathogens but degrading in genomes of extraintestinal pathogens. This large network contained pathways that enable gastrointestinal pathogens to utilize inflammation-derived nutrients as well as many of the biochemical reactions used for the enrichment and biochemical discrimination of Salmonella serovars. Thus, comparative genome analysis identifies a metabolic network that provides clues about the strategies for nutrient acquisition and utilization that are characteristic of gastrointestinal pathogens. IMPORTANCE While some Salmonella serovars cause infections that remain localized to the gut, others disseminate throughout the body. Here, we compared Salmonella genomes to identify characteristics that distinguish gastrointestinal from extraintestinal pathogens. We identified a large metabolic network that is functional in gastrointestinal pathogens but decaying in extraintestinal pathogens. While taxonomists have used traits from this network empirically for many decades for the enrichment and biochemical discrimination of Salmonella serovars, our findings suggest that it is part of a "business plan" for growth in the inflamed gastrointestinal tract. By identifying a large metabolic network characteristic of Salmonella serovars associated with gastroenteritis, our in silico analysis provides a blueprint for potential strategies to utilize inflammation-derived nutrients and edge out competing gut microbes.
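
    The genomic comparison described above reduces, in spirit, to a set operation: find genes intact in every gastrointestinal genome but degraded or absent in every extraintestinal one. The miniature sketch below uses a handful of anaerobic-metabolism gene names purely as placeholders; the gene-to-genome assignments are illustrative, not the paper's data:

```python
# Intact genes per genome (illustrative miniature of the 469-gene network).
gastrointestinal = {
    "Typhimurium": {"ttrA", "pduC", "eutB", "frdA"},
    "Enteritidis": {"ttrA", "pduC", "frdA"},
}
extraintestinal = {
    "Typhi":       {"frdA"},          # ttrA/pduC "pseudogenized" in this sketch
    "Paratyphi A": {"frdA", "eutB"},
}

# Genes intact in ALL gastrointestinal genomes...
core = set.intersection(*gastrointestinal.values())
# ...and missing or degraded in EVERY extraintestinal genome: the candidate
# signature of the gut-adapted metabolic network.
signature = {g for g in core
             if all(g not in genes for genes in extraintestinal.values())}
```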

  20. Independent component analysis of high-resolution imaging data identifies distinct functional domains

    DEFF Research Database (Denmark)

    Reidl, Juergen; Starke, Jens; Omer, David

    2007-01-01

    In the vertebrate brain external stimuli are often represented in distinct functional domains distributed across the cortical surface. Fast imaging techniques used to measure patterns of population activity record movies with many pixels and many frames, i.e. data sets with high dimensionality....... Here we demonstrate that principal component analysis (PCA) followed by spatial independent component analysis (sICA) can be exploited to reduce the dimensionality of data sets recorded in the olfactory bulb and the somatosensory cortex of mice as well as the visual cortex of monkeys, without losing...... analysis and dissection of imaging data of population activity, collected with high spatial and temporal resolution....

  1. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes the deep rockburst simulation system used to reproduce the instantaneous rockburst process in granite. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst can be performed: images of the tracer particles yield displacement and strain fields, and the debris trajectories can be described. According to the observation of on-site tests, the dynamic rockburst is actually a gas–solid high-speed flow process, caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high-speed video and PIV images, the granite rockburst failure process is divided into six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.
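
    The core PIV operation, recovering the displacement of a tracer-particle pattern between two frames from the peak of their cross-correlation, can be sketched as follows (with synthetic frames, not data from the rockburst tests):

```python
import numpy as np

def piv_displacement(frame_a, frame_b):
    """Estimate the integer-pixel shift between two interrogation windows
    by locating the peak of their FFT-based circular cross-correlation."""
    Fa = np.fft.fft2(frame_a)
    Fb = np.fft.fft2(frame_b)
    corr = np.fft.ifft2(Fb * np.conj(Fa)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks in the upper half of the correlation plane to negative shifts.
    ny, nx = corr.shape
    if dy > ny // 2:
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return int(dy), int(dx)

# A synthetic "tracer particle" window and a copy displaced by (3, 5) pixels.
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (3, 5), axis=(0, 1))
```

    With these frames, `piv_displacement(a, b)` recovers the imposed shift (3, 5); real PIV adds sub-pixel peak fitting and windowed interrogation on top of this kernel.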

  2. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    Science.gov (United States)

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determining the geometrical, morphological and signature data and subsequent processing by discriminant analysis and neural network techniques. Geometrical descriptors were found to be responsible for the best identification ability and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence to establish their presence in wastewater treatment plants.

  3. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, at both the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve the market if its semantic orientation is determined. Opinion mining and sentiment analysis are the formalizations for studying and interpreting opinions and sentiments, and the digital ecosystem has paved the way for recording huge volumes of opinionated data. This paper is an attempt to review and evaluate the various techniques used for opinion mining and sentiment analysis.
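
    As a toy illustration of the lexicon-based family of techniques such a review would cover, here is a minimal polarity scorer with naive negation handling; the lexicon is a made-up fragment, not a real resource:

```python
import re

# Tiny illustrative polarity lexicon (real systems use thousands of entries).
LEXICON = {"good": 1, "great": 2, "excellent": 3, "love": 2,
           "bad": -1, "poor": -2, "terrible": -3, "hate": -2}
NEGATORS = {"not", "no", "never"}

def sentiment_score(text):
    """Sum word polarities, flipping the sign of the word right after a negator."""
    score, negate = 0, False
    for token in re.findall(r"[a-z']+", text.lower()):
        if token in NEGATORS:
            negate = True
            continue
        value = LEXICON.get(token, 0)
        score += -value if negate else value
        negate = False
    return score
```

    For example, `sentiment_score("not good at all")` yields -1. Machine-learning approaches replace the fixed lexicon with weights learned from labeled reviews, but the scoring idea is the same.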

  4. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Recovery of ancient woodworking skills can be achieved by the simultaneous documentation and analysis of tangible evidence such as the geometric parameters of prehistoric hand tools or the fine morphological characteristics of well-preserved wooden archaeological finds. During this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed in this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these tools was also estimated, and these woodworking skills could be quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  5. An operational modal analysis approach based on parametrically identified multivariable transmissibilities

    Science.gov (United States)

    Devriendt, Christof; De Sitter, Gert; Guillaume, Patrick

    2010-07-01

    In this contribution the approach to identify modal parameters from output-only (scalar) transmissibility measurements [C. Devriendt, P. Guillaume, The use of transmissibility measurements in output-only modal analysis, Mechanical Systems and Signal Processing 21 (7) (2007) 2689-2696] is generalized to multivariable transmissibilities. In general, the poles that are identified from (scalar as well as multivariable) transmissibility measurements do not correspond with the system's poles. However, by combining transmissibility measurements under different loading conditions, it is shown in this paper how model parameters can be identified from multivariable transmissibility measurements.

  6. Application of techniques to identify coal-mine and power-generation effects on surface-water quality, San Juan River basin, New Mexico and Colorado

    Science.gov (United States)

    Goetz, C.L.; Abeyta, Cynthia G.; Thomas, E.V.

    1987-01-01

    Numerous analytical techniques were applied to determine water quality changes in the San Juan River basin upstream of Shiprock, New Mexico. Eight techniques were used to analyze hydrologic data such as precipitation, water quality, and streamflow. The eight methods used are: (1) Piper diagram, (2) time-series plot, (3) frequency distribution, (4) box-and-whisker plot, (5) seasonal Kendall test, (6) Wilcoxon rank-sum test, (7) SEASRS procedure, and (8) analysis of flow-adjusted specific conductance data and smoothing. Post-1963 changes in dissolved solids concentration, dissolved potassium concentration, specific conductance, suspended sediment concentration, or suspended sediment load in the San Juan River downstream from the surface coal mines were examined to determine if coal mining was having an effect on the quality of surface water. None of the analytical methods used to analyze the data showed any increase in dissolved solids concentration, dissolved potassium concentration, or specific conductance in the river downstream from the mines; some of the analytical methods used showed a decrease in dissolved solids concentration and specific conductance. Chaco River, an ephemeral stream tributary to the San Juan River, undergoes changes in water quality due to effluent from a power generation facility. The discharge in the Chaco River contributes about 1.9% of the average annual discharge at the downstream station, San Juan River at Shiprock, NM. The changes in water quality detected at the Chaco River station were not detected at the downstream Shiprock station. It was not possible, with the available data, to identify any effects of the surface coal mines on water quality that were separable from those of urbanization, agriculture, and other cultural and natural changes. In order to determine the specific causes of changes in water quality, it would be necessary to collect additional data at strategically located stations. (Author's abstract)
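
    As an illustration of one of the listed methods, the seasonal Kendall test (method 5) can be sketched as follows; this is a simplified version that ignores ties and serial correlation, applied to synthetic data rather than the San Juan River record:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall S statistic and its variance (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[i + 1:] - x[i]).sum() for i in range(n - 1))
    var = n * (n - 1) * (2 * n + 5) / 18.0
    return s, var

def seasonal_kendall_z(values, seasons):
    """Sum the Mann-Kendall statistics over seasons and return the Z score."""
    s_tot = v_tot = 0.0
    for season in np.unique(seasons):
        s, v = mann_kendall(values[seasons == season])
        s_tot += s
        v_tot += v
    if s_tot > 0:
        return (s_tot - 1) / np.sqrt(v_tot)   # continuity correction
    if s_tot < 0:
        return (s_tot + 1) / np.sqrt(v_tot)
    return 0.0

# Ten years of monthly values with a deterministic upward trend
# (e.g. a dissolved-solids proxy); |z| > 1.96 flags a trend at the 5% level.
years, months = np.mgrid[0:10, 0:12]
seasons = months.ravel()
conc = years.ravel() + 0.01 * seasons
z = seasonal_kendall_z(conc, seasons)
```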

  7. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have in the past relied on mathematical analysis that predicted well phenomenologically but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained, starting from the established equations of transport phenomena. Then, microscopic- and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory and applied to cerebrospinal fluid (CSF) dynamics. Using the techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich, steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.
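
    As a one-line example of the porous-media building block mentioned above, Darcy's law relates bulk flux to the pressure gradient. The parameter values below are illustrative orders of magnitude, not measured CSF properties:

```python
def darcy_flux(permeability_m2, viscosity_pa_s, pressure_gradient_pa_per_m):
    """Darcy's law: superficial flux q = -(k / mu) * dP/dx, in m/s."""
    return -(permeability_m2 / viscosity_pa_s) * pressure_gradient_pa_per_m

# Illustrative numbers: tissue-like permeability, water-like viscosity,
# and a modest pressure gradient; flow is directed down the gradient.
q = darcy_flux(1.0e-14, 0.7e-3, -100.0)   # m/s, positive: toward lower pressure
```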

  8. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    Science.gov (United States)

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckle intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their core's structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography in an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.
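
    A minimal sketch of the pseudophase construction, using a spiral-phase (Riesz-type) filter in the Fourier domain; the test pattern and normalization are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def pseudophase_field(intensity):
    """Build a complex pseudo-field from a real intensity pattern by applying
    a spiral-phase (Riesz-type) filter in the Fourier domain; vortices are
    then located as phase singularities of the returned field."""
    I = intensity - intensity.mean()
    qy = np.fft.fftfreq(I.shape[0])[:, None]
    qx = np.fft.fftfreq(I.shape[1])[None, :]
    mag = np.hypot(qx, qy)
    mag[0, 0] = 1.0                       # avoid division by zero at DC
    spiral = (qx + 1j * qy) / mag
    spiral[0, 0] = 0.0
    riesz = np.fft.ifft2(spiral * np.fft.fft2(I))
    return I + riesz                      # complex field; its angle is the pseudophase

# Sanity check on a horizontal cosine fringe: the transform turns cos(theta)
# into exp(i*theta), so the pseudophase ramps linearly along each row.
n = 64
theta = 2 * np.pi * 4 * np.arange(n) / n
fringe = np.tile(np.cos(theta), (n, 1))
field = pseudophase_field(fringe)
```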

  9. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as the Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
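
    The Gaussian kernel estimator favoured above can be sketched in a few lines: every observation pair contributes to the correlation at lag τ with a Gaussian weight on how far its actual time difference is from τ, so no interpolation is needed. The bandwidth, series, and lag grid below are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel_ccf(tx, x, ty, y, lags, h):
    """Cross-correlation of two irregularly sampled series without interpolation:
    each pair (i, j) is weighted by a Gaussian kernel on (ty[j] - tx[i]) - lag."""
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None]          # all pairwise time differences
    prod = xs[:, None] * ys[None, :]
    ccf = np.empty(len(lags))
    for k, lag in enumerate(lags):
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
        ccf[k] = (w * prod).sum() / w.sum()
    return ccf

# Two irregularly sampled sine waves, the second delayed by 2.0 time units;
# the estimated CCF should peak near the true delay.
rng = np.random.default_rng(1)
tx = np.sort(rng.uniform(0, 50, 500))
ty = np.sort(rng.uniform(0, 50, 500))
x, y = np.sin(tx), np.sin(ty - 2.0)
lags = np.linspace(0, 4, 81)
ccf = gaussian_kernel_ccf(tx, x, ty, y, lags, h=0.25)
lag_hat = lags[np.argmax(ccf)]
```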

  10. Radio & Optical Interferometry: Basic Observing Techniques and Data Analysis

    CERN Document Server

    Monnier, John D

    2012-01-01

    Astronomers usually need the highest angular resolution possible, but the blurring effect of diffraction imposes a fundamental limit on the image quality from any single telescope. Interferometry allows light collected at widely-separated telescopes to be combined in order to synthesize an aperture much larger than an individual telescope thereby improving angular resolution by orders of magnitude. Radio and millimeter wave astronomers depend on interferometry to achieve image quality on par with conventional visible and infrared telescopes. Interferometers at visible and infrared wavelengths extend angular resolution below the milli-arcsecond level to open up unique research areas in imaging stellar surfaces and circumstellar environments. In this chapter the basic principles of interferometry are reviewed with an emphasis on the common features for radio and optical observing. While many techniques are common to interferometers of all wavelengths, crucial differences are identified that will help new practi...

  11. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  12. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume high amounts of electrical power, thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proven to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  13. Reliability and validity of a palpation technique for identifying the spinous processes of C7 and L5.

    Science.gov (United States)

    Robinson, Roar; Robinson, Hilde Stendal; Bjørke, Gustav; Kvale, Alice

    2009-08-01

    The objective was to examine inter-tester reliability and validity of two therapists identifying the spinous processes (SP) of C7 and L5, using one predefined surface palpation procedure for each level. Using a single identification method made it possible to examine the reliability and validity of the procedure itself. Two manual therapists examined 49 patients (29 women), aged between 26 and 79 years; 18 were cervical and 31 were lumbar patients. An invisible marking pen and ultraviolet light were used, and the findings were compared. X-rays were taken as an objective measure of the correct spinal level. Percentage agreement and kappa statistics were used to evaluate reliability and validity. The best inter-therapist agreement was found for the skin marks. Percentage agreement within 10 mm and 20 mm was 67% and 85%, respectively. The inter-tester reliability for identifying a radiologically nominated SP by palpation was found to be poor for C7 and moderate for L5, with kappa values of 0.18 and 0.48, respectively. The results indicated acceptable inter-therapist surface palpation agreement, but the chosen procedures did not identify the correct SP. This indicates that the procedures are not precise enough. Future reliability studies should test other non-invasive palpation procedures, both individually and in combination, and compare these with radiological investigation.
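
    The two agreement measures used in the study can be sketched as follows; the distances and categorical calls below are hypothetical numbers, not the study's data:

```python
import numpy as np

def percent_within(deviations_mm, tolerance_mm):
    """Fraction of paired skin marks whose separation is within a tolerance."""
    d = np.abs(np.asarray(deviations_mm, dtype=float))
    return float((d <= tolerance_mm).mean())

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return 1.0 if p_exp == 1 else (p_obs - p_exp) / (1 - p_exp)

# Hypothetical example: distances between the two therapists' marks (mm),
# and their categorical calls of which spinous process was marked.
gaps = [3, 8, 12, 18, 25, 6, 9, 40]
therapist_1 = ["C7", "C7", "C6", "C7"]
therapist_2 = ["C7", "C6", "C6", "C7"]
```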

  14. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real Integral-Field Spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of two. Our analysis reveals that the algorithm prioritizes conservation of all the statistically-significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BATMAN is not to be used as a `black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially-resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
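
    The core merging rule, joining adjacent spatial elements while their signals agree within the errors, can be sketched in 1-D. The bin size, threshold k, and data below are illustrative assumptions; the real algorithm operates on 2-D tessellations with the supplied measurement errors:

```python
import numpy as np

def merge_consistent_segments(x, bin_size=5, k=8.0):
    """Greedy BATMAN-style merging: start from small contiguous bins and
    repeatedly merge the adjacent pair whose means are most consistent,
    until no pair agrees within k combined standard errors."""
    segments = [list(x[i:i + bin_size]) for i in range(0, len(x), bin_size)]
    while True:
        best, best_t = None, k
        for i in range(len(segments) - 1):
            a = np.asarray(segments[i])
            b = np.asarray(segments[i + 1])
            se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
            t = abs(a.mean() - b.mean()) / (se + 1e-12)
            if t < best_t:
                best, best_t = i, t
        if best is None:
            return segments
        segments[best] = segments[best] + segments.pop(best + 1)

# Two flat "sources" separated by a jump: the tessellation should find both.
rng = np.random.default_rng(0)
signal = np.concatenate([np.zeros(50), np.full(50, 5.0)])
noisy = signal + 0.02 * rng.standard_normal(100)
segments = merge_consistent_segments(noisy)
```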

  15. Sparse canonical correlation analysis for identifying, connecting and completing gene-expression networks

    Directory of Open Access Journals (Sweden)

    Zwinderman Aeilko H

    2009-09-01

    Full Text Available Abstract Background We generalized penalized canonical correlation analysis for analyzing microarray gene-expression measurements for checking completeness of known metabolic pathways and identifying candidate genes for incorporation in the pathway. We used Wold's method for calculation of the canonical variates, and we applied ridge penalization to the regression of pathway genes on canonical variates of the non-pathway genes, and the elastic net to the regression of non-pathway genes on the canonical variates of the pathway genes. Results We performed a small simulation to illustrate the model's capability to identify new candidate genes to incorporate in the pathway: in our simulations it appeared that a gene was correctly identified if the correlation with the pathway genes was 0.3 or more. We applied the methods to a gene-expression microarray data set of 12,209 genes measured in 45 patients with glioblastoma, and we considered genes to incorporate in the glioma-pathway: we identified more than 25 genes that correlated > 0.9 with canonical variates of the pathway genes. Conclusion We concluded that penalized canonical correlation analysis is a powerful tool to identify candidate genes in pathway analysis.
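
    For intuition, the first canonical correlation between a "pathway" block and a "non-pathway" block can be computed from the whitened cross-covariance. This sketch adds a plain ridge term to each covariance matrix as a simple stand-in for the penalization discussed above; it is not Wold's algorithm or the paper's elastic-net scheme, and the data are synthetic:

```python
import numpy as np

def first_canonical_corr(X, Y, ridge=0.1):
    """First canonical correlation between two expression blocks,
    with ridge regularization of each within-block covariance."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = X.shape[0]
    Sxx = X.T @ X / n + ridge * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + ridge * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    # After whitening both blocks, the singular values of the
    # cross-covariance are the canonical correlations.
    Wx = np.linalg.inv(np.linalg.cholesky(Sxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Syy))
    s = np.linalg.svd(Wx @ Sxy @ Wy.T, compute_uv=False)
    return s[0]

rng = np.random.default_rng(0)
shared = rng.normal(size=(100, 1))            # latent "pathway activity"
X = shared @ rng.normal(size=(1, 5)) + 0.5 * rng.normal(size=(100, 5))
Y = shared @ rng.normal(size=(1, 4)) + 0.5 * rng.normal(size=(100, 4))
print(first_canonical_corr(X, Y) > 0.5)  # shared signal gives a high correlation
```

    A non-pathway gene whose expression correlates strongly with the pathway's canonical variate is, in the spirit of the abstract, a candidate for incorporation in the pathway.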

  16. Identifying Effective Spelling Interventions Using a Brief Experimental Analysis and Extended Analysis

    Science.gov (United States)

    McCurdy, Merilee; Clure, Lynne F.; Bleck, Amanda A.; Schmitz, Stephanie L.

    2016-01-01

    Spelling is an important skill that is crucial to effective written communication. In this study, brief experimental analysis procedures were used to examine spelling instruction strategies (e.g., whole word correction; word study strategy; positive practice; and cover, copy, and compare) for four students. In addition, an extended analysis was…

  17. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    Energy Technology Data Exchange (ETDEWEB)

    Villavicencio, A.L.C.H.; Mancini-Filho, J.; Delincée, H.

    1998-06-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macaçar, were irradiated using a {sup 60}Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans compared to non-irradiated beans. Secondly, DNA fragmentation was studied using microgel electrophoresis. Irradiated cells produced typical comets, with DNA fragments migrating towards the anode, whereas DNA of non-irradiated cells exhibited only limited migration. Thirdly, electron spin resonance was tested for the detection of cellulose radicals, since these free radicals were expected to be quite stable in solid and dry foods; however, a small signal could be detected only in beans irradiated with 10 kGy. Fourthly, thermoluminescence, a method that analyzes mineral debris adhering to food, turned out to be a good choice for detecting irradiation effects in beans, even after 6 months of storage. The results indicate that three of the four techniques can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy).

  18. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    Science.gov (United States)

    Villavicencio, A. L. C. H.; Mancini-Filho, J.; Delincée, H.

    1998-06-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macaçar, were irradiated using a 60Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans compared to non-irradiated beans. Secondly, DNA fragmentation was studied using microgel electrophoresis. Irradiated cells produced typical comets, with DNA fragments migrating towards the anode, whereas DNA of non-irradiated cells exhibited only limited migration. Thirdly, electron spin resonance was tested for the detection of cellulose radicals, since these free radicals were expected to be quite stable in solid and dry foods; however, a small signal could be detected only in beans irradiated with 10 kGy. Fourthly, thermoluminescence, a method that analyzes mineral debris adhering to food, turned out to be a good choice for detecting irradiation effects in beans, even after 6 months of storage. The results indicate that three of the four techniques can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy).

  19. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    NARCIS (Netherlands)

    B.F. Voight (Benjamin); L.J. Scott (Laura); V. Steinthorsdottir (Valgerdur); A.D. Morris (Andrew); C. Dina (Christian); R.P. Welch (Ryan); E. Zeggini (Eleftheria); C. Huth (Cornelia); Y.S. Aulchenko (Yurii); G. Thorleifsson (Gudmar); L.J. McCulloch (Laura); T. Ferreira (Teresa); H. Grallert (Harald); N. Amin (Najaf); G. Wu (Guanming); C.J. Willer (Cristen); S. Raychaudhuri (Soumya); S.A. McCarroll (Steven); C. Langenberg (Claudia); O.M. Hofmann (Oliver); J. Dupuis (Josée); L. Qi (Lu); A.V. Segrè (Ayellet); M. van Hoek (Mandy); P. Navarro (Pau); K.G. Ardlie (Kristin); B. Balkau (Beverley); R. Benediktsson (Rafn); A.J. Bennett (Amanda); R. Blagieva (Roza); E.A. Boerwinkle (Eric); L.L. Bonnycastle (Lori); K.B. Boström (Kristina Bengtsson); B. Bravenboer (Bert); S. Bumpstead (Suzannah); N.P. Burtt (Noël); G. Charpentier (Guillaume); P.S. Chines (Peter); M. Cornelis (Marilyn); D.J. Couper (David); G. Crawford (Gabe); A.S.F. Doney (Alex); K.S. Elliott (Katherine); M.R. Erdos (Michael); C.S. Fox (Caroline); C.S. Franklin (Christopher); M. Ganser (Martha); C. Gieger (Christian); N. Grarup (Niels); T. Green (Todd); S. Griffin (Simon); C.J. Groves (Christopher); C. Guiducci (Candace); S. Hadjadj (Samy); N. Hassanali (Neelam); C. Herder (Christian); B. Isomaa (Bo); A.U. Jackson (Anne); P.R.V. Johnson (Paul); T. Jørgensen (Torben); W.H.L. Kao (Wen); N. Klopp (Norman); A. Kong (Augustine); P. Kraft (Peter); J. Kuusisto (Johanna); T. Lauritzen (Torsten); M. Li (Man); A. Lieverse (Aloysius); C.M. Lindgren (Cecilia); V. Lyssenko (Valeriya); M. Marre (Michel); T. Meitinger (Thomas); K. Midthjell (Kristian); M.A. Morken (Mario); N. Narisu (Narisu); P. Nilsson (Peter); K.R. Owen (Katharine); F. Payne (Felicity); J.R.B. Perry (John); A.K. Petersen; C. Platou (Carl); C. Proença (Christine); I. Prokopenko (Inga); W. Rathmann (Wolfgang); N.W. Rayner (Nigel William); N.R. Robertson (Neil); G. Rocheleau (Ghislain); M. Roden (Michael); M.J. Sampson (Michael); R. Saxena (Richa); B.M. 
Shields (Beverley); P. Shrader (Peter); G. Sigurdsson (Gunnar); T. Sparsø (Thomas); K. Strassburger (Klaus); H.M. Stringham (Heather); Q. Sun (Qi); A.J. Swift (Amy); B. Thorand (Barbara); J. Tichet (Jean); T. Tuomi (Tiinamaija); R.M. van Dam (Rob); T.W. van Haeften (Timon); T.W. van Herpt (Thijs); J.V. van Vliet-Ostaptchouk (Jana); G.B. Walters (Bragi); M.N. Weedon (Michael); C. Wijmenga (Cisca); J.C.M. Witteman (Jacqueline); R.N. Bergman (Richard); S. Cauchi (Stephane); F.S. Collins (Francis); A.L. Gloyn (Anna); U. Gyllensten (Ulf); T. Hansen (Torben); W.A. Hide (Winston); G.A. Hitman (Graham); A. Hofman (Albert); D. Hunter (David); K. Hveem (Kristian); M. Laakso (Markku); K.L. Mohlke (Karen); C.N.A. Palmer (Colin); P.P. Pramstaller (Peter Paul); I. Rudan (Igor); E.J.G. Sijbrands (Eric); L.D. Stein (Lincoln); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); M. Walker (Mark); N.J. Wareham (Nick); G.R. Abecasis (Gonçalo); B.O. Boehm (Bernhard); H. Campbell (Harry); M.J. Daly (Mark); A.T. Hattersley (Andrew); F.B. Hu (Frank); J.B. Meigs (James); J.S. Pankow (James); O. Pedersen (Oluf); H.E. Wichmann (Erich); I. Barroso (Inês); J.C. Florez (Jose); T.M. Frayling (Timothy); L. Groop (Leif); R. Sladek (Rob); U. Thorsteinsdottir (Unnur); J.F. Wilson (James); T. Illig (Thomas); P. Froguel (Philippe); P. Tikka-Kleemola (Päivi); J-A. Zwart (John-Anker); D. Altshuler (David); M. Boehnke (Michael); M.I. McCarthy (Mark); R.M. Watanabe (Richard)

    2010-01-01

    textabstractBy combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals w

  20. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    NARCIS (Netherlands)

    Voight, Benjamin F.; Scott, Laura J.; Steinthorsdottir, Valgerdur; Morris, Andrew P.; Dina, Christian; Welch, Ryan P.; Zeggini, Eleftheria; Huth, Cornelia; Aulchenko, Yurii S.; Thorleifsson, Gudmar; McCulloch, Laura J.; Ferreira, Teresa; Grallert, Harald; Amin, Najaf; Wu, Guanming; Willer, Cristen J.; Raychaudhuri, Soumya; McCarroll, Steve A.; Langenberg, Claudia; Hofmann, Oliver M.; Dupuis, Josee; Qi, Lu; Segre, Ayellet V.; van Hoek, Mandy; Navarro, Pau; Ardlie, Kristin; Balkau, Beverley; Benediktsson, Rafn; Bennett, Amanda J.; Blagieva, Roza; Boerwinkle, Eric; Bonnycastle, Lori L.; Bostrom, Kristina Bengtsson; Bravenboer, Bert; Bumpstead, Suzannah; Burtt, Noisel P.; Charpentier, Guillaume; Chines, Peter S.; Cornelis, Marilyn; Couper, David J.; Crawford, Gabe; Doney, Alex S. F.; Elliott, Katherine S.; Elliott, Amanda L.; Erdos, Michael R.; Fox, Caroline S.; Franklin, Christopher S.; Ganser, Martha; Gieger, Christian; Grarup, Niels; Green, Todd; Griffin, Simon; Groves, Christopher J.; Guiducci, Candace; Hadjadj, Samy; Hassanali, Neelam; Herder, Christian; Isomaa, Bo; Jackson, Anne U.; Johnson, Paul R. V.; Jorgensen, Torben; Kao, Wen H. L.; Klopp, Norman; Kong, Augustine; Kraft, Peter; Kuusisto, Johanna; Lauritzen, Torsten; Li, Man; Lieverse, Aloysius; Lindgren, Cecilia M.; Lyssenko, Valeriya; Marre, Michel; Meitinger, Thomas; Midthjell, Kristian; Morken, Mario A.; Narisu, Narisu; Nilsson, Peter; Owen, Katharine R.; Payne, Felicity; Perry, John R. B.; Petersen, Ann-Kristin; Platou, Carl; Proenca, Christine; Prokopenko, Inga; Rathmann, Wolfgang; Rayner, N. William; Robertson, Neil R.; Rocheleau, Ghislain; Roden, Michael; Sampson, Michael J.; Saxena, Richa; Shields, Beverley M.; Shrader, Peter; Sigurdsson, Gunnar; Sparso, Thomas; Strassburger, Klaus; Stringham, Heather M.; Sun, Qi; Swift, Amy J.; Thorand, Barbara; Tichet, Jean; Tuomi, Tiinamaija; van Dam, Rob M.; van Haeften, Timon W.; van Herpt, Thijs; van Vliet-Ostaptchouk, Jana V.; Walters, G. 
Bragi; Weedon, Michael N.; Wijmenga, Cisca; Witteman, Jacqueline; Bergman, Richard N.; Cauchi, Stephane; Collins, Francis S.; Gloyn, Anna L.; Gyllensten, Ulf; Hansen, Torben; Hide, Winston A.; Hitman, Graham A.; Hofman, Albert; Hunter, David J.; Hveem, Kristian; Laakso, Markku; Mohlke, Karen L.; Morris, Andrew D.; Palmer, Colin N. A.; Pramstaller, Peter P.; Rudan, Igor; Sijbrands, Eric; Stein, Lincoln D.; Tuomilehto, Jaakko; Uitterlinden, Andre; Walker, Mark; Wareham, Nicholas J.; Watanabe, Richard M.; Abecasis, Goncalo R.; Boehm, Bernhard O.; Campbell, Harry; Daly, Mark J.; Hattersley, Andrew T.; Hu, Frank B.; Meigs, James B.; Pankow, James S.; Pedersen, Oluf; Wichmann, H-Erich; Barroso, Ines; Florez, Jose C.; Frayling, Timothy M.; Groop, Leif; Sladek, Rob; Thorsteinsdottir, Unnur; Wilson, James F.; Illig, Thomas; Froguel, Philippe; van Duijn, Cornelia M.; Stefansson, Kari; Altshuler, David; Boehnke, Michael; McCarthy, Mark I.

    2010-01-01

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals with combined
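
    The two-stage design above (discovery plus follow-up) is typically combined by fixed-effect inverse-variance meta-analysis. A generic sketch, not the consortium's pipeline; the per-stage effect sizes here are invented:

```python
import math

def inverse_variance_meta(betas, ses):
    """Fixed-effect inverse-variance meta-analysis of per-stage effect
    sizes (e.g. log odds ratios) and their standard errors."""
    weights = [1 / se**2 for se in ses]          # weight = 1 / variance
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = 1 / math.sqrt(sum(weights))             # pooled standard error
    z = beta / se                                # combined association z-score
    return beta, se, z

# Discovery and follow-up estimates for one hypothetical SNP
beta, se, z = inverse_variance_meta([0.12, 0.09], [0.03, 0.02])
print(round(beta, 3), round(z, 2))
```

    Larger (more precise) stages dominate the pooled estimate, which is how the follow-up samples push suggestive discovery signals past genome-wide significance.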

  1. Using Work Action Analysis to Identify Web-Portal Requirements for a Professional Development Program

    Science.gov (United States)

    Nickles, George

    2007-01-01

    This article describes using Work Action Analysis (WAA) as a method for identifying requirements for a web-based portal that supports a professional development program. WAA is a cognitive systems engineering method for modeling multi-agent systems to support design and evaluation. A WAA model of the professional development program of the…

  2. Genome-wide association scan meta-analysis identifies three loci influencing adiposity and fat distribution

    NARCIS (Netherlands)

    C.M. Lindgren (Cecilia); I.M. Heid (Iris); J.C. Randall (Joshua); C. Lamina (Claudia); V. Steinthorsdottir (Valgerdur); L. Qi (Lu); E.K. Speliotes (Elizabeth); G. Thorleifsson (Gudmar); C.J. Willer (Cristen); B.M. Herrera (Blanca); A.U. Jackson (Anne); N. Lim (Noha); P. Scheet (Paul); N. Soranzo (Nicole); N. Amin (Najaf); Y.S. Aulchenko (Yurii); J.C. Chambers (John); A. Drong (Alexander); J. Luan; H.N. Lyon (Helen); F. Rivadeneira Ramirez (Fernando); S. Sanna (Serena); N. Timpson (Nicholas); M.C. Zillikens (Carola); H.Z. Jing; P. Almgren (Peter); S. Bandinelli (Stefania); A.J. Bennett (Amanda); R.N. Bergman (Richard); L.L. Bonnycastle (Lori); S. Bumpstead (Suzannah); S.J. Chanock (Stephen); L. Cherkas (Lynn); P.S. Chines (Peter); L. Coin (Lachlan); C. Cooper (Charles); G. Crawford (Gabe); A. Doering (Angela); A. Dominiczak (Anna); A.S.F. Doney (Alex); S. Ebrahim (Shanil); P. Elliott (Paul); M.R. Erdos (Michael); K. Estrada Gil (Karol); L. Ferrucci (Luigi); G. Fischer (Guido); N.G. Forouhi (Nita); C. Gieger (Christian); H. Grallert (Harald); C.J. Groves (Christopher); S.M. Grundy (Scott); C. Guiducci (Candace); D. Hadley (David); A. Hamsten (Anders); A.S. Havulinna (Aki); A. Hofman (Albert); R. Holle (Rolf); J.W. Holloway (John); T. Illig (Thomas); B. Isomaa (Bo); L.C. Jacobs (Leonie); K. Jameson (Karen); P. Jousilahti (Pekka); F. Karpe (Fredrik); J. Kuusisto (Johanna); J. Laitinen (Jaana); G.M. Lathrop (Mark); D.A. Lawlor (Debbie); M. Mangino (Massimo); W.L. McArdle (Wendy); T. Meitinger (Thomas); M.A. Morken (Mario); A.P. Morris (Andrew); P. Munroe (Patricia); N. Narisu (Narisu); A. Nordström (Anna); B.A. Oostra (Ben); C.N.A. Palmer (Colin); F. Payne (Felicity); J. Peden (John); I. Prokopenko (Inga); F. Renström (Frida); A. Ruokonen (Aimo); V. Salomaa (Veikko); M.S. Sandhu (Manjinder); L.J. Scott (Laura); A. Scuteri (Angelo); K. Silander (Kaisa); K. Song (Kijoung); X. Yuan (Xin); H.M. Stringham (Heather); A.J. Swift (Amy); T. Tuomi (Tiinamaija); M. 
Uda (Manuela); P. Vollenweider (Peter); G. Waeber (Gérard); C. Wallace (Chris); G.B. Walters (Bragi); M.N. Weedon (Michael); J.C.M. Witteman (Jacqueline); C. Zhang (Cuilin); M. Caulfield (Mark); F.S. Collins (Francis); G.D. Smith; I.N.M. Day (Ian); P.W. Franks (Paul); A.T. Hattersley (Andrew); F.B. Hu (Frank); M.-R. Jarvelin (Marjo-Riitta); A. Kong (Augustine); J.S. Kooner (Jaspal); M. Laakso (Markku); E. Lakatta (Edward); V. Mooser (Vincent); L. Peltonen (Leena Johanna); N.J. Samani (Nilesh); T.D. Spector (Timothy); D.P. Strachan (David); T. Tanaka (Toshiko); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); P. Tikka-Kleemola (Päivi); N.J. Wareham (Nick); H. Watkins (Hugh); D. Waterworth (Dawn); M. Boehnke (Michael); P. Deloukas (Panagiotis); L. Groop (Leif); D.J. Hunter (David); U. Thorsteinsdottir (Unnur); D. Schlessinger (David); H.E. Wichmann (Erich); T.M. Frayling (Timothy); G.R. Abecasis (Gonçalo); J.N. Hirschhorn (Joel); R.J.F. Loos (Ruth); J-A. Zwart (John-Anker); K.L. Mohlke (Karen); I. Barroso (Inês); M.I. McCarthy (Mark)

    2009-01-01

    textabstractTo identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evid

  3. Examination of a One-Trial Brief Experimental Analysis to Identify Reading Fluency Interventions

    Science.gov (United States)

    Andersen, Melissa N.; Daly, Edward J., III; Young, Nicholas D.

    2013-01-01

    This study sought to evaluate whether a one-trial brief experimental analysis (OTBEA) would reliably and validly identify effective treatments to improve oral reading fluency for 6 elementary school students referred for reading problems. An OTBEA was conducted with each participant to assess the effects of skill- and performance-based treatment…

  4. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    DEFF Research Database (Denmark)

    Voight, Benjamin F; Scott, Laura J; Steinthorsdottir, Valgerdur;

    2010-01-01

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals...

  5. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  6. Bioinformatics analysis identifies several intrinsically disordered human E3 ubiquitin-protein ligases

    DEFF Research Database (Denmark)

    Boomsma, Wouter Krogh; Nielsen, Sofie Vincents; Lindorff-Larsen, Kresten;

    2016-01-01

    conduct a bioinformatics analysis to examine >600 human and S. cerevisiae E3 ligases to identify enzymes that are similar to San1 in terms of function and/or mechanism of substrate recognition. An initial sequence-based database search was found to detect candidates primarily based on the homology...

  7. Genome-wide association study meta-analysis identifies seven new rheumatoid arthritis risk loci

    NARCIS (Netherlands)

    Stahl, Eli A.; Raychaudhuri, Soumya; Remmers, Elaine F.; Xie, Gang; Eyre, Stephen; Thomson, Brian P.; Li, Yonghong; Kurreeman, Fina A. S.; Zhernakova, Alexandra; Hinks, Anne; Guiducci, Candace; Chen, Robert; Alfredsson, Lars; Amos, Christopher I.; Ardlie, Kristin G.; Barton, Anne; Bowes, John; Brouwer, Elisabeth; Burtt, Noel P.; Catanese, Joseph J.; Coblyn, Jonathan; Coenen, Marieke J. H.; Costenbader, Karen H.; Criswell, Lindsey A.; Crusius, J. Bart A.; Cui, Jing; de Bakker, Paul I. W.; De Jager, Philip L.; Ding, Bo; Emery, Paul; Flynn, Edward; Harrison, Pille; Hocking, Lynne J.; Huizinga, Tom W. J.; Kastner, Daniel L.; Ke, Xiayi; Lee, Annette T.; Liu, Xiangdong; Martin, Paul; Morgan, Ann W.; Padyukov, Leonid; Posthumus, Marcel D.; Radstake, Timothy R. D. J.; Reid, David M.; Seielstad, Mark; Seldin, Michael F.; Shadick, Nancy A.; Steer, Sophia; Tak, Paul P.; Thomson, Wendy; van der Helm-van Mil, Annette H. M.; van der Horst-Bruinsma, Irene E.; van der Schoot, C. Ellen; van Riel, Piet L. C. M.; Weinblatt, Michael E.; Wilson, Anthony G.; Wolbink, Gert Jan; Wordsworth, B. Paul; Wijmenga, Cisca; Karlson, Elizabeth W.; Toes, Rene E. M.; de Vries, Niek; Begovich, Ann B.; Worthington, Jane; Siminovitch, Katherine A.; Gregersen, Peter K.; Klareskog, Lars; Plenge, Robert M.

    2010-01-01

    To identify new genetic risk factors for rheumatoid arthritis, we conducted a genome-wide association study meta-analysis of 5,539 autoantibody-positive individuals with rheumatoid arthritis (cases) and 20,169 controls of European descent, followed by replication in an independent set of 6,768 rheum

  8. Identifying Skill Requirements for GIS Positions: A Content Analysis of Job Advertisements

    Science.gov (United States)

    Hong, Jung Eun

    2016-01-01

    This study identifies the skill requirements for geographic information system (GIS) positions, including GIS analysts, programmers/developers/engineers, specialists, and technicians, through a content analysis of 946 GIS job advertisements from 2007-2014. The results indicated that GIS job applicants need to possess high levels of GIS analysis…

  9. Clinical Trial Registries Are of Minimal Use for Identifying Selective Outcome and Analysis Reporting

    Science.gov (United States)

    Norris, Susan L.; Holmer, Haley K.; Fu, Rongwei; Ogden, Lauren A.; Viswanathan, Meera S.; Abou-Setta, Ahmed M.

    2014-01-01

    Objective: This study aimed to examine selective outcome reporting (SOR) and selective analysis reporting (SAR) in randomized controlled trials (RCTs) and to explore the usefulness of trial registries for identifying SOR and SAR. Study Design and Setting: We selected one "index outcome" for each of three comparative effectiveness reviews…

  10. When noisy neighbors are a blessing: analysis of gene expression noise identifies coregulated genes

    NARCIS (Netherlands)

    Junker, J.P.; van Oudenaarden, A.

    2012-01-01

    In this issue of Molecular Cell, Stewart-Ornstein et al. (2012) use systematic pair-wise correlation analysis of expression noise in a large number of yeast genes to identify clusters of functionally related genes and signaling pathways responsible for elevated noise.

  11. Identifying sustainability issues using participatory SWOT analysis - A case study of egg production in the Netherlands

    NARCIS (Netherlands)

    Mollenhorst, H.; Boer, de I.J.M.

    2004-01-01

    The aim of this paper was to demonstrate how participatory strengths, weaknesses, opportunities and threats (SWOT) analysis can be used to identify relevant economic, ecological and societal (EES) issues for the assessment of sustainable development. This is illustrated by the case of egg production

  12. Gene expression meta-analysis identifies chromosomal regions involved in ovarian cancer survival

    DEFF Research Database (Denmark)

    Thomassen, Mads; Jochumsen, Kirsten M; Mogensen, Ole;

    2009-01-01

    Ovarian cancer cells exhibit complex karyotypic alterations causing deregulation of numerous genes. Some of these genes are probably causal for cancer formation and local growth, whereas others are causal for metastasis and recurrence. By using publicly available data sets, we have investigated the relation of gene expression and chromosomal position to identify chromosomal regions of importance for early recurrence of ovarian cancer. By use of Gene Set Enrichment Analysis, we have ranked chromosomal regions according to their association to survival. Over-representation analysis including 1-4 consecutive cytogenetic bands identified regions with increased expression at chromosome 5q12-14, and a very large region of chromosome 7 with the strongest signal at 7p15-13, among tumors from short-living patients. Reduced gene expression was identified at 4q26-32, 6p12-q15, 9p21-q32, and 11p14-11...
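
    Over-representation of survival-associated genes on a cytogenetic band, as in the analysis above, is commonly tested with the hypergeometric distribution. A minimal sketch with invented counts:

```python
from math import comb

def hypergeom_pval(k, K, n, N):
    """P(X >= k) when drawing n survival-associated genes out of N genes,
    where K of the N genes lie on the band of interest: a standard
    over-representation test in the spirit of the analysis described."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 8 of the 50 top-ranked genes fall on one band holding 100 of 10,000 genes;
# under the null we would expect only 50 * 100 / 10000 = 0.5 such genes.
print(hypergeom_pval(8, 100, 50, 10000) < 0.001)  # True: band is enriched
```

    Ranking bands by such p-values (or by enrichment scores, as in Gene Set Enrichment Analysis) highlights chromosomal regions whose coordinated expression change tracks survival.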

  13. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program for nanoparticle analysis and compares it to more conventional techniques: transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good addition to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles than the manual technique. However, particle shapes that deviate strongly from spherical proved problematic for the novel program as well. Compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the averaged data it provides from a very large number of particles; however, SAXS does not provide any data about the shape or appearance of the sample.
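
    The agglomerate problem described above is easy to reproduce: plain thresholding plus connected-component labelling counts touching particles as one object. A pure-Python sketch on a synthetic binary image (a stand-in for library routines such as scipy.ndimage.label):

```python
def count_components(binary):
    """4-connected component count via flood fill."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    n = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                n += 1                      # found a new object
                stack = [(r, c)]
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return n

grid = [[0] * 20 for _ in range(20)]
for r in range(4, 8):
    for c in range(4, 8):
        grid[r][c] = 1          # particle A
for r in range(7, 11):
    for c in range(7, 11):
        grid[r][c] = 1          # particle B, touching A -> an "agglomerate"
for r in range(14, 17):
    for c in range(14, 17):
        grid[r][c] = 1          # isolated particle C
print(count_components(grid))   # 2, not 3: A and B merge into one object
```

    Separating such merged objects (e.g. by watershed splitting) is exactly the capability the abstract credits to the new program.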

  14. Research fronts analysis: A bibliometric to identify emerging fields of research

    Science.gov (United States)

    Miwa, Sayaka; Ando, Satoko

    Research fronts analysis identifies emerging areas of research by observing co-citation clustering in highly cited papers. This article introduces the concept of research fronts analysis, explains its methodology and provides case examples. It also demonstrates developing research fronts in Japan by looking at past winners of the Thomson Reuters Research Fronts Awards. Research fronts analysis is currently being used by the Japanese government to determine new trends in science and technology. Information professionals can also utilize this bibliometric as a research evaluation tool.
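
    Co-citation clustering, the mechanism behind research fronts analysis, can be sketched with a toy corpus: two highly cited papers join the same front when they appear together in reference lists often enough. The threshold and data below are invented:

```python
from itertools import combinations
from collections import Counter

def research_fronts(citing_papers, threshold=2):
    """Toy research-fronts sketch: count co-citations of cited papers,
    then group papers connected by co-citation counts >= threshold."""
    co = Counter()
    for refs in citing_papers:
        for a, b in combinations(sorted(set(refs)), 2):
            co[(a, b)] += 1                      # co-cited in one paper

    parent = {}                                  # union-find over cited papers
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]        # path halving
            x = parent[x]
        return x

    for (a, b), count in co.items():
        if count >= threshold:
            parent[find(a)] = find(b)            # merge the two fronts

    fronts = {}
    for paper in parent:
        fronts.setdefault(find(paper), set()).add(paper)
    return sorted(map(sorted, fronts.values()))

# Reference lists of five citing papers (letters = highly cited papers)
citing = [['A', 'B'], ['A', 'B', 'C'], ['B', 'C'], ['D', 'E'], ['D', 'E']]
print(research_fronts(citing))  # [['A', 'B', 'C'], ['D', 'E']]
```

    Real research-fronts pipelines normalize co-citation counts and restrict the corpus to the most highly cited papers, but the clustering idea is the same.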

  15. Modal shape identification of large structure exposed to wind excitation by operational modal analysis technique

    Science.gov (United States)

    De Vivo, A.; Brutti, C.; Leofanti, J. L.

    2013-08-01

    Research efforts during recent decades qualify Operational Modal Analysis (OMA) as a tool able to identify the modal parameters of structures excited randomly by environmental loads, eliminating the problem of measuring the external exciting forces. In this paper, an existing OMA technique, the Natural Excitation Technique (NExT), was studied and implemented in order to obtain, from wind excitation, the modal parameters of Vega, the new European launch vehicle for small and medium satellites. Following a brief summary of the fundamental equations of the method, the modal parameters of Vega are calculated using the OMA technique; the results are then compared with those achieved using traditional Experimental Modal Analysis under excitation induced by shakers. The comparison shows very good agreement between the results obtained by the two methods, proving that OMA is a reliable tool for analysing the dynamic behaviour of large structures. This approach can be applied to any type of large structure in civil and mechanical engineering, and the technique appears very promising for further applications.

  16. Dynamic Cluster Analysis: An Unbiased Method for Identifying A+2 Element Containing Compounds in Liquid Chromatographic High-Resolution TOF Mass Spectrometric Data

    DEFF Research Database (Denmark)

    Andersen, Aaron John Christian; Hansen, Per Juel; Jørgensen, Kevin

    2016-01-01

    Dynamic Cluster Analysis (DCA) is an automated, unbiased technique which can identify Cl, Br, S, and other A+2 element containing metabolites in liquid chromatographic high resolution mass spectrometric data. DCA is based on three features, primarily the previously unutilised A+1 to A+2 isotope c...
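
    The A+2 isotope logic that DCA exploits rests on natural-abundance intensity ratios (37Cl/35Cl ≈ 0.32, 81Br/79Br ≈ 0.97, 34S/32S ≈ 0.044). A simplified single-atom check, not the DCA algorithm itself; the tolerance is illustrative:

```python
def a2_element_candidate(m_intensity, m2_intensity, element='Cl'):
    """Flag an ion as containing one Cl, Br, or S atom from the relative
    intensity of its A+2 isotopologue peak (natural-abundance ratios)."""
    expected = {'Cl': 0.320, 'Br': 0.973, 'S': 0.044}[element]
    ratio = m2_intensity / m_intensity
    return abs(ratio - expected) / expected < 0.15  # within 15% of theory

# A peak whose A+2 partner is ~31% as intense looks like a mono-chloro ion
print(a2_element_candidate(1.0e6, 3.1e5, 'Cl'))  # True
print(a2_element_candidate(1.0e6, 3.1e5, 'Br'))  # False
```

    High-resolution TOF data additionally separate the small exact-mass differences between A+2 isotopes, which is what lets an automated method assign the element rather than just flag "some A+2 pattern".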

  17. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
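
    The maximal Lyapunov exponent mentioned above measures the average exponential rate at which nearby trajectories diverge. The logistic map gives a minimal self-contained illustration (gait data would instead require state-space reconstruction, e.g. Rosenstein's method):

```python
import math

def lyapunov_logistic(r, n=10000, x0=0.4):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the average of log|f'(x)| along the orbit."""
    x, acc = x0, 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)))  # |f'(x)| = |r(1 - 2x)|
    return acc / n

print(lyapunov_logistic(3.2) < 0)  # periodic regime: nearby trajectories converge
print(lyapunov_logistic(4.0) > 0)  # chaotic regime: exponential divergence
```

    A positive exponent indicates local instability of the dynamics; as the abstract cautions, this is not the same thing as a globally unstable gait.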

  18. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single nucleotide to global measurement depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments concerning epigenetic technologies are showing promising results of DNA methylation levels at a single-base resolution and provide the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profile from microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the best method fitting for their nutritional research interests.
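
    At the single-locus level, array-based methylation data are usually summarised as a beta value or an M-value. A sketch with illustrative intensities; the offset conventions below are common choices (e.g. on Illumina-type platforms), not requirements:

```python
import math

def beta_value(methylated, unmethylated, offset=100):
    """Fraction of methylated signal, with a small offset to stabilise
    the ratio at low total intensity. Ranges from 0 to ~1."""
    return methylated / (methylated + unmethylated + offset)

def m_value(methylated, unmethylated, alpha=1):
    """Log-ratio of methylated to unmethylated signal; statistically
    better behaved than beta near the 0 and 1 boundaries."""
    return math.log2((methylated + alpha) / (unmethylated + alpha))

# A CpG site that is ~90% methylated in this hypothetical sample
meth, unmeth = 9000, 1000
print(round(beta_value(meth, unmeth), 3), round(m_value(meth, unmeth), 2))
```

    Nutritional studies comparing diets typically report differences in such beta or M-values between groups, site by site or averaged over regions.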

  19. A cross-species genetic analysis identifies candidate genes for mouse anxiety and human bipolar disorder

    Directory of Open Access Journals (Sweden)

    David G Ashbrook

    2015-07-01

    Full Text Available Bipolar disorder (BD) is a significant neuropsychiatric disorder with a lifetime prevalence of ~1%. To identify genetic variants underlying BD, genome-wide association studies (GWAS) have been carried out. While many variants of small effect associated with BD have been identified, few have yet been confirmed, partly because of the low power of GWAS due to the multiple comparisons being made. Complementary mapping studies using murine models have identified genetic variants for behavioral traits linked to BD, often with high power, but the identified regions often contain too many genes for clear identification of candidate genes. In the current study we have aligned human BD GWAS results and mouse linkage studies to help define and evaluate candidate genes linked to BD, seeking to combine the power of mouse mapping with the precision of GWAS. We use quantitative trait mapping for open field test and elevated zero maze data in the largest mammalian model system, the BXD recombinant inbred mouse population, to identify genomic regions associated with these BD-like phenotypes. We then investigate these regions in whole-genome data from the Psychiatric Genomics Consortium's bipolar disorder GWAS to identify candidate genes associated with BD. Finally, we establish the biological relevance and pathways of these genes in a comprehensive systems genetics analysis. We identify four genes associated with both mouse anxiety and human BD. While TNR is a novel candidate for BD, we can confirm previously suggested associations with CMYA5, MCTP1 and RXRG. A cross-species, systems genetics analysis shows that MCTP1, RXRG and TNR coexpress with genes linked to psychiatric disorders and identifies the striatum as a potential site of action. CMYA5, MCTP1, RXRG and TNR are associated with mouse anxiety and human BD. We hypothesize that MCTP1, RXRG and TNR influence intercellular signaling in the striatum.
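
    At a single marker, the quantitative trait mapping step described above reduces to regressing the phenotype on genotype dosage. A sketch with invented BXD-style data (a real scan repeats this at thousands of markers and corrects for multiple testing):

```python
import math

def marker_assoc(genotypes, phenotypes):
    """Single-marker association: slope and t-statistic from simple
    linear regression of a quantitative trait (e.g. open-field activity)
    on genotype dosage."""
    n = len(genotypes)
    mx = sum(genotypes) / n
    my = sum(phenotypes) / n
    sxx = sum((g - mx) ** 2 for g in genotypes)
    sxy = sum((g - mx) * (p - my) for g, p in zip(genotypes, phenotypes))
    beta = sxy / sxx                                   # effect per allele
    resid = [p - my - beta * (g - mx) for g, p in zip(genotypes, phenotypes)]
    s2 = sum(r * r for r in resid) / (n - 2)           # residual variance
    t = beta / math.sqrt(s2 / sxx)
    return beta, t

# Hypothetical strains: allele dosage at one marker vs an anxiety score
geno = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]
pheno = [10.2, 9.8, 10.5, 12.1, 11.8, 12.4, 14.0, 13.6, 14.3, 13.9]
b, t = marker_assoc(geno, pheno)
print(round(b, 2), t > 5)  # strong dose-dependent association
```

    Aligning such mouse QTL peaks with human GWAS regions, as the abstract describes, narrows broad linkage intervals down to a handful of candidate genes.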

  20. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-06-01

    Full Text Available This paper presents an overview of the present state of base isolation techniques, with special emphasis on these and a brief review of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. A simple case study on natural base isolation using naturally available soils is presented. Also, future areas of research are indicated. Earthquakes are one of nature's greatest hazards; throughout historic time they have caused significant loss of life and severe damage to property, especially to man-made structures. On the other hand, earthquakes provide architects and engineers with a number of important design criteria foreign to the normal design process. From well-established procedures reviewed by many researchers, seismic isolation may be used to provide an effective solution for a wide range of seismic design problems. The application of base isolation techniques to protect structures against damage from earthquake attacks has been considered one of the most effective approaches and has gained increasing acceptance during the last two decades. This is because base isolation limits the effects of the earthquake attack: a flexible base largely decouples the structure from the ground motion, and the structural response accelerations are usually less than the ground acceleration. In general, the addition of viscous damping in the structure may reduce the displacement and acceleration responses of the structure. This study also seeks to evaluate the effects of additional damping on the seismic response when compared with structures without additional damping for different ground motions.

  1. Identifying Critical Factors of Sale Failure on Commercial Property Types, Shop Houses by Using Multi Attribute Variable Technique

    Directory of Open Access Journals (Sweden)

    N.I. Mohamad

    2014-04-01

    Full Text Available The focus of this research is to identify the critical factors behind the sale failure of shop houses in Bandar Baru Nilai and, more broadly, the critical factors behind sale failure of commercial property of the shop-house type in new townships, as a report by the Valuation and Property Services Department (JPPH) showed that 5,931 shop-house units in Malaysia are currently completed but remain unsold, with Johor recording the highest number of unsold units, followed by Negeri Sembilan. Bandar Baru Nilai (a district of Negeri Sembilan) was chosen as the research sample for unsold shop-house units because of its strategic location, near KLIA, the International Sepang Circuit and educational institutions, and surrounded by housing schemes, yet it still has numerous unsold units. Data for the research were obtained from a literature review and survey questionnaires distributed among developers, the local authority, purchasers/tenants and local residents. The Relative Importance Index (RII) method is applied to identify the critical factors of shop-house sale failure. Generally, the factors of sale failure are the economy, demography, politics, location and access, public and basic facilities, financial loans, the physical product, current stock of shop houses upon completion, future potential of subsale and rental, the developer’s background, promotion and marketing, speculation and time.
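The Relative Importance Index used in this record follows the standard formula RII = ΣW / (A·N), where W are the respondents' ratings, A is the highest possible rating, and N is the number of respondents. A minimal sketch follows; the factor names and ratings are illustrative placeholders, not the study's survey data:

```python
def relative_importance_index(responses, max_rating=5):
    """RII = sum(ratings) / (A * N), where A is the highest rating on the
    scale and N the number of respondents; values fall in (0, 1]."""
    if not responses:
        raise ValueError("no responses")
    return sum(responses) / (max_rating * len(responses))

# Rank hypothetical sale-failure factors by RII (illustrative data only).
factors = {
    "economy": [5, 4, 5, 4, 5],
    "location and access": [3, 4, 4, 3, 4],
    "promotion and marketing": [2, 3, 2, 3, 2],
}
ranking = sorted(factors,
                 key=lambda f: relative_importance_index(factors[f]),
                 reverse=True)
```

Factors are then reported in descending RII order, the factor with the largest index being the most critical.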

  2. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Directory of Open Access Journals (Sweden)

    Mahmoud I. Al-Kadi

    2013-05-01

    Full Text Available Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject, and methods have been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and the excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development over the last decades, and provides a synopsis of the relevant methodologies and algorithms used to analyze EEG signals. In addition, it presents some of the physiological background of the EEG signal, developments in EEG signal processing, and effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate, to produce a flexible and reliable detection device.

  3. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    Science.gov (United States)

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject, and methods have been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and the excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development over the last decades, and provides a synopsis of the relevant methodologies and algorithms used to analyze EEG signals. In addition, it presents some of the physiological background of the EEG signal, developments in EEG signal processing, and effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate, to produce a flexible and reliable detection device.

  4. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  5. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H. [Center for Experimental Nuclear Physics and Astrophysics, and Department of Physics, University of Washington, Seattle, WA (United States); Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F. T. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Baldenegro-Barrera, C. X.; Bertrand, F. E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR’s germanium detectors allows for significant reduction of gamma background.

  6. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
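The abstract describes the tool as usable "like a Lomb-Scargle periodogram." For reference, the classical Lomb-Scargle search on irregularly sampled radial-velocity-like data can be sketched with SciPy; this is only the baseline method the authors compare against, not their compressed-sensing algorithm, and the synthetic signal below (7.5-day period, fixed random seed) is purely illustrative:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 200))      # irregular sampling epochs (days)
true_period = 7.5
y = 3.0 * np.sin(2 * np.pi * t / true_period) + 0.3 * rng.standard_normal(t.size)

# Scan trial periods; lombscargle expects *angular* frequencies.
periods = np.linspace(2, 50, 5000)
ang_freqs = 2 * np.pi / periods
power = lombscargle(t, y - y.mean(), ang_freqs, normalize=True)
best_period = periods[np.argmax(power)]    # strongest periodicity found
```

The compressed-sensing approach in the paper differs in that all periodicities are fit jointly, which suppresses the alias peaks a single-frequency scan like this one produces.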

  7. Radial velocity data analysis with compressed sensing techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2017-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  8. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  9. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an introduction and 11 independent chapters devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to theoretical aspects while others present practical aspects and the...

  10. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  11. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    Science.gov (United States)

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

    Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implication of lower extremity technique for upper extremity loads, injury and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and comparing these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fastball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended the knee during the follow-through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase, with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended, approaching a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between the adolescent lower extremity kinematics and those of adult pitchers; however, a more comprehensive analysis using similar methods is needed for a complete comparison.

  12. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    Science.gov (United States)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps and incidents are attributed to human error. As a part of quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as analysis tools to identify contributing factors, assess their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of launch-vehicle-related mishap data. This modifiable framework can be used and followed by other space and similarly complex operations.
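The HEART quantification referred to above is conventionally computed as a generic task unreliability multiplied by a factor for each error-producing condition (EPC): HEP = GTT × Π[(max_effect − 1) × assessed_proportion + 1]. A minimal sketch follows; the task unreliability and EPC values are illustrative numbers, not figures from the presentation:

```python
def heart_hep(generic_task_unreliability, epcs):
    """Human Error Probability per the HEART method:
    HEP = GTT_unreliability * prod((max_effect - 1) * proportion + 1),
    where epcs is a list of (max_multiplier, assessed_proportion) pairs
    and the result is capped at 1 (it is a probability)."""
    hep = generic_task_unreliability
    for max_effect, proportion in epcs:
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)

# Illustrative only: a generic task with nominal unreliability 0.09 and
# two EPCs (max multipliers 11 and 3, assessed at 40% and 50% effect).
hep = heart_hep(0.09, [(11, 0.4), (3, 0.5)])
```

Each EPC multiplier interpolates between 1 (no effect) and its published maximum, so the assessed proportion is the analyst's judgment of how strongly that condition applies.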

  13. Cluster analysis of spontaneous preterm birth phenotypes identifies potential associations among preterm birth mechanisms

    Science.gov (United States)

    Esplin, M Sean; Manuck, Tracy A.; Varner, Michael W.; Christensen, Bryce; Biggio, Joseph; Bukowski, Radek; Parry, Samuel; Zhang, Heping; Huang, Hao; Andrews, William; Saade, George; Sadovsky, Yoel; Reddy, Uma M.; Ilekis, John

    2015-01-01

    Objective We sought to employ an innovative tool based on common biological pathways to identify specific phenotypes among women with spontaneous preterm birth (SPTB), in order to enhance investigators' ability to highlight common mechanisms and underlying genetic factors responsible for SPTB. Study Design A secondary analysis of a prospective case-control multicenter study of SPTB. All cases delivered a preterm singleton at ≤34.0 weeks' gestation. Each woman was assessed for the presence of underlying SPTB etiologies. A hierarchical cluster analysis was used to identify groups of women with homogeneous phenotypic profiles. One of the phenotypic clusters was selected for candidate gene association analysis using VEGAS software. Results 1028 women with SPTB were assigned phenotypes. Hierarchical clustering of the phenotypes revealed five major clusters. Cluster 1 (N=445) was characterized by maternal stress, cluster 2 (N=294) by premature membrane rupture, cluster 3 (N=120) by familial factors, and cluster 4 (N=63) by maternal comorbidities. Cluster 5 (N=106) was multifactorial, characterized by infection (INF), decidual hemorrhage (DH) and placental dysfunction (PD). These three phenotypes were highly correlated by chi-square analysis [PD and DH (p<2.2e-6); PD and INF (p=6.2e-10); INF and DH (p=0.0036)]. Gene-based testing identified the INS (insulin) gene as significantly associated with cluster 3 of SPTB. Conclusion We identified 5 major clusters of SPTB based on a phenotype tool and hierarchical clustering. There was significant correlation between several of the phenotypes. The INS gene was associated with familial factors underlying SPTB. PMID:26070700
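Hierarchical clustering of phenotype profiles, as described above, can be sketched with standard tools. The binary phenotype matrix below is a small synthetic stand-in (two planted groups), not the study's data, and Ward linkage is one common choice; the abstract does not state which linkage the authors used:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy phenotype matrix: rows = subjects, columns = putative etiologies
# (illustrative stand-ins). Group 1 loads on column 0, group 2 on column 1.
rng = np.random.default_rng(1)
profiles = np.vstack([
    rng.random((20, 6)) < [0.9, 0.1, 0.1, 0.1, 0.1, 0.1],
    rng.random((20, 6)) < [0.1, 0.9, 0.1, 0.1, 0.1, 0.1],
]).astype(float)

# Agglomerative clustering, then cut the dendrogram into two clusters.
Z = linkage(profiles, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

In practice the number of clusters is chosen by inspecting the dendrogram (the study reports five), rather than fixed in advance as here.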

  14. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques, and then explores the model order reduction approach and its significance. The traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. They are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a unique technique for generating the genetic design from the tree-structured transfer function obtained from a Bond Graph. This research work combines Bond Graphs for model representation with Genetic Programming for exploring different ideas in the design space; the tree-structured transfer function results from replacing typical Bond Graph elements with their impedance equivalents, specifying impedance laws for the Bond Graph multiport. The tree-structured form thus obtained from the Bond Graph is applied for generating the genetic tree. Application studies will identify key issues important for advancing this approach towards becoming an effective and efficient design tool for synthesizing designs for electrical systems. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function obtained with the conventional and Bond Graph methods are analyzed, and then an approach towards model order reduction is observed. The suggested algorithm and other known modern model order reduction techniques are applied to an 11th-order high pass filter [1], with different approaches. The model order reduction technique developed in this paper has the least reduction errors and, secondly, the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and by the Bond Graph method are compared and

  15. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I&C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity analysis guidelines.
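The fault tree analysis (FTA) surveyed above quantifies a top event by combining basic-event probabilities through AND/OR gates. A minimal sketch of that gate arithmetic follows, with a hypothetical two-channel mini tree and made-up probabilities (the report itself stresses insights over specific numbers, so these values are purely illustrative):

```python
def or_gate(probs):
    """P(OR) = 1 - prod(1 - p_i), assuming independent basic events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(AND) = prod(p_i), assuming independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical mini fault tree (illustrative probabilities only):
# top event = (sensor fault OR software fault) AND (backup channel fault).
primary = or_gate([1e-3, 5e-4])   # primary channel fails
top = and_gate([primary, 2e-3])   # both channels fail together
```

Note the independence assumption in both gates: common-mode failures, the very concern the report addresses with defense-in-depth and diversity, violate it and require explicit common-cause terms.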

  16. Combined Analysis of SNP Array Data Identifies Novel CNV Candidates and Pathways in Ependymoma and Mesothelioma

    Directory of Open Access Journals (Sweden)

    Gabriel Wajnberg

    2015-01-01

    Full Text Available Copy number variation is a class of structural genomic modification that includes the gain and loss of a specific genomic region, which may include an entire gene. Many studies have used low-resolution techniques to identify regions that are frequently lost or amplified in cancer. Usually, researchers choose proprietary or non-open-source software to detect these regions because the graphical interface tends to be easier to use. In this study, we combined two different open-source packages into an innovative strategy to identify novel copy number variations and pathways associated with cancer. We used published mesothelioma and ependymoma datasets to assess our tool. We detected previously described and novel copy number variations that are associated with cancer chemotherapy resistance. We also identified altered pathways associated with these diseases, such as cell adhesion in patients with mesothelioma and negative regulation of glutamatergic synaptic transmission in ependymoma patients. In conclusion, we present a novel strategy using open-source software to identify copy number variations and altered pathways associated with cancer.

  17. Enhanced Analysis Techniques for an Imaging Neutron and Gamma Ray Spectrometer

    Science.gov (United States)

    Madden, Amanda C.

    The presence of gamma rays and neutrons is a strong indicator of the presence of Special Nuclear Material (SNM). The imaging Neutron and gamma ray SPECTrometer (NSPECT), developed by the University of New Hampshire and Michigan Aerospace Corporation, detects the fast neutrons and prompt gamma rays from fissile material, and the gamma rays from radioactive material. The instrument operates as a double-scatter device, requiring a neutron or a gamma ray to interact twice in the instrument. While this detection requirement decreases the efficiency of the instrument, it offers superior background rejection and the ability to measure the energy and momentum of the incident particle. These measurements create energy spectra and images of the emitting source for source identification and localization. The dual-species instrument provides superior detection compared with a single species alone. In realistic detection scenarios, few particles are detected from a potential threat because of source shielding, detection at a distance, high background, and weak sources. This contributes to a small signal-to-noise ratio, and threat detection becomes difficult. To address these difficulties, several enhanced data analysis tools were developed. A Receiver Operating Characteristic (ROC) curve helps set instrumental alarm thresholds as well as identify the presence of a source. Analysis of a dual-species ROC curve provides superior detection capabilities. Bayesian analysis helps to detect and identify the presence of a source through model comparisons, and helps create a background-corrected count spectrum for enhanced spectroscopy. Development of an instrument response using simulations and numerical analyses will help perform spectral and image deconvolution. This thesis will outline the principles of operation of the NSPECT instrument using the double-scatter technology, traditional analysis techniques, and enhanced analysis techniques as applied to data from the NSPECT instrument, and an
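An ROC curve of the kind described is built by sweeping a threshold over detector scores and plotting true-positive against false-positive rate. The sketch below uses generic scores with made-up labels, not NSPECT data, and computes the curve and its area with plain NumPy:

```python
import numpy as np

def roc_points(scores, labels):
    """FPR/TPR pairs swept over descending score thresholds.
    labels: 1 = source present, 0 = background."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    order = np.argsort(-scores)           # most source-like first
    hits = labels[order]
    tpr = np.concatenate([[0.0], np.cumsum(hits) / hits.sum()])
    fpr = np.concatenate([[0.0], np.cumsum(1 - hits) / (len(hits) - hits.sum())])
    return fpr, tpr

def auc(fpr, tpr):
    """Trapezoidal area under the ROC curve (0.5 = chance, 1 = perfect)."""
    return float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0))

# Illustrative detector scores: higher = more source-like.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
labels = [1, 1, 0, 1, 0, 0]
fpr, tpr = roc_points(scores, labels)
```

An alarm threshold is then chosen as the point on the curve that trades false alarms against missed detections acceptably; a dual-species ROC does the same sweep over a combined neutron-plus-gamma score.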

  18. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  19. Pattern recognition software and techniques for biological image analysis.

    Science.gov (United States)

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  20. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented on iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
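The idea of rectifying a curved grid via separable one-dimensional resampling passes can be illustrated with a single such pass. The toy "fan-shaped" coordinate axis below is a stand-in for the true range-Doppler geometry (which depends on squint angle and Doppler parameters the abstract only names), and linear interpolation stands in for whatever kernel the real processor uses:

```python
import numpy as np

def resample_rows(img, src_coords, dst_coords):
    """One 1-D resampling pass: each row of `img`, sampled at the
    (possibly nonuniform) `src_coords`, is interpolated onto the
    regular grid `dst_coords`. The second pass would do the same
    column-wise, giving the full 2-D rectification."""
    return np.vstack([np.interp(dst_coords, src_coords, row) for row in img])

# Toy example: rows sampled on a slightly nonuniform ("fan-shaped") axis,
# rectified onto a uniform axis.
src = np.linspace(0, 1, 50) ** 1.1   # nonuniform source coordinates
dst = np.linspace(0, 1, 50)          # uniform target grid
img = np.sin(2 * np.pi * src)[None, :].repeat(4, axis=0)
rect = resample_rows(img, src, dst)
```

Because each pass is one-dimensional, the mapping for every row (then every column) can be precomputed once, which is what makes the scheme cheap enough for near-real-time throughput.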

  1. Wavelength resolved neutron transmission analysis to identify single crystal particles in historical metallurgy

    Science.gov (United States)

    Barzagli, E.; Grazzi, F.; Salvemini, F.; Scherillo, A.; Sato, H.; Shinohara, T.; Kamiyama, T.; Kiyanagi, Y.; Tremsin, A.; Zoppi, Marco

    2014-07-01

    The phase composition and the microstructure of four ferrous Japanese arrows of the Edo period (17th-19th century) have been determined through two complementary neutron techniques: position-sensitive wavelength-resolved neutron transmission analysis (PS-WRNTA) and time-of-flight neutron diffraction (ToF-ND). The standard ToF-ND technique was applied using the INES diffractometer at the ISIS pulsed neutron source in the UK, while the innovative PS-WRNTA measurements were performed at the J-PARC neutron source on the BL-10 NOBORU beam line using a high-spatial-, high-time-resolution neutron imaging detector. With ToF-ND we were able to obtain information about the quantitative distribution of the metal and non-metal phases, the texture level, the strain level, and the domain size of each sample, which are important parameters for assessing the technological level of Japanese weapons. Building on these data, the more complex PS-WRNTA was applied to the same samples. This experimental technique exploits the presence of so-called Bragg edges in the time-of-flight spectrum of neutrons transmitted through crystalline materials to map the microstructural properties of samples. The two techniques are non-invasive and can be easily applied in archaeometry for accurate microstructure mapping of metal and ceramic artifacts.
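
    The Bragg edges such transmission spectra exhibit fall at wavelengths λ = 2·d_hkl, beyond which a lattice-plane family can no longer diffract. A quick sketch for bcc α-iron (lattice parameter a ≈ 2.866 Å is a literature value, not taken from this record):

```python
import numpy as np

a = 2.866  # bcc alpha-iron lattice parameter in angstroms (literature value)

# Allowed bcc reflections satisfy h + k + l even; each family produces a
# transmission Bragg edge at lambda = 2 * d_hkl
edges = {}
for h, k, l in [(1, 1, 0), (2, 0, 0), (2, 1, 1), (2, 2, 0)]:
    d = a / np.sqrt(h**2 + k**2 + l**2)
    edges[(h, k, l)] = 2 * d

for hkl, lam in sorted(edges.items(), key=lambda kv: -kv[1]):
    print(hkl, f"edge at {lam:.3f} A")
```

The (110) edge near 4.05 Å is the longest-wavelength edge for bcc iron, which is why it dominates transmission maps of ferritic steel artifacts.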

  2. Manure management and greenhouse gas mitigation techniques : a comparative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Langmead, C.

    2003-09-03

    Alberta is the second largest agricultural producer in Canada, ranking just behind Ontario. Approximately 62 per cent of the province's farm cash receipts are attributable to the livestock industry. Farmers today maintain large numbers of a single animal type. The drivers for more advanced manure management systems include: the trend towards confined feeding operations (CFOs), which creates large, concentrated quantities of manure; public perception of CFOs; implementation of provincial legislation regulating the expansion and construction of CFOs; interest, raised by ratification of the Kyoto Protocol, in the development of improved manure management systems capable of reducing greenhouse gas (GHG) emissions; and rising energy costs. The highest methane emission factors are found with liquid manure management systems, which contribute more than 80 per cent of the total methane emissions from livestock manure in Alberta. The author identified and analyzed three manure management techniques to mitigate GHG emissions: bio-digesters, gasification systems, and composting. Three recommendations were made to establish a strategy to support emissions offsets and maximize the reduction of methane emissions from the livestock industry. The implementation of bio-digesters, especially for the swine industry, was recommended. It was suggested that a gasification pilot project for poultry manure should be pursued by Climate Change Central. Public outreach programs promoting composting of cattle manure for beef feedlots and older-style dairy barns should also be established. 19 refs., 11 tabs., 3 figs.

  3. Modified C-band technique for the analysis of chromosome abnormalities in irradiated human lymphocytes

    Energy Technology Data Exchange (ETDEWEB)

    Nakata, Akifumi; Akiyama, Miho; Yamada, Yuji [Biodosimetry Section, Department of Radiation Dosimetry, Research Center for Radiation Emergency Medicine, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan); Yoshida, Mitsuaki A., E-mail: myoshida@cc.hirosaki-u.ac.jp [Biodosimetry Section, Department of Radiation Dosimetry, Research Center for Radiation Emergency Medicine, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan)

    2011-10-15

    A modified C-band technique was developed in order to analyze more accurately dicentric, tricentric, and ring chromosomes in irradiated human peripheral lymphocytes. Instead of the original method relying on treatment with barium hydroxide (Ba(OH)₂), C-bands were obtained using a modified form of heat treatment in formamide followed by DAPI staining. This method was tentatively applied to the analysis of dicentric chromosomes in irradiated human lymphocytes to examine its applicability. The frequency of dicentric chromosomes was almost the same with conventional Giemsa staining as with the modified C-band technique. In analysis using Giemsa staining, it is relatively difficult to identify the centromere on elongated chromosomes, over-condensed chromosomes, fragments, and acentric rings. The modified C-band method used in this study makes it easier to identify the centromere on such chromosomes than Giemsa staining alone, and may therefore give more information about the location of the centromere. This makes the method more useful for biological dose estimation based on the analysis of dicentric chromosomes in human lymphocytes exposed to radiation. Furthermore, this method is simpler and faster than the original C-band protocol and than fluorescence in situ hybridization (FISH) with a centromeric DNA probe. - Highlights: > The dicentric (dic) assay is the most effective assay for radiation biodosimetry. > It is important to recognize the centromere of the dic. > We improved a C-band technique based on heat denaturation. > This technique enables accurate detection of the centromere. > The method may be more useful for biological dose estimation.

  4. Emerging techniques for soil analysis via mid-infrared spectroscopy

    Science.gov (United States)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, in particular: 1. Attenuated total reflectance (ATR). ATR is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range, as well as the absorbance of some soil constituents (e.g., calcium carbonate), interferes with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photoacoustic spectroscopy (PAS). PAS is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are
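
    As one example of the post-processing mentioned for ATR spectra, a derivative transform is a common way to suppress broad water or carbonate backgrounds; this is a generic illustration on a synthetic spectrum, not one of the specific methods cited in the record. The band position (~1350 1/cm) and band/baseline shapes are invented for the sketch.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic ATR absorbance spectrum: a sharp analyte band on a broad, sloping baseline
wavenumber = np.linspace(800, 1800, 500)                 # 1/cm
band = 0.20 * np.exp(-((wavenumber - 1350) / 30) ** 2)   # analyte band
baseline = 0.5 + 3e-4 * (wavenumber - 800)               # slowly varying background
spectrum = band + baseline

# Second-derivative preprocessing: the near-linear baseline is annihilated,
# while the sharp band survives (inverted) and can be used for calibration
d2 = savgol_filter(spectrum, window_length=31, polyorder=3, deriv=2)
peak_wn = wavenumber[np.argmin(d2)]   # 2nd derivative is most negative at the band centre
```

Savitzky-Golay differentiation is used here because plain finite differences would amplify noise on real spectra.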

  5. Genome-wide interaction-based association analysis identified multiple new susceptibility Loci for common diseases.

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2011-03-01

    Genome-wide interaction-based association (GWIBA) analysis has the potential to identify novel susceptibility loci. These interaction effects could be missed with the prevailing approaches in genome-wide association studies (GWAS). However, no convincing loci have been discovered exclusively with GWIBA methods, and the intensive computation involved is a major barrier to application. Here, we developed a fast, multi-thread/parallel program named "pair-wise interaction-based association mapping" (PIAM) for exhaustive two-locus searches. With this program, we performed a complete GWIBA analysis on seven diseases with stringent control for false positives, and we validated the results for three of these diseases. We identified one pair-wise interaction between a previously identified locus, C1orf106, and one new locus, TEC, that was specific for Crohn's disease, with a Bonferroni-corrected P < 0.05 (P = 0.039). This interaction was replicated with a pair of proxy linked loci (P = 0.013) on an independent dataset. Five other interactions had corrected P < 0.5. We identified the allelic effect of a locus close to SLC7A13 for coronary artery disease. This was replicated with a linked locus on an independent dataset (P = 1.09 × 10⁻⁷). Through a local validation analysis that evaluated association signals, rather than locus-based associations, we found that several other regions showed association/interaction signals with nominal P < 0.05. In conclusion, this study demonstrated that the GWIBA approach was successful for identifying novel loci, and the results provide new insights into the genetic architecture of common diseases. In addition, our PIAM program was capable of handling the very large GWAS datasets that are likely to be produced in the future.
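
    The exhaustive two-locus scan at the heart of such an analysis can be caricatured with a joint-association chi-square test over all SNP pairs. This is only a sketch: the data are simulated, and a chi-square on a 2x9 phenotype-by-genotype table tests joint association rather than a pure interaction effect; PIAM's actual statistics, multi-threading, and false-positive control are far more elaborate.

```python
import numpy as np
from scipy.stats import chi2_contingency
from itertools import combinations

rng = np.random.default_rng(1)
n, n_snps = 2000, 6
geno = rng.integers(0, 3, size=(n, n_snps))   # genotypes coded 0/1/2
# Hypothetical interaction between SNPs 0 and 1: elevated risk only when
# both loci carry at least one minor allele
risk = 0.2 + 0.4 * ((geno[:, 0] > 0) & (geno[:, 1] > 0))
case = rng.random(n) < risk

results = []
for i, j in combinations(range(n_snps), 2):
    # 2 x 9 contingency table: phenotype vs. joint two-locus genotype
    cell = geno[:, i] * 3 + geno[:, j]
    table = np.array([np.bincount(cell[case], minlength=9),
                      np.bincount(cell[~case], minlength=9)])
    table = table[:, table.sum(axis=0) > 0]   # drop empty genotype cells
    chi2, p, _, _ = chi2_contingency(table)
    results.append(((i, j), p))

best_pair, best_p = min(results, key=lambda r: r[1])
```

Even this crude scan recovers the simulated interacting pair; the combinatorial cost (all pairs) is exactly why the authors needed a parallel implementation.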

  6. Analysis of regulatory protease sequences identified through bioinformatic data mining of the Schistosoma mansoni genome

    Directory of Open Access Journals (Sweden)

    Minchella Dennis J

    2009-10-01

    Background: New chemotherapeutic agents against Schistosoma mansoni, an etiological agent of human schistosomiasis, are a priority due to the emerging drug resistance and the inability of current drug treatments to prevent reinfection. Proteases have been under scrutiny as targets of immunological or chemotherapeutic anti-Schistosoma agents because of their vital role in many stages of the parasitic life cycle. Function has been established for only a handful of identified S. mansoni proteases, and the vast majority of these are digestive proteases; very few of the conserved classes of regulatory proteases have been identified from Schistosoma species, despite their vital role in numerous cellular processes. To that end, we identified protease protein-coding genes from the S. mansoni genome project and EST library. Results: We identified 255 protease sequences from five catalytic classes using predicted proteins of the S. mansoni genome. The vast majority of these show significant similarity to proteins in KEGG and the Conserved Domain Database. Proteases include calpains, caspases, cytosolic and mitochondrial signal peptidases, proteases that interact with ubiquitin and ubiquitin-like molecules, and proteases that perform regulated intramembrane proteolysis. Comparative analysis of classes of important regulatory proteases finds conserved active site domains and, where appropriate, signal peptides and transmembrane helices. Phylogenetic analysis provides support for inferring functional divergence among regulatory aspartic, cysteine, and serine proteases. Conclusion: Numerous proteases are identified for the first time in S. mansoni. We characterized important regulatory proteases and focus analysis on these proteases to complement the growing knowledge base of digestive proteases. This work provides a foundation for expanding knowledge of proteases in Schistosoma species and examining their diverse function and potential as targets.

  7. Patent Network Analysis and Quadratic Assignment Procedures to Identify the Convergence of Robot Technologies

    Science.gov (United States)

    Lee, Woo Jin; Lee, Won Kyung

    2016-01-01

    Because of the remarkable developments in robotics in recent years, technological convergence has been active in this area. We focused on finding patterns of convergence within robot technology using network analysis of patents in both the USPTO and KIPO. To identify the variables that affect convergence, we used quadratic assignment procedures (QAP). From our analysis, we observed the patent network ecology related to convergence and found technologies that have great potential to converge with other robotics technologies. The results of our study are expected to contribute to setting up convergence-based R&D policies for robotics, which can lead to new innovation. PMID:27764196
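
    The QAP step itself is easy to sketch: correlate two inter-technology relation matrices, then build the null distribution by permuting node labels, i.e. permuting rows and columns together. The matrices below are simulated stand-ins for patent co-classification data; the variable names are hypothetical.

```python
import numpy as np

def qap_correlation(A, B, n_perm=2000, seed=0):
    """QAP: correlate two relational matrices; significance via joint row/column permutation."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    off = ~np.eye(n, dtype=bool)                 # compare off-diagonal dyads only
    obs = np.corrcoef(A[off], B[off])[0, 1]
    exceed = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        Bp = B[np.ix_(p, p)]                     # permute rows and columns together
        if abs(np.corrcoef(A[off], Bp[off])[0, 1]) >= abs(obs):
            exceed += 1
    return obs, (exceed + 1) / (n_perm + 1)

# Hypothetical example: two 12-node technology-relation matrices sharing structure
rng = np.random.default_rng(3)
A = rng.random((12, 12)); A = (A + A.T) / 2
B = A + rng.normal(0.0, 0.1, A.shape); B = (B + B.T) / 2
r, p_value = qap_correlation(A, B)
```

Permuting whole rows and columns together, rather than individual cells, is what preserves the dyadic dependence structure that makes ordinary correlation tests invalid for network data.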

  8. Identifying barriers to patient acceptance of active surveillance: content analysis of online patient communications.

    Directory of Open Access Journals (Sweden)

    Mark V Mishra

    OBJECTIVES: Qualitative research aimed at identifying patient acceptance of active surveillance (AS) has been identified as a public health research priority. The primary objective of this study was to determine whether analysis of a large sample of anonymous internet conversations (ICs) could be utilized to identify unmet public needs regarding AS. METHODS: English-language ICs regarding prostate cancer (PC) treatment with AS from 2002 to 2012 were identified using a novel internet search methodology. Web spiders were developed to mine, aggregate, and analyze content from the world-wide web for ICs centered on AS. Collection of ICs was not restricted to any specific geographic region of origin. Natural language processing (NLP) was used to evaluate content and perform a sentiment analysis, with each conversation scored as positive, negative, or neutral. A sentiment index (SI) was subsequently calculated to compare temporal trends in public sentiment towards AS: SI = [(#Positive ICs/#Total ICs) - (#Negative ICs/#Total ICs)] x 100. RESULTS: A total of 464 ICs were identified. Sentiment increased from -13 to +2 over the study period. The increase in sentiment was driven by increased patient emphasis on quality-of-life factors and endorsement of AS by national medical organizations. Unmet needs identified in these ICs include: a gap in quantitative data regarding long-term outcomes with AS vs. conventional treatments, a desire for treatment information from an unbiased specialist, and an absence of public role models managed with AS. CONCLUSIONS: This study demonstrates the potential utility of online patient communications to provide insight into patient preferences and decision-making. Based on our findings, we recommend that multidisciplinary clinics consider including an unbiased specialist to present treatment options and that future decision tools for AS include quantitative data regarding outcomes after AS.
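
    The sentiment index is straightforward to compute; the counts below are illustrative only, not the study's data.

```python
def sentiment_index(n_pos, n_neg, n_neutral):
    """SI = [(#positive / total) - (#negative / total)] * 100, per the study's definition."""
    total = n_pos + n_neg + n_neutral
    return 100.0 * (n_pos - n_neg) / total

# Illustrative counts: a mix skewed 13 points negative reproduces the
# study-period starting value of -13 (these are not the study's counts)
si = sentiment_index(20, 33, 47)   # -> -13.0
```

Note that neutral conversations enter only through the denominator, so a surge of neutral discussion pulls the index toward zero without changing its sign.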

  9. Factor Analysis of the DePaul Symptom Questionnaire: Identifying Core Domains

    OpenAIRE

    Jason, Leonard A.; Sunnquist, Madison; Brown, Abigail; Furst, Jacob; Cid, Marjoe; Farietta, Jillianna; Kot, Bobby; Bloomer, Craig; Nicholson, Laura; Williams, Yolonda; Jantke, Rachel; Newton, Julia L.; Strand, Elin Bolle

    2015-01-01

    The present study attempted to identify critical symptom domains of individuals with Myalgic Encephalomyelitis (ME) and chronic fatigue syndrome (CFS). Using patient and control samples collected in the United States, Great Britain, and Norway, exploratory factor analysis (EFA) was used to establish the underlying factor structure of ME and CFS symptoms. The EFA suggested a four-factor solution: post-exertional malaise, cognitive dysfunction, sleep difficulties, and a combined factor consisti...

  10. Emergent team roles in organizational meetings: Identifying communication patterns via cluster analysis.

    OpenAIRE

    Lehmann-Willenbrock, N.K.; Beck, S.J.; Kauffeld, S.

    2016-01-01

    Previous team role taxonomies have largely relied on self-report data, focused on functional roles, and described individual predispositions or personality traits. Instead, this study takes a communicative approach and proposes that team roles are produced, shaped, and sustained in communicative behaviors. To identify team roles communicatively, 59 regular organizational meetings were videotaped and analyzed. Cluster analysis revealed five emergent roles: the solution seeker, the problem anal...

  11. Integrated Analysis for Identifying Radix Astragali and Its Adulterants Based on DNA Barcoding

    OpenAIRE

    Sihao Zheng; Dewang Liu; Weiguang Ren; Juan Fu; Linfang Huang; Shilin Chen

    2014-01-01

    Radix Astragali is a popular herb used in traditional Chinese medicine for its proimmune and antidiabetic properties. However, methods are needed to help distinguish Radix Astragali from its varied adulterants. DNA barcoding is a widely applicable molecular method used to identify medicinal plants. Yet, its use has been hampered by genetic distance, base variation, and limitations of the bio-NJ tree. Herein, we report the validation of an integrated analysis method for plant species identific...

  12. Genome-wide meta-analysis identifies new susceptibility loci for migraine

    DEFF Research Database (Denmark)

    Anttila, Verneri; Winsvold, Bendik S; Gormley, Padhraig

    2013-01-01

    Migraine is the most common brain disorder, affecting approximately 14% of the adult population, but its molecular mechanisms are poorly understood. We report the results of a meta-analysis across 29 genome-wide association studies, including a total of 23,285 individuals with migraine (cases) and 95,425 population-matched controls. We identified 12 loci associated with migraine susceptibility (P

  13. Unscented Kalman filter with parameter identifiability analysis for the estimation of multiple parameters in kinetic models

    OpenAIRE

    Baker Syed; Poskar C; Junker Björn

    2011-01-01

    In systems biology, experimentally measured parameters are not always available, necessitating the use of computationally based parameter estimation. In order to rely on estimated parameters, it is critical to first determine which parameters can be estimated for a given model and measurement set. This is done with parameter identifiability analysis. A kinetic model of the sucrose accumulation in the sugar cane culm tissue developed by Rohwer et al. was taken as a test case model. Wh...
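
    One standard identifiability check, sketched here on a hypothetical two-parameter kinetic model rather than the Rohwer et al. sucrose model, inspects the singular values of the output sensitivity matrix: a near-zero singular value flags a parameter combination that the chosen measurements cannot constrain.

```python
import numpy as np

def model(params, t):
    """Toy kinetic model y(t) = a * exp(-k * t); 'a' and 'k' stand in for kinetic parameters."""
    a, k = params
    return a * np.exp(-k * t)

def sensitivity_matrix(params, t, eps=1e-6):
    """Finite-difference output sensitivities dy/dtheta at each measurement time."""
    y0 = model(params, t)
    S = np.empty((t.size, len(params)))
    for j in range(len(params)):
        p = np.array(params, dtype=float)
        p[j] += eps
        S[:, j] = (model(p, t) - y0) / eps
    return S

t = np.linspace(0, 5, 20)          # assumed measurement times
S = sensitivity_matrix([2.0, 0.8], t)

# A tiny ratio of smallest to largest singular value would indicate a
# practically unidentifiable parameter combination for this measurement set
sv = np.linalg.svd(S, compute_uv=False)
identifiable = sv.min() / sv.max() > 1e-6
```

For this model and sampling schedule both parameters are identifiable; measuring only at t = 0 would collapse the k-column of S and break identifiability.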

  14. Parallel analysis of tagged deletion mutants efficiently identifies genes involved in endoplasmic reticulum biogenesis.

    Science.gov (United States)

    Wright, Robin; Parrish, Mark L; Cadera, Emily; Larson, Lynnelle; Matson, Clinton K; Garrett-Engele, Philip; Armour, Chris; Lum, Pek Yee; Shoemaker, Daniel D

    2003-07-30

    Increased levels of HMG-CoA reductase induce cell type- and isozyme-specific proliferation of the endoplasmic reticulum. In yeast, the ER proliferations induced by Hmg1p consist of nuclear-associated stacks of smooth ER membranes known as karmellae. To identify genes required for karmellae assembly, we compared the composition of populations of homozygous diploid S. cerevisiae deletion mutants following 20 generations of growth with and without karmellae. Using an initial population of 1,557 deletion mutants, 120 potential mutants were identified as a result of three independent experiments. Each experiment produced a largely non-overlapping set of potential mutants, suggesting that differences in specific growth conditions could be used to maximize the comprehensiveness of similar parallel analysis screens. Only two genes, UBC7 and YAL011W, were identified in all three experiments. Subsequent analysis of individual mutant strains confirmed that each experiment was identifying valid mutations, based on the mutant's sensitivity to elevated HMG-CoA reductase and inability to assemble normal karmellae. The largest class of HMG-CoA reductase-sensitive mutations was a subset of genes that are involved in chromatin structure and transcriptional regulation, suggesting that karmellae assembly requires changes in transcription or that the presence of karmellae may interfere with normal transcriptional regulation.

  15. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISARs) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs. time. Since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.

  16. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    ENGLISH ABSTRACT: A technique that helps management to reduce costs and improve quality is ‘lean supply chain management’, which focuses on the elimination of all wastes in every stage of the supply chain and is derived from ‘agile production’. This research aims to assess and rank the suppliers in an auto industry, based upon the concept of ‘production leanness’. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We have examined the literature about leanness, and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA technique.

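
    The PCA ranking step can be sketched with questionnaire-style data: standardize the criterion scores, project each supplier onto the first principal component, and rank by that score. The data below are synthetic (8 suppliers, 10 criteria, not the study's 76 factors), and note that the sign of a principal component is arbitrary, so in practice it must be oriented so that higher means leaner.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical questionnaire data: 8 suppliers scored 1-5 on 10 leanness criteria
scores = rng.integers(1, 6, size=(8, 10)).astype(float)

# Standardize each criterion, then take the first principal component via SVD
Z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
pc1 = Z @ Vt[0]                       # first-PC score per supplier
ranking = np.argsort(pc1)[::-1]       # suppliers ordered by PC1 score
explained = s[0]**2 / (s**2).sum()    # variance share captured by PC1
```

Ranking on PC1 is meaningful only when it explains a large share of the variance; with many weakly correlated criteria, several components (or a weighted composite) may be needed.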

  17. Hot spot analysis applied to identify ecosystem services potential in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Depellegrin, Daniel; Misiune, Ieva

    2016-04-01

    Hot spot analysis is very useful for identifying areas with similar characteristics. This is important for a sustainable use of the territory, since we can identify areas that need to be protected or restored. It is a great advantage in terms of land use planning and management, since we can allocate resources, reduce economic costs, and make better interventions in the landscape. Ecosystem services (ES) differ according to land use. Since the landscape is very heterogeneous, it is of major importance to understand the spatial pattern of ES and where the areas that provide more or fewer services are located. The objective of this work is to use hot spot analysis to identify the areas with the most valuable ES in Lithuania. The CORINE land cover (CLC) classification of 2006, on a 100 m resolution grid, was used as the main spatial information, from which a total of 31 land use types were extracted. ES ranking was carried out based on expert knowledge: experts were asked to evaluate the ES potential of each CLC class from 0 (no potential) to 5 (very high potential). Hot spots were identified using the Getis-Ord test, a cluster analysis tool available in the ArcGIS toolbox that identifies areas with significantly low and significantly high values at a p level of 0.05. In this work we used hot spot analysis to assess the distribution of providing, regulating, cultural, and total (the sum of the previous three) ES. The Z value calculated from the Getis-Ord test was used in the statistical analysis to assess the clusters of providing, regulating, cultural, and total ES; a high Z value for a service indicates a high number of cluster areas with high ES potential. The results showed that the Z score differed significantly among services (Kruskal-Wallis ANOVA = 834.607, p < 0.05), being higher for providing services than for cultural (0.080±1.979) and regulating (0.076±1.961) services. These results suggest that providing services are more clustered than the remaining ones. Ecosystem Services Z score were
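
    The Getis-Ord Gi* statistic behind such a hot spot analysis can be sketched directly. This is a simplified 1-D toy landscape with binary contiguity weights (including self-neighbourhood, as Gi* requires); ArcGIS handles weight construction, corrections, and projections far more carefully.

```python
import numpy as np

def getis_ord_g_star(x, W):
    """
    Getis-Ord Gi* z-scores. x: attribute value per cell; W: binary neighbour
    matrix including self-neighbourhood (w_ii = 1), as required for Gi*.
    """
    n = x.size
    xbar, s = x.mean(), x.std()           # S uses the population std (ddof = 0)
    Wi = W.sum(axis=1)
    num = W @ x - xbar * Wi
    den = s * np.sqrt((n * (W**2).sum(axis=1) - Wi**2) / (n - 1))
    return num / den

# 1-D toy landscape: a cluster of high ES potential in the middle cells
x = np.array([1, 1, 1, 5, 5, 5, 1, 1, 1], dtype=float)
idx = np.arange(x.size)
W = (np.abs(np.subtract.outer(idx, idx)) <= 1).astype(float)  # self + adjacent
z = getis_ord_g_star(x, W)
```

The centre of the high-value run exceeds the z = 1.96 threshold (p = 0.05, two-sided), so it would be flagged as a hot spot, while isolated high values would not be.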

  18. Driving forces of change in environmental indicators an analysis based on divisia index decomposition techniques

    CERN Document Server

    González, Paula Fernández; Presno, Mª José

    2014-01-01

    This book addresses several index decomposition analysis methods to assess progress made by EU countries in the last decade in relation to energy and climate change concerns. Several applications of these techniques are carried out in order to decompose changes in both energy and environmental aggregates. In addition to this, a new methodology based on classical spline approximations is introduced, which provides useful mathematical and statistical properties. Once a suitable set of determinant factors has been identified, these decomposition methods allow the researcher to quantify the respec
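
    A minimal example of one such technique, additive log-mean Divisia index (LMDI) decomposition, on a two-factor identity (the numbers are illustrative, not from the book):

```python
import numpy as np

def logmean(a, b):
    """Logarithmic mean, the weighting function used in LMDI decomposition."""
    return a if np.isclose(a, b) else (a - b) / (np.log(a) - np.log(b))

# Emissions = activity * intensity (a minimal Kaya-style identity)
act0, int0 = 100.0, 0.50      # base year
actT, intT = 120.0, 0.40      # final year
v0, vT = act0 * int0, actT * intT

L = logmean(vT, v0)
d_activity = L * np.log(actT / act0)    # contribution of activity growth
d_intensity = L * np.log(intT / int0)   # contribution of efficiency gains
```

Additive LMDI is exact, with no residual: the factor contributions sum to the total change vT - v0, which is the main reason this family of decompositions is preferred for energy and emissions accounting.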

  19. Combination of meta-analysis and graph clustering to identify prognostic markers of ESCC

    Directory of Open Access Journals (Sweden)

    Hongyun Gao

    2012-01-01

    Esophageal squamous cell carcinoma (ESCC) is one of the most malignant gastrointestinal cancers and occurs at a high frequency in China and other Asian countries. Recently, several molecular markers were identified for predicting ESCC. Notwithstanding, additional prognostic markers, with a clear understanding of their underlying roles, are still required. Using bioinformatics, a graph-clustering method, DPClus, was used to detect co-expressed modules. The aim was to identify a set of discriminating genes that could be used for predicting ESCC through graph-clustering and GO-term analysis. The results showed that CXCL12, CYP2C9, TGM3, MAL, S100A9, EMP-1 and SPRR3 were highly associated with ESCC development. In our study, all their predicted roles were in line with previous reports, supporting the assumption that a combination of meta-analysis, graph-clustering and GO-term analysis is effective both for identifying differentially expressed genes and for reflecting their functions in ESCC.

  20. Combination of meta-analysis and graph clustering to identify prognostic markers of ESCC.

    Science.gov (United States)

    Gao, Hongyun; Wang, Lishan; Cui, Shitao; Wang, Mingsong

    2012-04-01

    Esophageal squamous cell carcinoma (ESCC) is one of the most malignant gastrointestinal cancers and occurs at a high frequency in China and other Asian countries. Recently, several molecular markers were identified for predicting ESCC. Notwithstanding, additional prognostic markers, with a clear understanding of their underlying roles, are still required. Using bioinformatics, a graph-clustering method, DPClus, was used to detect co-expressed modules. The aim was to identify a set of discriminating genes that could be used for predicting ESCC through graph-clustering and GO-term analysis. The results showed that CXCL12, CYP2C9, TGM3, MAL, S100A9, EMP-1 and SPRR3 were highly associated with ESCC development. In our study, all their predicted roles were in line with previous reports, supporting the assumption that a combination of meta-analysis, graph-clustering and GO-term analysis is effective both for identifying differentially expressed genes and for reflecting their functions in ESCC.
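
    The co-expression module detection step can be caricatured in a few lines: build a correlation graph over genes and extract dense groups. Here simple connected components stand in for DPClus's density-based clustering, the expression data are simulated, and the gene names are used only as labels.

```python
import numpy as np

rng = np.random.default_rng(4)
genes = ["CXCL12", "CYP2C9", "TGM3", "MAL", "S100A9", "EMP-1", "SPRR3", "CTRL1", "CTRL2"]
n_samples = 50

# Simulated expression matrix: the first four genes share a driving signal
signal = rng.normal(size=n_samples)
expr = rng.normal(size=(n_samples, len(genes)))
expr[:, :4] += 3.0 * signal[:, None]

corr = np.corrcoef(expr.T)
adj = (np.abs(corr) > 0.7) & ~np.eye(len(genes), dtype=bool)  # co-expression graph

def components(adj):
    """Connected components by depth-first search (stand-in for density clustering)."""
    seen, comps = set(), []
    for s in range(adj.shape[0]):
        if s in seen:
            continue
        stack, comp = [s], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(np.flatnonzero(adj[v]))
        seen |= comp
        comps.append(sorted(map(int, comp)))
    return comps

modules = [c for c in components(adj) if len(c) > 1]
```

The co-expressed group is recovered as the single non-trivial module; DPClus differs in allowing overlapping, density-controlled clusters rather than whole components.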

  1. Effective Boolean dynamics analysis to identify functionally important genes in large-scale signaling networks.

    Science.gov (United States)

    Trinh, Hung-Cuong; Kwon, Yung-Keun

    2015-11-01

    Efficiently identifying functionally important genes in order to understand the minimal requirements of normal cellular development is challenging. To this end, a variety of structural measures have been proposed and their effectiveness has been investigated in recent literature; however, few studies have shown the effectiveness of dynamics-based measures. This led us to investigate a dynamic measure for identifying functionally important genes, whose effectiveness was verified through application to two large-scale human signaling networks. We specifically consider Boolean sensitivity-based dynamics against an update-rule perturbation (BSU) as a dynamic measure. Through investigations on the two networks, we found that genes with relatively high BSU values show slower evolutionary rates and higher proportions of essential genes and drug targets than other genes. Gene-ontology analysis showed clear differences between the former and latter groups of genes. Furthermore, we compared the identification accuracies of essential genes and drug targets via BSU and five well-known structural measures. Although BSU did not always show the best performance, it effectively identified a putative set of genes significantly different from the results obtained via the structural measures. Most interestingly, BSU showed the highest synergy effect in identifying the functionally important genes in conjunction with other measures. Our results imply that Boolean-sensitivity dynamics can be used as a measure to effectively identify functionally important genes in signaling networks.
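
    The idea of Boolean sensitivity against an update-rule perturbation can be sketched on a small random Boolean network: negate one gene's update rule and measure how far the perturbed trajectory drifts from the unperturbed one. This is a toy reconstruction of the flavor of the measure, not the authors' exact definition of BSU.

```python
import numpy as np

rng = np.random.default_rng(5)
N, K = 8, 2                                    # genes; inputs per gene
inputs = np.array([rng.choice(N, K, replace=False) for _ in range(N)])
tables = rng.integers(0, 2, size=(N, 2**K))    # random Boolean update rules

def step(state, perturb_gene=None):
    """Synchronous update; optionally negate one gene's rule (update-rule perturbation)."""
    nxt = np.empty(N, dtype=int)
    for g in range(N):
        bits = state[inputs[g]]
        idx = int(bits @ (2 ** np.arange(K - 1, -1, -1)))
        v = tables[g, idx]
        nxt[g] = 1 - v if g == perturb_gene else v
    return nxt

def sensitivity(gene, n_states=64, horizon=10):
    """Mean Hamming distance between normal and rule-perturbed trajectories."""
    total = 0
    for _ in range(n_states):
        s = rng.integers(0, 2, N)
        a, b = s.copy(), s.copy()
        for _ in range(horizon):
            a, b = step(a), step(b, perturb_gene=gene)
        total += int(np.sum(a != b))
    return total / (n_states * N)

scores = [sensitivity(g) for g in range(N)]
candidate = int(np.argmax(scores))   # gene whose rule perturbation spreads the most
```

Genes whose perturbation propagates widely through the network score highest, which is the intuition behind treating high-sensitivity genes as functionally important.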

  2. Association of two techniques of frontal sinus radiographic analysis for human identification

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira da SILVA

    2009-09-01

    Introduction: The analysis of images for human identification purposes is a routine activity in departments of forensic medicine, especially when it is necessary to identify burned bodies, skeletal remains, or corpses in an advanced stage of decomposition. Case report: We show the feasibility and reliability of analyzing the morphoradiographic image of the frontal sinus, displayed in a posteroanterior (PA) skull radiograph produced during life and compared with another produced post-mortem. Conclusion: The results obtained in the radiographic comparison, through the association of two different techniques of frontal sinus analysis, allowed a positive match between the identity of the missing person and the body in an advanced stage of decomposition.

  3. Accuracy of qualitative analysis for assessment of skilled baseball pitching technique.

    Science.gov (United States)

    Nicholls, Rochelle; Fleisig, Glenn; Elliott, Bruce; Lyman, Stephen; Osinski, Edmund

    2003-07-01

    Baseball pitching must be performed with correct technique if injuries are to be avoided and performance maximized. High-speed video analysis is accepted as the most accurate and objective method for evaluating baseball pitching mechanics. The aim of this research was to develop an equivalent qualitative analysis method for use with standard video equipment. A qualitative analysis protocol (QAP) was developed for 24 kinematic variables identified as important to pitching performance. Twenty male baseball pitchers were videotaped using 60 Hz camcorders, and their technique was evaluated using the QAP by two independent raters. Each pitcher was also assessed using a 6-camera 200 Hz Motion Analysis system (MAS). Four QAP variables (22%) showed significant similarity with MAS results. Inter-rater reliability showed agreement on 33% of QAP variables. It was concluded that a complete and accurate profile of an athlete's pitching mechanics cannot be made using the QAP in its current form, but it is possible that such simple forms of biomechanical analysis could yield accurate results before 3-D methods become obligatory.
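
    Inter-rater agreement of the kind reported above ("agreement on 33% of QAP variables") is commonly quantified with Cohen's kappa, which corrects raw agreement for chance. The sketch below uses made-up ratings, not the study's data:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n)
                for c in set(r1) | set(r2))
    return (p_obs - p_exp) / (1 - p_exp)

# Two raters scoring six (hypothetical) technique variables
rater1 = ["ok", "ok", "fault", "ok", "fault", "ok"]
rater2 = ["ok", "fault", "fault", "ok", "ok", "ok"]
print(round(cohens_kappa(rater1, rater2), 2))  # → 0.25
```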

  4. Characteristics of identifying linear dynamic models from impulse response data using Prony analysis

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, D.J.

    1992-12-01

    The purpose of the study was to investigate the characteristics of fitting linear dynamic models to the impulse response of oscillatory dynamic systems using Prony analysis. Many dynamic systems exhibit oscillatory responses with multiple modes of oscillations. Although the underlying dynamics of such systems are often nonlinear, it is frequently possible and very useful to represent the system operating about some set point with a linear model. Derivation of such linear models can be done using two basic approaches: model the system using theoretical derivations and some linearization method such as a Taylor series expansion; or use a curve-fitting technique to optimally fit a linear model to specified system response data. Prony analysis belongs to the second class of system modeling because it is a method of fitting a linear model to the impulse response of a dynamic system. Its parallel formulation inherently makes it well suited for fitting models to oscillatory system data. Such oscillatory dynamic effects occur in large synchronous-generator-based power systems in the form of electromechanical oscillations. To study and characterize these oscillatory dynamics, BPA has developed computer codes to analyze system data using Prony analysis. The objective of this study was to develop a highly detailed understanding of the properties of using Prony analysis to fit models to systems with characteristics often encountered in power systems. This understanding was then extended to develop general "rules-of-thumb" for using Prony analysis. The general characteristics were investigated by performing fits to data from known linear models under controlled conditions. The conditions studied include various mathematical solution techniques; different parent system configurations; and a large variety of underlying noise characteristics.
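
    Prony analysis as described, fitting a linear model to an impulse response, reduces to a linear-prediction fit followed by polynomial rooting and an amplitude solve. A minimal NumPy sketch under those assumptions (the `prony` helper and the synthetic signal are illustrative, not BPA's code):

```python
import numpy as np

def prony(y, p):
    """Classic Prony fit: y[k] ≈ Σ A_i z_i**k with p exponential modes."""
    y = np.asarray(y, dtype=complex)
    N = len(y)
    # Step 1: linear prediction — y[k+p] = a_0 y[k] + ... + a_{p-1} y[k+p-1]
    T = np.array([y[i:i + p] for i in range(N - p)])
    a, *_ = np.linalg.lstsq(T, y[p:], rcond=None)
    # Step 2: the modes are roots of z**p - a_{p-1} z**(p-1) - ... - a_0
    z = np.roots(np.concatenate(([1.0], -a[::-1])))
    # Step 3: amplitudes from the Vandermonde system V A = y
    V = np.vander(z, N, increasing=True).T
    A, *_ = np.linalg.lstsq(V, y, rcond=None)
    return z, A

# Synthetic impulse response with one damped oscillatory mode
k = np.arange(40)
y = 0.9 ** k * np.cos(0.5 * k)        # true modes: z = 0.9·exp(±0.5i)
z, A = prony(y, 2)
print(np.abs(z))                      # both mode magnitudes ≈ 0.9
```

    The recovered mode magnitudes give the damping and the angles give the oscillation frequency, which is exactly the modal information sought in electromechanical-oscillation studies.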

  6. Vibration impact acoustic emission technique for identification and analysis of defects in carbon steel tubes: Part A Statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-04-15

    Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are done manually by skilled personnel. This paper presents a statistical analysis of high-frequency stress wave signals captured with a newly developed noninvasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals were introduced into ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube serving as the reference and four steel tubes with a through-hole artificial defect at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of a defect in carbon steel tubes from AE signals captured using the non-invasive VIAE technique.
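
    The statistical features named above (r.m.s., energy, crest factor) are straightforward to compute from a sampled waveform. A sketch with synthetic signals, assuming a 1 MHz sampling rate; the waveforms are made up for illustration, not the paper's AE data:

```python
import numpy as np

def ae_features(signal, fs):
    """Waveform features named in the text: peak amplitude, r.m.s.,
    crest factor, and energy (discrete-time approximations)."""
    peak = np.max(np.abs(signal))
    rms = np.sqrt(np.mean(signal ** 2))
    return {
        "peak_amplitude": peak,
        "rms": rms,
        "crest_factor": peak / rms,          # impulsiveness indicator
        "energy": np.sum(signal ** 2) / fs,
    }

t = np.arange(10_000) / 1e6                  # 10 ms sampled at 1 MHz
burst = np.exp(-800 * t) * np.sin(2 * np.pi * 150e3 * t)   # defect-like burst
steady = np.sin(2 * np.pi * 150e3 * t)                     # continuous tone
print(ae_features(burst, 1e6)["crest_factor"] >
      ae_features(steady, 1e6)["crest_factor"])            # → True
```

    A short burst concentrates its amplitude in a few samples, so its crest factor is far higher than that of a steady tone, which is why crest factor helps flag defect-related events.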

  7. Efficient behavior of photosynthetic organelles via Pareto optimality, identifiability, and sensitivity analysis.

    Science.gov (United States)

    Carapezza, Giovanni; Umeton, Renato; Costanza, Jole; Angione, Claudio; Stracquadanio, Giovanni; Papini, Alessio; Lió, Pietro; Nicosia, Giuseppe

    2013-05-17

    In this work, we develop methodologies for analyzing and cross-comparing metabolic models. We investigate three important metabolic networks to discuss the complexity of biological organization of organisms, modeling, and system properties. In particular, we analyze these metabolic networks because of their biotechnological and basic-science importance: the photosynthetic carbon metabolism in a general leaf, the Rhodobacter spheroides bacterium, and the Chlamydomonas reinhardtii alga. We adopt single- and multi-objective optimization algorithms to maximize the CO2 uptake rate and the production of metabolites of industrial interest or for ecological purposes. We focus both on the level of genes (e.g., finding genetic manipulations to increase the production of one or more metabolites) and on finding enzyme concentrations that improve CO2 consumption. We find that R. spheroides is able to absorb CO2 at rates of up to 57.452 mmol h⁻¹ gDW⁻¹, while C. reinhardtii reaches a maximum of 6.7331 in the same units. We report that the Pareto front analysis proves extremely useful for comparing different organisms, as well as providing the possibility to investigate them within the same framework. By using sensitivity and robustness analysis, our framework identifies the most sensitive and fragile components of the biological systems we take into account, allowing us to compare their models. We adopt identifiability analysis to detect functional relations among enzymes; we observe that RuBisCO, GAPDH, and FBPase belong to the same functional group, as suggested also by the sensitivity analysis.
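
    Pareto-front analysis as used above keeps only the non-dominated trade-off points. A minimal sketch for two maximized objectives; the trade-off points are hypothetical, loosely echoing a CO2-uptake-versus-metabolite-production trade-off:

```python
def pareto_front(points):
    """Keep the non-dominated points when every objective is maximized:
    q dominates p if q is >= p in all objectives and q != p."""
    return [p for p in points
            if not any(all(qi >= pi for qi, pi in zip(q, p)) and q != p
                       for q in points)]

# Hypothetical (CO2 uptake, metabolite production) trade-off points
pts = [(57.4, 1.0), (40.0, 3.5), (30.0, 3.0), (20.0, 5.0)]
print(pareto_front(pts))   # → [(57.4, 1.0), (40.0, 3.5), (20.0, 5.0)]
```

    The dominated point (30.0, 3.0) is dropped because (40.0, 3.5) is better in both objectives; the surviving front is what allows different organisms or strategies to be compared within one framework.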

  8. Gene expression signature analysis identifies vorinostat as a candidate therapy for gastric cancer.

    Directory of Open Access Journals (Sweden)

    Sofie Claerhout

    Full Text Available BACKGROUND: Gastric cancer continues to be one of the deadliest cancers in the world, and identification of new drugs targeting this type of cancer is therefore of significant importance. The purpose of this study was to identify and validate a therapeutic agent which might improve the outcomes for gastric cancer patients in the future. METHODOLOGY/PRINCIPAL FINDINGS: Using microarray technology, we generated a gene expression profile of human gastric cancer-specific genes from human gastric cancer tissue samples. We used this profile in the Broad Institute's Connectivity Map analysis to identify candidate therapeutic compounds for gastric cancer. We found the histone deacetylase inhibitor vorinostat as the lead compound and thus a potential therapeutic drug for gastric cancer. Vorinostat induced both apoptosis and autophagy in gastric cancer cell lines. Pharmacological and genetic inhibition of autophagy, however, increased the therapeutic efficacy of vorinostat, indicating that a combination of vorinostat with autophagy inhibitors may be therapeutically more beneficial. Moreover, gene expression analysis of gastric cancer identified a collection of genes (ITGB5, TYMS, MYB, APOC1, CBX5, PLA2G2A, and KIF20A) whose expression was elevated in gastric tumor tissue and downregulated more than 2-fold by vorinostat treatment in gastric cancer cell lines. In contrast, SCGB2A1, TCN1, CFD, APLP1, and NQO1 manifested a reversed pattern. CONCLUSIONS/SIGNIFICANCE: We showed that analysis of gene expression signatures may represent an emerging approach to discover therapeutic agents for gastric cancer, such as vorinostat. The observation of altered gene expression after vorinostat treatment may provide clues to the molecular mechanism of vorinostat and to those patients likely to benefit from vorinostat treatment.

  9. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge about the constituting material's origin, heritage authentication and restoration, provenance, migration, and social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even though the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months ± 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  10. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N;

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  11. Identifying training deficiencies in military pilots by applying the human factors analysis and classification system.

    Science.gov (United States)

    Li, Wen-Chin; Harris, Don

    2013-01-01

    Without accurate analysis, it is difficult to identify training needs and develop the content of training programs required for preventing aviation accidents. The human factors analysis and classification system (HFACS) is based on Reason's system-wide model of human error. In this study, 523 accidents from the Republic of China Air Force were analyzed in which 1762 human errors were categorized. The results of the analysis showed that errors of judgment and poor decision-making were commonly reported amongst pilots. As a result, it was concluded that there was a need for military pilots to be trained specifically in making decisions in tactical environments. However, application of HFACS also allowed the identification of systemic training deficiencies within the organization further contributing to the accidents observed.

  12. Sentiment Analysis of Twitter tweets using supervised classification technique

    Directory of Open Access Journals (Sweden)

    Pranav Waykar

    2016-05-01

    Full Text Available Making use of social media to analyze the perceptions of the masses about a product, event or person has gained momentum in recent times. Out of a wide array of social networks, we chose Twitter for our analysis as the opinions expressed there are concise and bear a distinctive polarity. Here, we collect the most recent tweets on a user's area of interest and analyze them. The extracted tweets are then segregated as positive, negative and neutral. We perform the classification in the following manner: collect the tweets using the Twitter API; process the collected tweets to convert all letters to lowercase, eliminate special characters, etc., which makes the classification more efficient; and classify the processed tweets using a supervised classification technique. We make use of a Naive Bayes classifier to segregate the tweets as positive, negative and neutral. We use a set of sample tweets to train the classifier. The percentage of tweets in each category is then computed and the result represented graphically. The result can be used to gain insight into the views of the people using Twitter about a particular topic being searched by the user. It can help corporate houses devise strategies on the basis of the popularity of their products among the masses, and it may help consumers make informed choices based on the general sentiment expressed by Twitter users about a product.
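
    The pipeline described, lowercase, clean, then classify with Naive Bayes, can be sketched as follows. The tokenizer, training tweets, and labels are toy stand-ins, not the authors' data or code:

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Multinomial Naive Bayes with Laplace smoothing — a minimal sketch
    of the supervised classifier described above (toy tokenizer)."""
    def fit(self, docs, labels):
        self.counts = defaultdict(Counter)   # label -> word counts
        self.priors = Counter(labels)
        self.vocab = set()
        for doc, label in zip(docs, labels):
            words = doc.lower().split()
            self.counts[label].update(words)
            self.vocab.update(words)
        return self

    def predict(self, doc):
        def log_prob(label):
            total = sum(self.counts[label].values())
            lp = math.log(self.priors[label] / sum(self.priors.values()))
            for w in doc.lower().split():
                lp += math.log((self.counts[label][w] + 1) /
                               (total + len(self.vocab)))
            return lp
        return max(self.priors, key=log_prob)

train = ["great product love it", "awful waste of money",
         "love the service", "terrible awful experience"]
labels = ["positive", "negative", "positive", "negative"]
clf = NaiveBayes().fit(train, labels)
print(clf.predict("love this great phone"))  # → positive
```

    A real deployment would add a neutral class, a proper tokenizer, and the cleaning steps mentioned in the abstract, but the scoring logic stays the same.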

  13. An evaluation of wind turbine blade cross section analysis techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber and resin composite material and, typically, one or more shear webs. Large turbine blades being developed today are beyond the point where the trial-and-error design of the past is effective, and design for reliability is extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in the outputs of each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade is compared.

  14. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  15. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
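
    A correlation technique applied to density series typically means a Pearson-style coefficient between two measurement sequences: r near 1 indicates the two series vary together consistently. A sketch with hypothetical ink-density readings (not the study's data):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical ink-density readings from two blankets across tone patches
blanket_a = [0.95, 1.10, 1.28, 1.42, 1.60]
blanket_b = [0.93, 1.12, 1.25, 1.45, 1.58]   # tracks blanket_a closely
print(round(pearson_r(blanket_a, blanket_b), 3))
```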

  16. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  17. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    Science.gov (United States)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may worsen over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be budgeted. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.

  18. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
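
    The variance-reduction principle behind stratified source-sampling can be demonstrated on a toy integral: forcing every stratum of the domain to be sampled lowers the estimator's variance relative to plain sampling. This is a one-dimensional illustration of the principle only, not the neutronics implementation:

```python
import random

def plain_mc(f, n):
    """Plain Monte Carlo estimate of the integral of f over [0, 1)."""
    return sum(f(random.random()) for _ in range(n)) / n

def stratified_mc(f, n, strata=10):
    """Stratified estimate: n/strata samples drawn inside each equal-width
    stratum of [0, 1), so no region of the domain can be missed."""
    per = n // strata
    total = 0.0
    for s in range(strata):
        total += sum(f((s + random.random()) / strata) for _ in range(per))
    return total / (per * strata)

random.seed(0)
f = lambda x: x * x                            # ∫₀¹ x² dx = 1/3
est_s = [stratified_mc(f, 1000) for _ in range(200)]
est_p = [plain_mc(f, 1000) for _ in range(200)]
mse_s = sum((e - 1 / 3) ** 2 for e in est_s) / 200
mse_p = sum((e - 1 / 3) ** 2 for e in est_p) / 200
print(mse_s < mse_p)                           # → True
```

    In the eigenvalue setting the "strata" are source regions rather than subintervals, but the mechanism is the same: guaranteed coverage of every stratum suppresses the rare, anomalous realizations.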

  19. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
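
    PCA on spectra, as used here and in the NIR work described at the top of this section, amounts to mean-centering the spectra matrix and projecting onto its leading singular vectors. A NumPy sketch on synthetic two-class "spectra" (the data, noise level, and channel indices are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic two-class "spectra": class B has an extra absorption-like
# feature in channels 20-29 (all numbers invented for illustration).
n_channels = 50
base = np.sin(np.linspace(0, np.pi, n_channels))
class_a = base + rng.normal(0, 0.05, (20, n_channels))
class_b = base + rng.normal(0, 0.05, (20, n_channels))
class_b[:, 20:30] += 0.5

X = np.vstack([class_a, class_b])
Xc = X - X.mean(axis=0)                  # mean-center before PCA
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # sample scores on the first two PCs

# The class offset dominates the variance, so PC1 separates the classes.
print(scores[:20, 0].mean() * scores[20:, 0].mean() < 0)   # → True
```

    Classifiers such as SIMCA build on exactly these PC scores, modeling each class in its own reduced subspace.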

  20. Root-cause analysis and health failure mode and effect analysis: two leading techniques in health care quality assessment.

    Science.gov (United States)

    Shaqdan, Khalid; Aran, Shima; Daftari Besheli, Laleh; Abujudeh, Hani

    2014-06-01

    In this review article, the authors provide a detailed series of guidelines for effectively performing root-cause analysis (RCA) and health failure mode and effect analysis (HFMEA). RCA is a retrospective approach used to ascertain the "root cause" of a problem that has already occurred, whereas HFMEA is a prospective risk assessment tool whose aim is to recognize risks to patient safety. RCA and HFMEA are used to prevent errors or recurring errors, create a safer workplace, maintain high standards in health care quality, and incorporate time-saving and cost-saving modifications that favorably affect the patient care environment. The principles and techniques provided here should allow reviewers to better understand the features of RCA and HFMEA and how to apply these processes appropriately. These principles include how to organize a team, identify root causes, seek out proximate causes, graphically describe the process, conduct a hazard analysis, and develop and implement potential action plans.

  1. A Comprehensive Gene Expression Meta-analysis Identifies Novel Immune Signatures in Rheumatoid Arthritis Patients

    Science.gov (United States)

    Afroz, Sumbul; Giddaluru, Jeevan; Vishwakarma, Sandeep; Naz, Saima; Khan, Aleem Ahmed; Khan, Nooruddin

    2017-01-01

    Rheumatoid arthritis (RA), a symmetric polyarticular arthritis, has long been feared as one of the most disabling forms of arthritis. Identification of gene signatures associated with RA onset and progression would lead toward the development of novel diagnostics and therapeutic interventions. This study was undertaken to identify unique gene signatures of RA patients through large-scale meta-profiling of a diverse collection of gene expression data sets. We carried out a meta-analysis of 8 publicly available RA patients' (107 RA patients and 76 healthy controls) gene expression data sets and further validated a few meta-signatures in RA patients through quantitative real-time PCR (RT-qPCR). We identified a robust meta-profile comprising 33 differentially expressed genes, which were consistently and significantly expressed across all the data sets. Our meta-analysis unearthed upregulation of a few novel gene signatures including PLCG2, HLA-DOB, HLA-F, EIF4E2, and CYFIP2, which were validated in peripheral blood mononuclear cell samples of RA patients. Further, functional and pathway enrichment analysis reveals perturbation of several meta-genes involved in signaling pathways pertaining to inflammation, antigen presentation, hypoxia, and apoptosis during RA. Additionally, PLCG2 (phospholipase Cγ2) emerged as a novel meta-gene involved in most of the pathways relevant to RA, including inflammasome activation and platelet aggregation and activation, thereby suggesting PLCG2 as a potential therapeutic target for controlling excessive inflammation during RA. In conclusion, these findings highlight the utility of a meta-analysis approach in identifying novel gene signatures that might provide mechanistic insights into disease onset and progression and possibly lead toward the development of better diagnostic and therapeutic interventions against RA. PMID:28210261

  3. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    Science.gov (United States)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply and lengthen the time required for the crew to abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan, and of how that demand changes as a result of changes to the system architecture, enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and the probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly complex systems by gradually expanding the system boundary.
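The simplest end of that spectrum of techniques can be sketched in a few lines: treating component failures as a Poisson process, the probability of loss of function is the probability that failures exceed the spares carried. This is a minimal illustration of the general idea, not EMAT or the paper's actual models, and the failure rate and mission length below are hypothetical:

```python
from math import exp, factorial

def p_loss_of_function(failure_rate_per_day, mission_days, spares):
    """Probability that component failures exceed available spares,
    modeling failures as a Poisson process (a common simplification
    behind binomial/Poisson spares-demand methods)."""
    lam = failure_rate_per_day * mission_days          # expected failures
    p_covered = sum(lam**k * exp(-lam) / factorial(k) for k in range(spares + 1))
    return 1.0 - p_covered

# Hypothetical CO2-removal component: one failure per 500 days on average,
# 900-day mission. More spares -> lower probability of loss of function.
risks = [p_loss_of_function(1 / 500, 900, s) for s in range(4)]
assert all(risks[i] > risks[i + 1] for i in range(3))
```

Heuristic methods replace this kind of calculation with fixed spares-per-unit rules, which is precisely where the abstract notes their limitations.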

  4. Differentially Variable Component Analysis (dVCA): Identifying Multiple Evoked Components using Trial-to-Trial Variability

    Science.gov (United States)

    Knuth, Kevin H.; Shah, Ankoor S.; Truccolo, Wilson; Ding, Ming-Zhou; Bressler, Steven L.; Schroeder, Charles E.

    2003-01-01

Electric potentials and magnetic fields generated by ensembles of synchronously active neurons in response to external stimuli provide information essential to understanding the processes underlying cognitive and sensorimotor activity. Interpreting recordings of these potentials and fields is difficult, as each detector records signals simultaneously generated by various regions throughout the brain. We introduce the differentially variable component analysis (dVCA) algorithm, which relies on trial-to-trial variability in response amplitude and latency to identify multiple components. Using simulations, we evaluate the importance of response variability to component identification, the robustness of dVCA to noise, and its ability to characterize single-trial data. Finally, we evaluate the technique using visually evoked field potentials recorded at incremental depths across the layers of cortical area V1 in an awake, behaving macaque monkey.

  5. A new model to identify the productivity of theses in terms of articles using co-word analysis

    Directory of Open Access Journals (Sweden)

    Mery Piedad Zamudio Igami

    2014-01-01

Full Text Available A thesis defense should be considered not as the end but as the starting point of the scientific communication flow. How many articles truly extend doctoral research? This article proposes a new model to automatically identify the productivity of theses in terms of article publications. We evaluate the use of the co-word analysis technique to establish relationships among 401 doctoral theses and 2,211 journal articles published by students in a graduate program at a Brazilian National Nuclear Research Institution (IPEN-CNEN/SP). To identify the relationship between a thesis and an article published by the same author, we used co-descriptor pairs from a controlled vocabulary. To validate the proposed model, a survey was applied to a random sample of thesis authors (n = 128, response rate of 79%), thus establishing a minimum threshold of three coincident co-descriptors to identify the relationship between theses and articles. The agreement level between an author's opinion and the automatic method was 86.9%, with a sampling error of 7.36%, which indicates an acceptable level of accuracy. Differences between the related or nonrelated distributions of articles were also demonstrated, as was a reduction in the median lag time to publication and the supervisor's influence on student productivity.
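The co-descriptor matching rule at the heart of the model is straightforward to sketch. The descriptor sets below are hypothetical; only the validated threshold of three coincident co-descriptors comes from the abstract:

```python
def related(thesis_descriptors, article_descriptors, threshold=3):
    """Flag a thesis-article pair as related when they share at least
    `threshold` controlled-vocabulary descriptors (the survey-validated
    minimum in the study was three coincident co-descriptors)."""
    return len(set(thesis_descriptors) & set(article_descriptors)) >= threshold

# Hypothetical descriptor sets for one thesis and two candidate articles:
thesis = {"nuclear reactors", "neutron flux", "monte carlo", "dosimetry"}
paper_a = {"neutron flux", "monte carlo", "dosimetry", "shielding"}
paper_b = {"polymers", "dosimetry"}
assert related(thesis, paper_a) and not related(thesis, paper_b)
```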

  6. Shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity.

    Directory of Open Access Journals (Sweden)

    J R Managbanag

Full Text Available BACKGROUND: Identification of genes that modulate longevity is a major focus of aging-related research and an area of intense public interest. In addition to facilitating an improved understanding of the basic mechanisms of aging, such genes represent potential targets for therapeutic intervention in multiple age-associated diseases, including cancer, heart disease, diabetes, and neurodegenerative disorders. To date, however, targeted efforts at identifying longevity-associated genes have been limited by a lack of predictive power, and useful algorithms for candidate-gene identification have also been lacking. METHODOLOGY/PRINCIPAL FINDINGS: We have utilized a shortest-path network analysis to identify novel genes that modulate longevity in Saccharomyces cerevisiae. Based on a set of previously reported genes associated with increased life span, we applied a shortest-path network algorithm to a pre-existing protein-protein interaction dataset in order to construct a shortest-path longevity network. To validate this network, the replicative aging potential of 88 single-gene deletion strains corresponding to predicted components of the shortest-path longevity network was determined. Here we report that the single-gene deletion strains identified by our shortest-path longevity analysis are significantly enriched for mutations conferring either increased or decreased replicative life span, relative to a randomly selected set of 564 single-gene deletion strains or to the current data set available for the entire haploid deletion collection. Further, we report the identification of previously unknown longevity genes, several of which function in a conserved longevity pathway believed to mediate life span extension in response to dietary restriction. CONCLUSIONS/SIGNIFICANCE: This work demonstrates that shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity and represents the first application of
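The shortest-path step itself is standard unweighted graph traversal. A minimal breadth-first-search sketch on a toy interaction network — the edges below are illustrative, not the actual yeast protein-protein interaction dataset used in the study:

```python
from collections import deque

def shortest_path(graph, src, dst):
    """Breadth-first search returning one shortest path between two
    proteins in an unweighted interaction network, or None if dst is
    unreachable from src."""
    prev, frontier = {src: None}, deque([src])
    while frontier:
        node = frontier.popleft()
        if node == dst:                      # reconstruct path backwards
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nb in graph.get(node, ()):
            if nb not in prev:
                prev[nb] = node
                frontier.append(nb)
    return None

# Toy network; genes named after real yeast longevity factors for flavor,
# but the edges are made up:
ppi = {"SIR2": ["FOB1"], "FOB1": ["SIR2", "NET1"],
       "NET1": ["FOB1", "CDC14"], "CDC14": ["NET1"]}
assert shortest_path(ppi, "SIR2", "CDC14") == ["SIR2", "FOB1", "NET1", "CDC14"]
```

Proteins lying on many such paths between known longevity genes become the candidates whose deletion strains are then tested experimentally.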

  7. Integrated Analysis for Identifying Radix Astragali and Its Adulterants Based on DNA Barcoding

    Directory of Open Access Journals (Sweden)

    Sihao Zheng

    2014-01-01

Full Text Available Radix Astragali is a popular herb used in traditional Chinese medicine for its proimmune and antidiabetic properties. However, methods are needed to help distinguish Radix Astragali from its varied adulterants. DNA barcoding is a widely applicable molecular method used to identify medicinal plants. Yet, its use has been hampered by genetic distance, base variation, and limitations of the bio-NJ tree. Herein, we report the validation of an integrated analysis method for plant species identification using DNA barcoding that focuses on genetic distance, identification efficiency, inter- and intraspecific variation, and the barcoding gap. We collected 478 sequences from six candidate DNA barcodes (ITS2, ITS, psbA-trnH, rbcL, matK, and COI) from 29 species of Radix Astragali and adulterants. The internal transcribed spacer (ITS) sequence was demonstrated to be the optimal barcode for identifying Radix Astragali and its adulterants. This new analysis method is helpful in identifying Radix Astragali and expedites the utilization and data mining of DNA barcoding.
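The genetic-distance criterion can be illustrated with an uncorrected p-distance, a common baseline in barcoding work; the sequences below are made up, and real analyses use proper alignments and corrected models (e.g. Kimura 2-parameter) rather than this simplification:

```python
def p_distance(seq_a, seq_b):
    """Uncorrected pairwise genetic distance: the fraction of aligned
    sites that differ. Gap positions are skipped for simplicity."""
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    return sum(a != b for a, b in pairs) / len(pairs)

# Hypothetical ITS fragments: for a usable barcode, interspecific
# variation should exceed intraspecific variation (the "barcoding gap").
within = p_distance("ACGTACGTAC", "ACGTACGTAT")   # 1 difference / 10 sites
between = p_distance("ACGTACGTAC", "ACGAACGCAT")  # 3 differences / 10 sites
assert within < between
```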

  8. Design Analysis Rules to Identify Proper Noun from Bengali Sentence for Universal Networking language

    Directory of Open Access Journals (Sweden)

    Md. Syeful Islam

    2014-08-01

Full Text Available Nowadays, hundreds of millions of people of almost all levels of education and backgrounds from different countries communicate with each other for different purposes and perform their jobs over the Internet or other communication media using various languages. Since no one knows all languages, it is very difficult to communicate or work across language boundaries. In this situation, computer scientists have introduced various interlanguage translation programs (machine translation). UNL is one such interlanguage translation program. One of the major problems in UNL is identifying a name in a sentence, which is relatively simple in English because such entities start with a capital letter. Bangla has no concept of small or capital letters, so it is difficult to decide whether a word is a proper noun or not. Here we propose analysis rules to identify proper nouns in a sentence and build a post-converter that translates name entities from Bangla to UNL. The goal is to make possible Bangla sentence conversion to UNL and vice versa. Theoretical analysis shows that the proposed system is able to identify proper nouns in Bangla sentences and produce the corresponding Universal Words for UNL.

  9. Comparative Analysis of Vehicle Make and Model Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Faiza Ayub Syed

    2014-03-01

Full Text Available Vehicle Make and Model Recognition (VMMR) has emerged as a significant element of vision-based systems because of its applications in access control systems, traffic control and monitoring systems, security systems, surveillance systems, etc. So far a number of techniques have been developed for vehicle recognition, each following a different methodology and classification approach. The evaluation results highlight the recognition technique with the highest accuracy level. In this paper we describe the workings of various vehicle make and model recognition techniques and compare these techniques on the basis of methodology, principles, classification approach, classifier, and level of recognition. After comparing these factors we conclude that Locally Normalized Harris Corner Strengths (LHNS) performs best compared to the other techniques. LHNS uses Bayes and K-NN classification approaches for vehicle classification. It extracts information from the frontal view of vehicles for vehicle make and model recognition.

  10. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600 to 750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton population was developed. For laboratory mixed samples prepared from the 43 algal species (the algae of one division accounted for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the level of division were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for samples mixed from 32 red tide algal species (the dominant species accounted for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the level of genus were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the levels of division and genus, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algae fluorescence auto-analyzer for
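The wavelet decomposition underlying such a spectra database can be illustrated with a single Haar step, which splits a spectrum into smooth (approximation) and fluctuation (detail) coefficients. This is a generic sketch, not the authors' transform or coefficient selection, and the intensities are toy values:

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar wavelet transform: pairwise
    averages (approximation) and pairwise half-differences (detail) --
    the kind of compact features a wavelet-based spectral discrimination
    database can be built from. Assumes an even-length signal."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

spectrum = [4.0, 2.0, 5.0, 7.0]        # toy emission intensities
a, d = haar_step(spectrum)
assert a == [3.0, 6.0] and d == [1.0, -1.0]
```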

  11. A meta-analysis of 120 246 individuals identifies 18 new loci for fibrinogen concentration.

    Science.gov (United States)

    de Vries, Paul S; Chasman, Daniel I; Sabater-Lleal, Maria; Chen, Ming-Huei; Huffman, Jennifer E; Steri, Maristella; Tang, Weihong; Teumer, Alexander; Marioni, Riccardo E; Grossmann, Vera; Hottenga, Jouke J; Trompet, Stella; Müller-Nurasyid, Martina; Zhao, Jing Hua; Brody, Jennifer A; Kleber, Marcus E; Guo, Xiuqing; Wang, Jie Jin; Auer, Paul L; Attia, John R; Yanek, Lisa R; Ahluwalia, Tarunveer S; Lahti, Jari; Venturini, Cristina; Tanaka, Toshiko; Bielak, Lawrence F; Joshi, Peter K; Rocanin-Arjo, Ares; Kolcic, Ivana; Navarro, Pau; Rose, Lynda M; Oldmeadow, Christopher; Riess, Helene; Mazur, Johanna; Basu, Saonli; Goel, Anuj; Yang, Qiong; Ghanbari, Mohsen; Willemsen, Gonneke; Rumley, Ann; Fiorillo, Edoardo; de Craen, Anton J M; Grotevendt, Anne; Scott, Robert; Taylor, Kent D; Delgado, Graciela E; Yao, Jie; Kifley, Annette; Kooperberg, Charles; Qayyum, Rehan; Lopez, Lorna M; Berentzen, Tina L; Räikkönen, Katri; Mangino, Massimo; Bandinelli, Stefania; Peyser, Patricia A; Wild, Sarah; Trégouët, David-Alexandre; Wright, Alan F; Marten, Jonathan; Zemunik, Tatijana; Morrison, Alanna C; Sennblad, Bengt; Tofler, Geoffrey; de Maat, Moniek P M; de Geus, Eco J C; Lowe, Gordon D; Zoledziewska, Magdalena; Sattar, Naveed; Binder, Harald; Völker, Uwe; Waldenberger, Melanie; Khaw, Kay-Tee; Mcknight, Barbara; Huang, Jie; Jenny, Nancy S; Holliday, Elizabeth G; Qi, Lihong; Mcevoy, Mark G; Becker, Diane M; Starr, John M; Sarin, Antti-Pekka; Hysi, Pirro G; Hernandez, Dena G; Jhun, Min A; Campbell, Harry; Hamsten, Anders; Rivadeneira, Fernando; Mcardle, Wendy L; Slagboom, P Eline; Zeller, Tanja; Koenig, Wolfgang; Psaty, Bruce M; Haritunians, Talin; Liu, Jingmin; Palotie, Aarno; Uitterlinden, André G; Stott, David J; Hofman, Albert; Franco, Oscar H; Polasek, Ozren; Rudan, Igor; Morange, Pierre-Emmanuel; Wilson, James F; Kardia, Sharon L R; Ferrucci, Luigi; Spector, Tim D; Eriksson, Johan G; Hansen, Torben; Deary, Ian J; Becker, Lewis C; Scott, Rodney J; Mitchell, Paul; März, Winfried; 
Wareham, Nick J; Peters, Annette; Greinacher, Andreas; Wild, Philipp S; Jukema, J Wouter; Boomsma, Dorret I; Hayward, Caroline; Cucca, Francesco; Tracy, Russell; Watkins, Hugh; Reiner, Alex P; Folsom, Aaron R; Ridker, Paul M; O'Donnell, Christopher J; Smith, Nicholas L; Strachan, David P; Dehghan, Abbas

    2016-01-15

Genome-wide association studies have previously identified 23 genetic loci associated with circulating fibrinogen concentration. These studies used HapMap imputation and did not examine the X-chromosome. 1000 Genomes imputation provides better coverage of uncommon variants and includes indels. We conducted a genome-wide association analysis of 34 studies imputed to the 1000 Genomes Project reference panel and including ∼120 000 participants of European ancestry (95 806 participants with data on the X-chromosome). Approximately 10.7 million single-nucleotide polymorphisms and 1.2 million indels were examined. We identified 41 genome-wide significant fibrinogen loci, of which 18 were newly identified. There were no genome-wide significant signals on the X-chromosome. The lead variants of five significant loci were indels. We further identified six additional independent signals, including three rare variants, at two previously characterized loci: FGB and IRF1. Together the 41 loci explain 3% of the variance in plasma fibrinogen concentration.

  12. The systematic functional analysis of plasmodium protein kinases identifies essential regulators of mosquito transmission

    KAUST Repository

    Tewari, Rita

    2010-10-21

Although eukaryotic protein kinases (ePKs) contribute to many cellular processes, only three Plasmodium falciparum ePKs have thus far been identified as essential for parasite asexual blood stage development. To identify pathways essential for parasite transmission between their mammalian host and mosquito vector, we undertook a systematic functional analysis of ePKs in the genetically tractable rodent parasite Plasmodium berghei. Modeling domain signatures of conventional ePKs identified 66 putative Plasmodium ePKs. Kinomes are highly conserved between Plasmodium species. Using reverse genetics, we show that 23 ePKs are redundant for asexual erythrocytic parasite development in mice. Phenotyping mutants at four life cycle stages in Anopheles stephensi mosquitoes revealed functional clusters of kinases required for sexual development and sporogony. Roles for a putative SR protein kinase (SRPK) in microgamete formation, a conserved regulator of clathrin uncoating (GAK) in ookinete formation, and a likely regulator of energy metabolism (SNF1/KIN) in sporozoite development were identified.

  13. A study on using texture analysis methods for identifying lobar fissure regions in isotropic CT images.

    Science.gov (United States)

    Wei, Q; Hu, Y

    2009-01-01

The major hurdle for segmenting lung lobes in computed tomographic (CT) images is to identify fissure regions, which encase lobar fissures. Accurate identification of these regions is difficult due to the variable shape and appearance of the fissures, along with the low contrast and high noise associated with CT images. This paper studies the effectiveness of two texture analysis methods - the gray level co-occurrence matrix (GLCM) and the gray level run length matrix (GLRLM) - in identifying fissure regions from isotropic CT image stacks. To classify GLCM and GLRLM texture features, we applied a feed-forward back-propagation neural network and achieved the best classification accuracy using 16 quantized levels for computing the GLCM and GLRLM texture features and 64 neurons in the input/hidden layers of the neural network. Tested on isotropic CT image stacks of 24 patients with pathologic lungs, we obtained accuracies of 86% and 87% for identifying fissure regions using the GLCM and GLRLM methods, respectively. These accuracies compare favorably with surgeons'/radiologists' accuracy of 80% for identifying fissure regions in clinical settings. This shows promising potential for segmenting lung lobes using the GLCM and GLRLM methods.
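A GLCM and one Haralick-style feature can be sketched compactly. This is a generic illustration: the offset, quantization (4 levels rather than the study's 16), and single feature are simplified relative to the paper's setup:

```python
def glcm(image, levels, dx=1, dy=0):
    """Gray level co-occurrence matrix for one pixel offset, normalized
    to joint probabilities; `image` is a 2-D list of quantized levels."""
    counts = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows - dy):
        for c in range(cols - dx):
            counts[image[r][c]][image[r + dy][c + dx]] += 1
    total = sum(map(sum, counts))
    return [[v / total for v in row] for row in counts]

def contrast(p):
    """Haralick contrast feature: large where neighboring pixels differ
    sharply in gray level, e.g. across a fissure in a CT slice."""
    return sum(p[i][j] * (i - j) ** 2
               for i in range(len(p)) for j in range(len(p)))

flat = [[1, 1, 1, 1]] * 4      # uniform toy patch -> zero contrast
edged = [[0, 3, 0, 3]] * 4     # alternating levels -> high contrast
assert contrast(glcm(flat, 4)) < contrast(glcm(edged, 4))
```

Feature vectors of this kind (contrast, homogeneity, etc.) are what the paper feeds into the feed-forward neural network classifier.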

  14. Unscented Kalman filter with parameter identifiability analysis for the estimation of multiple parameters in kinetic models

    Directory of Open Access Journals (Sweden)

    Baker Syed

    2011-01-01

    Full Text Available Abstract In systems biology, experimentally measured parameters are not always available, necessitating the use of computationally based parameter estimation. In order to rely on estimated parameters, it is critical to first determine which parameters can be estimated for a given model and measurement set. This is done with parameter identifiability analysis. A kinetic model of the sucrose accumulation in the sugar cane culm tissue developed by Rohwer et al. was taken as a test case model. What differentiates this approach is the integration of an orthogonal-based local identifiability method into the unscented Kalman filter (UKF, rather than using the more common observability-based method which has inherent limitations. It also introduces a variable step size based on the system uncertainty of the UKF during the sensitivity calculation. This method identified 10 out of 12 parameters as identifiable. These ten parameters were estimated using the UKF, which was run 97 times. Throughout the repetitions the UKF proved to be more consistent than the estimation algorithms used for comparison.

  15. Unscented Kalman filter with parameter identifiability analysis for the estimation of multiple parameters in kinetic models.

    Science.gov (United States)

    Baker, Syed Murtuza; Poskar, C Hart; Junker, Björn H

    2011-10-11

    In systems biology, experimentally measured parameters are not always available, necessitating the use of computationally based parameter estimation. In order to rely on estimated parameters, it is critical to first determine which parameters can be estimated for a given model and measurement set. This is done with parameter identifiability analysis. A kinetic model of the sucrose accumulation in the sugar cane culm tissue developed by Rohwer et al. was taken as a test case model. What differentiates this approach is the integration of an orthogonal-based local identifiability method into the unscented Kalman filter (UKF), rather than using the more common observability-based method which has inherent limitations. It also introduces a variable step size based on the system uncertainty of the UKF during the sensitivity calculation. This method identified 10 out of 12 parameters as identifiable. These ten parameters were estimated using the UKF, which was run 97 times. Throughout the repetitions the UKF proved to be more consistent than the estimation algorithms used for comparison.
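The orthogonal-based identifiability idea can be sketched independently of the UKF machinery: rank parameters by how much of their output sensitivity is not already explained by previously selected parameters. The sensitivity columns below are a toy example, not the Rohwer sucrose model:

```python
def rank_identifiable(sensitivity_columns, tol=1e-6):
    """Orthogonal-projection ranking of parameter identifiability, in
    the spirit of the orthogonal method the authors integrate into the
    UKF: repeatedly pick the sensitivity column with the largest norm
    after removing its projection onto already-selected columns."""
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    residuals = {k: list(col) for k, col in sensitivity_columns.items()}
    ranked = []
    while residuals:
        name, col = max(residuals.items(), key=lambda kv: dot(kv[1], kv[1]))
        if dot(col, col) < tol:        # remaining columns ~ linearly dependent
            break
        ranked.append(name)
        del residuals[name]
        for k, v in residuals.items():  # deflate remaining columns
            s = dot(v, col) / dot(col, col)
            residuals[k] = [a - s * b for a, b in zip(v, col)]
    return ranked

# Toy sensitivities: k3 is a linear combination of k1 and k2, so only
# two of the three parameters are locally identifiable from these data.
S = {"k1": [1.0, 0.0, 0.0], "k2": [0.0, 1.0, 0.0], "k3": [1.0, 1.0, 0.0]}
assert len(rank_identifiable(S)) == 2
```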

  16. A computerised morphometric technique for the analysis of intimal hyperplasia.

    OpenAIRE

    Tennant, M; McGeachie, J K

    1991-01-01

The aim of this study was to design, develop and employ a method for the acquisition of a significant database of thickness measurements. The integration of standard histological techniques (step serial sectioning), modern computer technology and a personally developed software package (specifically designed for thickness measurement) produced a novel technique suitable for the task. The technique allowed the elucidation of a larger data set from tissue samples. Thus a detailed and accurate ...

  17. Use of neural network techniques to identify cosmic ray electrons and positrons during the 1993 balloon flight of the NMSU/Wizard-TS93 instrument

    Energy Technology Data Exchange (ETDEWEB)

Bellotti, R.; Castellano, M. [Bari Univ. (Italy); INFN, Bari (Italy)]; Candusso, M.; Casolino, M.; Morselli, A.; Picozza, P. [Rome Univ. 'Tor Vergata' (Italy); INFN, Rome (Italy)]; Aversa, F.; Boezio, M. [Trieste Univ. (Italy); INFN, Trieste (Italy)]; Barbiellini, G. [Trieste Univ. (Italy); INFN, Trieste (Italy)]; Basini, G. [INFN, Laboratori Nazionali di Frascati, Rome (Italy)]

    1995-09-01

The detectors used in the TS93 balloon flight produced a large volume of information for each cosmic ray trigger. Some of the data were visual in nature; other portions contained energy deposition and timing information. The data sets are amenable to conventional analysis techniques, but there is no assurance that conventional techniques make full use of subtle correlations and relations amongst the detector responses. With the advent of neural network technologies, which are particularly adept at classifying complex phenomena, it seems appropriate to explore the utility of neural network techniques for classifying particles observed with the instruments. In this paper a neural-network-based methodology for signal/background discrimination in a cosmic ray space experiment is discussed. Results are presented for electron and positron classification in the TS93 flight data set and compared with conventional analyses.

  18. Connectomic analysis of brain networks: novel techniques and future directions

    Directory of Open Access Journals (Sweden)

    Leonie Cazemier

    2016-11-01

Full Text Available Brain networks, localized or brain-wide, exist only at the cellular level, i.e. between specific pre- and postsynaptic neurons, which are connected through functionally diverse synapses located at specific points of their cell membranes. Connectomics is the emerging subfield of neuroanatomy explicitly aimed at elucidating the wiring of brain networks with cellular resolution and quantified accuracy. Such data are indispensable for realistic modeling of brain circuitry and function. A connectomic analysis therefore needs to identify and measure the soma, dendrites, axonal path and branching patterns, together with the synapses and gap junctions, of the neurons involved in any given brain circuit or network. However, because of the submicron caliber, 3D complexity and high packing density of most such structures, as well as the fact that axons frequently extend over long distances to make synapses in remote brain regions, creating connectomic maps is technically challenging and requires multi-scale approaches. Such approaches involve the combination of the most sensitive cell labeling and analysis methods available, as well as the development of new ones able to resolve individual cells and synapses with increasingly high throughput. In this review, we provide an overview of recently introduced high-resolution methods which researchers wanting to enter the field of connectomics may consider. It includes several molecular labeling tools, some of which specifically label synapses, and covers a number of novel imaging tools such as brain clearing protocols and microscopy approaches. Apart from describing the tools, we also provide an assessment of their qualities. The criteria we use assess the qualities that tools need in order to contribute to deciphering the key levels of circuit organization.
We conclude with a brief future outlook for neuroanatomic research, computational methods and network modeling, where we also point out several outstanding

  19. Biomechanical analysis of cross-country skiing techniques.

    Science.gov (United States)

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  20. COMPARATIVE ANALYSIS OF SATELLITE IMAGE PRE-PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Sree Sharmila

    2013-01-01

Full Text Available Satellite images are corrupted by noise during acquisition and transmission. Removing the noise by attenuating the high-frequency image components removes some important details as well. In order to retain the useful information and improve the visual appearance, effective denoising and resolution enhancement techniques are required. In this research, a Hybrid Directional Lifting (HDL) technique is proposed to retain the important details of the image and improve its visual appearance. A Discrete Wavelet Transform (DWT) based interpolation technique is developed for enhancing the resolution of the denoised image. The performance of the proposed techniques is tested on Land Remote-Sensing Satellite (LANDSAT) images using the quantitative performance measure Peak Signal to Noise Ratio (PSNR) and computation time. The PSNR of the HDL technique increases by 1.02 dB compared to the standard denoising technique, and the DWT-based interpolation technique increases it by 3.94 dB. The experimental results reveal that the newly developed image denoising and resolution enhancement techniques improve the image visual quality with rich textures.
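PSNR, the figure of merit quoted above, has a simple closed form: 10·log10(peak²/MSE). A minimal sketch with made-up pixel values:

```python
from math import log10

def psnr(reference, processed, peak=255.0):
    """Peak Signal to Noise Ratio in dB between two equally sized
    gray-scale images given as flat lists of pixel values."""
    mse = sum((r - p) ** 2 for r, p in zip(reference, processed)) / len(reference)
    if mse == 0:
        return float("inf")    # identical images
    return 10 * log10(peak ** 2 / mse)

# A denoised estimate closer to the reference scores a higher PSNR;
# this is the scale on which the 1.02 dB and 3.94 dB gains are measured.
ref = [10, 200, 30, 90]
assert psnr(ref, [11, 201, 29, 91]) > psnr(ref, [20, 190, 40, 80])
```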

  1. Practical In-Depth Analysis of IDS Alerts for Tracing and Identifying Potential Attackers on Darknet

    Directory of Open Access Journals (Sweden)

    Jungsuk Song

    2017-02-01

Full Text Available The darknet (i.e., a set of unused IP addresses) is a very useful solution for observing the global trends of cyber threats and analyzing attack activities on the Internet. Since the darknet is not connected with real systems, in most cases the incoming packets on the darknet (‘the darknet traffic’) do not contain a payload. This means that we are unable to get real malware from the darknet traffic. This situation makes it difficult for security experts (e.g., academic researchers, engineers, operators, etc.) to identify whether the source hosts of the darknet traffic are infected by real malware or not. In this paper, we present the overall procedure of the in-depth analysis between the darknet traffic and IDS alerts using real data collected at the Science and Technology Cyber Security Center (S&T CSC) in Korea and provide the detailed in-depth analysis results. The ultimate goal of this paper is to provide practical experience, insight and know-how to security experts so that they are able to identify and trace the root cause of the darknet traffic. The experimental results show that correlation analysis between the darknet traffic and IDS alerts is very useful to discover potential attack hosts, especially internal hosts, and to find out what kinds of malware infected them.
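The core correlation step — cross-referencing source hosts seen on the darknet against IDS alert sources — can be sketched simply. Field names and addresses below are hypothetical, not the S&T CSC data format:

```python
def correlate(darknet_sources, ids_alerts):
    """Cross-reference source hosts observed sending darknet traffic
    with IDS alert sources: a darknet sender that also triggered IDS
    signatures is a candidate infected (potential attacker) host."""
    alerted = {alert["src"] for alert in ids_alerts}
    return sorted(ip for ip in darknet_sources if ip in alerted)

# Hypothetical data: two internal hosts probing the darknet, one of
# which also appears in malware-related IDS alerts.
darknet = {"10.0.0.5", "10.0.0.9"}
alerts = [{"src": "10.0.0.5", "sig": "worm-scan"}]
assert correlate(darknet, alerts) == ["10.0.0.5"]
```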

  2. Comparative proteomic analysis of horseweed (Conyza canadensis) biotypes identifies candidate proteins for glyphosate resistance

    Science.gov (United States)

    González-Torralva, Fidel; Brown, Adrian P.; Chivasa, Stephen

    2017-01-01

    Emergence of glyphosate-resistant horseweed (Conyza canadensis) biotypes is an example of how unrelenting use of a single mode of action herbicide in agricultural weed control drives genetic adaptation in targeted species. While in other weeds glyphosate resistance arose from target site mutation or target gene amplification, the resistance mechanism in horseweed uses neither of these, being instead linked to reduced herbicide uptake and/or translocation. The molecular components underpinning horseweed glyphosate-resistance remain unknown. Here, we used an in vitro leaf disc system for comparative analysis of proteins extracted from control and glyphosate-treated tissues of glyphosate-resistant and glyphosate-susceptible biotypes. Analysis of shikimic acid accumulation, ABC-transporter gene expression, and cell death were used to select a suitable glyphosate concentration and sampling time for enriching proteins pivotal to glyphosate resistance. Protein gel analysis and mass spectrometry identified mainly chloroplast proteins differentially expressed between the biotypes before and after glyphosate treatment. Chloroplasts are the organelles in which the shikimate pathway, which is targeted by glyphosate, is located. Calvin cycle enzymes and proteins of unknown function were among the proteins identified. Our study provides candidate proteins that could be pivotal in engendering resistance and implicates chloroplasts as the primary sites driving glyphosate-resistance in horseweed. PMID:28198407

  3. Meta-CART: A tool to identify interactions between moderators in meta-analysis.

    Science.gov (United States)

    Li, Xinru; Dusseldorp, Elise; Meulman, Jacqueline J

    2017-02-01

    In the framework of meta-analysis, moderator analysis is usually performed only univariately. When several study characteristics are available that may account for treatment effect, standard meta-regression has difficulties in identifying interactions between them. To overcome this problem, meta-CART has been proposed: an approach that applies classification and regression trees (CART) to identify interactions, and then subgroup meta-analysis to test the significance of moderator effects. The previous version of meta-CART has its shortcomings: when applying CART, the sample sizes of studies are not taken into account, and the effect sizes are dichotomized around the median value. Therefore, this article proposes new meta-CART extensions, weighting study effect sizes by their accuracy, and using a regression tree to avoid dichotomization. In addition, new pruning rules are proposed. The performance of all versions of meta-CART was evaluated via a Monte Carlo simulation study. The simulation results revealed that meta-regression trees with random-effects weights and a 0.5-standard-error pruning rule perform best. The required sample size for meta-CART to achieve satisfactory performance depends on the number of study characteristics, the magnitude of the interactions, and the residual heterogeneity.
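    The precision-weighted regression-tree step described above can be sketched with scikit-learn; the simulated studies, weights, and moderator names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 200
# Two binary study characteristics; their interaction moderates the effect size.
x1 = rng.integers(0, 2, n)
x2 = rng.integers(0, 2, n)
se = rng.uniform(0.05, 0.3, n)           # standard error of each study's effect size
true_effect = 0.2 + 0.4 * (x1 & x2)      # larger effect only when both conditions hold
d = rng.normal(true_effect, se)          # observed effect sizes

# Weight each study's effect size by its precision (inverse sampling variance),
# so the tree accounts for study accuracy instead of treating all studies equally.
weights = 1.0 / se**2

tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=20)
tree.fit(np.column_stack([x1, x2]), d, sample_weight=weights)

# The subgroup with x1 = x2 = 1 should receive a clearly higher predicted effect,
# recovering the interaction that univariate moderator analysis would miss.
print(tree.predict([[1, 1]])[0], tree.predict([[0, 0]])[0])
```

    Using a regression tree on the continuous effect sizes avoids the dichotomization around the median that the earlier meta-CART version relied on.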

  4. Identifying patients with therapy-resistant depression by using factor analysis

    DEFF Research Database (Denmark)

    Andreasson, K; Liest, V; Lunde, M;

    2010-01-01

    INTRODUCTION: Attempts to identify the factor structure in patients with treatment-resistant depression have been very limited. METHODS: Principal component analysis was performed using the baseline datasets from 3 add-on studies [2 with repetitive transcranial magnetic stimulation and one...... with transcranial pulsed electromagnetic fields (T-PEMF)], in which the relative effect as percentage of improvement during the treatment period was analysed. RESULTS: We identified 2 major factors, the first of which was a general factor. The second was a dual factor consisting of a depression subscale comprising...... the negatively loaded items (covering the pure depression items) and a treatment resistant subscale comprising the positively loaded items (covering lassitude, concentration difficulties and sleep problems). These 2 dual subscales were used as outcome measures. Improvement on the treatment resistant subscale...
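    A general-versus-bipolar factor structure of the kind reported here can be sketched with principal component analysis; the simulated item scores are illustrative assumptions, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 120
# Hypothetical baseline ratings: a general severity factor plus a bipolar factor
# that loads negatively on "pure depression" items and positively on
# lassitude/concentration/sleep items (mirroring the record's dual factor).
general = rng.normal(size=n)
bipolar = rng.normal(size=n)
items = np.column_stack([
    general - 0.8 * bipolar + 0.3 * rng.normal(size=n),  # depression item
    general - 0.7 * bipolar + 0.3 * rng.normal(size=n),  # depression item
    general + 0.8 * bipolar + 0.3 * rng.normal(size=n),  # lassitude item
    general + 0.7 * bipolar + 0.3 * rng.normal(size=n),  # sleep-problems item
])

pca = PCA(n_components=2)
pca.fit(items)
loadings = pca.components_
# First component: all items load in the same direction (a general factor).
# Second component: the signs split the two item groups (a dual/bipolar factor).
print(np.sign(loadings))
```

    Reading off the sign pattern of the second component is how the negatively and positively loaded subscales are separated into outcome measures.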

  5. Systematic analysis of public domain compound potency data identifies selective molecular scaffolds across druggable target families.

    Science.gov (United States)

    Hu, Ye; Wassermann, Anne Mai; Lounkine, Eugen; Bajorath, Jürgen

    2010-01-28

    Molecular scaffolds that yield target family-selective compounds are of high interest in pharmaceutical research. There continues to be considerable debate in the field as to whether chemotypes with a priori selectivity for given target families and/or targets exist and how they might be identified. What do currently available data tell us? We present a systematic and comprehensive selectivity-centric analysis of public domain target-ligand interactions. More than 200 molecular scaffolds are identified in currently available active compounds that are selective for established target families. A subset of these scaffolds is found to produce compounds with high selectivity for individual targets among closely related ones. These scaffolds are currently underrepresented in approved drugs.

  6. Identifying E-Business Model:A Value Chain-Based Analysis

    Institute of Scientific and Technical Information of China (English)

    ZENG Qingfeng; HUANG Lihua

    2004-01-01

    E-business will change the ways that all companies do business, and most traditional businesses will evolve from their current business model to a combination of place and space via an e-business model. Choosing the proper e-business model therefore becomes an important strategic concern for a company's success. The main objective of this paper is to investigate an analysis framework for identifying e-business models based on the e-business process, moving from the value chain to the value net perspective. This paper provides a theoretical framework for identifying e-business models and results in 11 e-business models. The strategic intent of every e-business model is discussed at the end of this paper. An enterprise e-business model design and implementation can be specified by the combination of one or more of the 11 e-business models.

  7. Proteomic analysis of cell lines to identify the irinotecan resistance proteins

    Indian Academy of Sciences (India)

    Xing-Chen Peng; Feng-Ming Gong; Meng Wei; X I Chen; Y E Chen; K E Cheng; Feng Gao; Feng Xu; FENG Bi; Ji-Yan Liu

    2010-12-01

    Chemotherapeutic drug resistance is a frequent cause of treatment failure in colon cancer patients. Several mechanisms have been implicated in drug resistance. However, they are not sufficient to exhaustively account for this resistance emergence. In this study, two-dimensional gel electrophoresis (2-DE) and the PDQuest software analysis were applied to compare the differential expression of irinotecan-resistance-associated protein in human colon adenocarcinoma LoVo cells and irinotecan-resistant LoVo cells (LoVo/irinotecan). The differential protein dots were excised and analysed by ESI-Q-TOF mass spectrometry (MS). Fifteen proteins were identified, including eight proteins with decreased expression and seven proteins with increased expression. The identified known proteins included those that function in diverse biological processes such as cellular transcription, cell apoptosis, electron transport/redox regulation, cell proliferation/differentiation and retinol metabolism pathways. Identification of such proteins could allow improved understanding of the mechanisms leading to the acquisition of chemoresistance.

  8. Qualitative Comparative Analysis: A Hybrid Method for Identifying Factors Associated with Program Effectiveness.

    Science.gov (United States)

    Cragun, Deborah; Pal, Tuya; Vadaparampil, Susan T; Baldwin, Julie; Hampel, Heather; DeBate, Rita D

    2016-07-01

    Qualitative comparative analysis (QCA) was developed over 25 years ago to bridge the qualitative and quantitative research gap. Upon searching PubMed and the Journal of Mixed Methods Research, this review identified 30 original research studies that utilized QCA. Perceptions that QCA is complex and provides few relative advantages over other methods may be limiting QCA adoption. Thus, to overcome these perceptions, this article demonstrates how to perform QCA using data from fifteen institutions that implemented universal tumor screening (UTS) programs to identify patients at high risk for hereditary colorectal cancer. In this example, QCA revealed a combination of conditions unique to effective UTS programs. Results informed additional research and provided a model for improving patient follow-through after a positive screen.
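    The truth-table step at the core of crisp-set QCA can be sketched as follows; the two conditions and the case data are hypothetical, not the UTS study's dataset:

```python
from collections import defaultdict

# Hypothetical crisp-set data: each institution has two binary conditions
# (say, automated referral and a dedicated coordinator) and an outcome
# (an effective universal tumor screening program).
cases = [
    ((1, 1), 1), ((1, 1), 1), ((1, 1), 1),
    ((1, 0), 0), ((1, 0), 1),
    ((0, 1), 0), ((0, 1), 0),
    ((0, 0), 0),
]

# Truth-table construction: for every configuration of conditions,
# consistency = share of cases with that configuration showing the outcome.
table = defaultdict(lambda: [0, 0])   # config -> [outcome count, total cases]
for config, outcome in cases:
    table[config][0] += outcome
    table[config][1] += 1

for config, (k, n) in sorted(table.items(), reverse=True):
    print(config, f"consistency = {k}/{n} = {k / n:.2f}")
# A configuration with consistency 1.0 marks a combination of conditions
# unique to effective programs, which is what QCA reports.
```

    Full QCA software additionally minimizes the consistent configurations into a reduced Boolean expression, but the consistency table above is the step that bridges the qualitative case data and the quantitative result.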

  9. Gastric Cancer Associated Genes Identified by an Integrative Analysis of Gene Expression Data

    Science.gov (United States)

    Jiang, Bing; Li, Shuwen; Jiang, Zhi

    2017-01-01

    Gastric cancer is one of the most severe complex diseases, with high morbidity and mortality in the world. The molecular mechanisms and risk factors for this disease are still not clear, owing to the cancer heterogeneity caused by different genetic and environmental factors. With more and more expression data accumulated nowadays, we can perform integrative analysis of these data to understand the complexity of gastric cancer and to identify consensus players for the heterogeneous cancer. In the present work, we screened the published gene expression data and analyzed them with an integrative tool, combined with pathway and gene ontology enrichment investigation. We identified several consensus differentially expressed genes, and these genes were further confirmed with literature mining; at last, two genes, that is, immunoglobulin J chain and C-X-C motif chemokine ligand 17, were screened as novel gastric cancer associated genes. Experimental validation is proposed to further confirm this finding. PMID:28232943

  10. Energy dispersive X-ray diffraction to identify explosive substances: Spectra analysis procedure optimization

    Energy Technology Data Exchange (ETDEWEB)

    Crespy, C., E-mail: charles.crespy@insa-lyon.f [CNDRI-Insa Lyon, Universite de Lyon, F-69621, Villeurbanne cedex (France); Duvauchelle, P., E-mail: philippe.duvauchelle@insa-lyon.f [CNDRI-Insa Lyon, Universite de Lyon, F-69621, Villeurbanne cedex (France); Kaftandjian, V.; Soulez, F. [CNDRI-Insa Lyon, Universite de Lyon, F-69621, Villeurbanne cedex (France); Ponard, P. [Thales Components and Subsystems, 2 rue Marcel Dassault 78491, Velizy cedex (France)

    2010-11-21

    To detect the presence of explosives in packages, automated systems are required. Energy dispersive X-ray diffraction (EDXRD) represents a powerful non-invasive tool providing information on the atomic structure of samples. In this paper, EDXRD is investigated as a suitable technique for explosive detection and identification. To this end, a database has been constructed, containing measured X-ray diffraction spectra of several explosives and common materials. In order to quantify the influence of spectral resolution, this procedure is repeated with two detectors of different spectral resolution. Using our database, some standard spectrum analysis procedures generally used for this application have been implemented. From these results, it is possible to assess the robustness and the limits of each analysis procedure. The aim of this work is to define a robust and efficient sequence of EDXRD spectrum analysis to discriminate explosive substances. Since our explosive substances are crystalline, the first step consists in using characteristics of the spectrum to estimate a crystallinity criterion, which allows a large part of the common materials to be ruled out. The second step is a more detailed analysis: it uses a similarity criterion and the locations of major peaks to differentiate explosives from crystalline common materials. The influence of the spectral resolution on the detection is also examined.
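    The two-step screening described above can be sketched on synthetic spectra. The crystallinity metric below (peak counts above a smoothed background) and the cosine similarity are illustrative assumptions, not the paper's exact criteria:

```python
import numpy as np
from scipy.signal import find_peaks

def crystallinity(spectrum):
    """Crude crystallinity criterion: fraction of total counts sitting in
    sharp peaks above a moving-average background (an assumed metric)."""
    background = np.convolve(spectrum, np.ones(25) / 25, mode="same")
    excess = np.clip(spectrum - background, 0, None)
    return excess.sum() / spectrum.sum()

def similarity(a, b):
    """Cosine similarity between two diffraction spectra."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

energies = np.linspace(20, 100, 800)
# Synthetic spectra: a crystalline material shows sharp Bragg peaks,
# an amorphous one only a broad hump over the same baseline.
crystalline = (np.exp(-((energies - 40) ** 2) / 0.5)
               + np.exp(-((energies - 65) ** 2) / 0.5) + 0.05)
amorphous = 0.4 * np.exp(-((energies - 55) ** 2) / 400) + 0.05

# Step 1: crystallinity screen removes most amorphous common materials.
print(crystallinity(crystalline), crystallinity(amorphous))
# Step 2: major peak locations feed the detailed comparison against the database.
peaks, _ = find_peaks(crystalline, prominence=0.5)
print(energies[peaks])
```

    The detailed second step would compare both the similarity score and the peak positions against every crystalline entry in the reference database.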

  11. Gene-network analysis identifies susceptibility genes related to glycobiology in autism.

    Directory of Open Access Journals (Sweden)

    Bert van der Zwaag

    Full Text Available The recent identification of copy-number variation in the human genome has opened up new avenues for the discovery of positional candidate genes underlying complex genetic disorders, especially in the field of psychiatric disease. One major challenge that remains is pinpointing the susceptibility genes in the multitude of disease-associated loci. This challenge may be tackled by reconstruction of functional gene-networks from the genes residing in these loci. We applied this approach to autism spectrum disorder (ASD), and identified the copy-number changes in the DNA of 105 ASD patients and 267 healthy individuals with Illumina Humanhap300 Beadchips. Subsequently, we used a human reconstructed gene-network, Prioritizer, to rank candidate genes in the segmental gains and losses in our autism cohort. This analysis highlighted several candidate genes already known to be mutated in cognitive and neuropsychiatric disorders, including RAI1, BRD1, and LARGE. In addition, the LARGE gene was part of a sub-network of seven genes functioning in glycobiology, present in seven copy-number changes specifically identified in autism patients with limited co-morbidity. Three of these seven copy-number changes were de novo in the patients. In autism patients with a complex phenotype and healthy controls no such sub-network was identified. An independent systematic analysis of 13 published autism susceptibility loci supports the involvement of genes related to glycobiology, as we also identified the same or similar genes from those loci. Our findings suggest that the occurrence of genomic gains and losses of genes associated with glycobiology are important contributors to the development of ASD.

  12. Finite Element Creep-Fatigue Analysis of a Welded Furnace Roll for Identifying Failure Root Cause

    Science.gov (United States)

    Yang, Y. P.; Mohr, W. C.

    2015-11-01

    Creep-fatigue induced failures are often observed in engineering components operating under high temperature and cyclic loading. Understanding the creep-fatigue damage process and identifying the failure root cause are very important for preventing such failures and improving the lifetime of engineering components. Finite element analyses, including a heat transfer analysis and a creep-fatigue analysis, were conducted to model the cyclic thermal and mechanical process of a furnace roll in a continuous hot-dip coating line, where the roll typically has a short life. The heat transfer analysis was conducted to predict the temperature history of the roll by modeling heat convection from the hot air inside the furnace. The creep-fatigue analysis was performed by inputting the predicted temperature history and applying mechanical loads. The analysis results showed that the failure resulted from a creep-fatigue mechanism rather than a pure creep mechanism. The difference in material properties between the filler metal and the base metal is the root cause of the roll failure, as it induces higher creep strain and stress at the interface between the weld and the HAZ.

  13. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of…

  14. Low level radioactivity measurements with phoswich detectors using coincident techniques and digital pulse processing analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fuente, R. de la [University of Leon, Escuela de Ingenieria Industrial, Leon 24071 (Spain); Celis, B. de [University of Leon, Escuela de Ingenieria Industrial, Leon 24071 (Spain)], E-mail: bcelc@unileon.es; Canto, V. del; Lumbreras, J.M. [University of Leon, Escuela de Ingenieria Industrial, Leon 24071 (Spain); Celis, Alonso B. de [King's College London, IoP, De Crespigny Park, SE58AF (United Kingdom); Martin-Martin, A. [Laboratorio LIBRA, Edificio I-D, Paseo Belen 3. 47011 Valladolid (Spain); Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias. Po Prado de la Magdalena, s/n. 47005 Valladolid (Spain)], E-mail: alonsomm@libra.uva.es; Gutierrez-Villanueva, J.L. [Laboratorio LIBRA, Edificio I-D, Paseo Belen 3. 47011 Valladolid (Spain); Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias. Po Prado de la Magdalena, s/n. 47005 Valladolid (Spain)], E-mail: joselg@libra.uva.es

    2008-10-15

    A new system has been developed for the detection of low radioactivity levels of fission products and actinides using coincidence techniques. The device combines a phoswich detector for α/β/γ-ray recognition with a fast digital card for electronic pulse analysis. The phoswich can be used in a coincident mode by identifying the composed signal produced by the simultaneous detection of α/β particles and X-rays/γ particles. The technique of coincidences with phoswich detectors was proposed recently to verify the Nuclear Test Ban Treaty (NTBT), which established the necessity of monitoring low levels of gaseous fission products produced by underground nuclear explosions. With the device proposed here it is possible to identify the coincidence events and determine the energy and type of the coincident particles. The sensitivity of the system has been improved by employing liquid scintillators and a high-resolution low-energy germanium detector. In this case it is possible to identify transuranic nuclides present in environmental samples simultaneously by α/γ coincidence, without the necessity of performing radiochemical separation. The minimum detectable activity was estimated to be 0.01 Bq kg⁻¹ for 0.1 kg of soil and 1000 min counting.

  15. Systematic enrichment analysis of gene expression profiling studies identifies consensus pathways implicated in colorectal cancer development

    Directory of Open Access Journals (Sweden)

    Jesús Lascorz

    2011-01-01

    Full Text Available Background: A large number of gene expression profiling (GEP) studies on colorectal carcinogenesis have been performed, but no reliable gene signature has been identified so far due to the lack of reproducibility in the reported genes. There is growing evidence that functionally related genes, rather than individual genes, contribute to the etiology of complex traits. We used, as a novel approach, pathway enrichment tools to define functionally related genes that are consistently up- or down-regulated in colorectal carcinogenesis. Materials and Methods: We started the analysis with 242 unique annotated genes that had been reported by any of three recent meta-analyses covering GEP studies on genes differentially expressed in carcinoma vs. normal mucosa. Most of these genes (218, 91.9%) had been reported in at least three GEP studies. These 242 genes were submitted to bioinformatic analysis using a total of nine tools to detect enrichment of Gene Ontology (GO) categories or Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways. As a final consistency criterion, the pathway categories had to be enriched by several tools to be taken into consideration. Results: Our pathway-based enrichment analysis identified the categories of ribosomal protein constituents, extracellular matrix receptor interaction, carbonic anhydrase isozymes, and a general category related to inflammation and cellular response as significantly and consistently overrepresented entities. Conclusions: We triaged the genes covered by the published GEP literature on colorectal carcinogenesis and subjected them to multiple enrichment tools in order to identify the consistently enriched gene categories. These turned out to have known functional relationships to cancer development and thus deserve further investigation.
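    The core statistic behind such enrichment tools is a one-sided hypergeometric test for over-representation of a category in a gene list. The counts below are hypothetical, chosen only to illustrate the calculation:

```python
from scipy.stats import hypergeom

def enrichment_p(category_in_list, list_size, category_in_genome, genome_size):
    """One-sided hypergeometric (Fisher-style) p-value for over-representation
    of a gene category within a differentially expressed gene list."""
    # sf(k - 1) gives P(X >= k) for X ~ Hypergeom(genome, category, list draws).
    return hypergeom.sf(category_in_list - 1, genome_size,
                        category_in_genome, list_size)

# Hypothetical numbers: 20 of the 242 reported genes are ribosomal proteins,
# out of roughly 80 ribosomal-protein genes in a 20,000-gene genome.
p = enrichment_p(20, 242, 80, 20000)
print(p)  # a tiny p-value indicates the category is over-represented
```

    Requiring that a category pass this kind of test in several independent tools, as the record describes, guards against the biases of any single annotation database.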

  16. A sequence-based approach to identify reference genes for gene expression analysis

    Directory of Open Access Journals (Sweden)

    Chari Raj

    2010-08-01

    Full Text Available Abstract Background An important consideration when analyzing both microarray and quantitative PCR expression data is the selection of appropriate genes as endogenous controls or reference genes. This step is especially critical when identifying genes differentially expressed between datasets. Moreover, reference genes suitable in one context (e.g. lung cancer) may not be suitable in another (e.g. breast cancer). Currently, the main approach to identify reference genes involves the mining of expression microarray data for highly expressed and relatively constant transcripts across a sample set. A caveat here is the requirement for transcript normalization prior to analysis, and measurements obtained are relative, not absolute. Alternatively, as sequencing-based technologies provide digital quantitative output, absolute quantification ensues, and reference gene identification becomes more accurate. Methods Serial analysis of gene expression (SAGE) profiles of non-malignant and malignant lung samples were compared using a permutation test to identify the most stably expressed genes across all samples. Subsequently, the specificity of the reference genes was evaluated across multiple tissue types, their constancy of expression was assessed using quantitative RT-PCR (qPCR), and their impact on differential expression analysis of microarray data was evaluated. Results We show that (i) conventional reference genes such as ACTB and GAPDH are highly variable between cancerous and non-cancerous samples, (ii) reference genes identified for lung cancer do not perform well for other cancer types (breast and brain), (iii) reference genes identified through SAGE show low variability using qPCR in a different cohort of samples, and (iv) normalization of a lung cancer gene expression microarray dataset with or without our reference genes yields different results for differential gene expression and subsequent analyses. Specifically, key established pathways in lung

  17. Cluster analysis for identifying sub-groups and selecting potential discriminatory variables in human encephalitis

    Directory of Open Access Journals (Sweden)

    Crowcroft Natasha S

    2010-12-01

    Full Text Available Abstract Background Encephalitis is an acute clinical syndrome of the central nervous system (CNS), often associated with fatal outcome or permanent damage, including cognitive and behavioural impairment, affective disorders and epileptic seizures. Infection of the central nervous system is considered to be a major cause of encephalitis, and more than 100 different pathogens have been recognized as causative agents. However, a large proportion of cases have unknown disease etiology. Methods We perform hierarchical cluster analysis on a multicenter England encephalitis data set with the aim of identifying sub-groups in human encephalitis. We use the simple matching similarity measure, which is appropriate for binary data sets, and performed variable selection using cluster heatmaps. We also use heatmaps to visually assess underlying patterns in the data, identify the main clinical and laboratory features and identify potential risk factors associated with encephalitis. Results Our results identified fever, personality and behavioural change, headache and lethargy as the main characteristics of encephalitis. Diagnostic variables such as brain scan and measurements from cerebrospinal fluids are also identified as main indicators of encephalitis. Our analysis revealed six major clusters in the England encephalitis data set. However, marked within-cluster heterogeneity is observed in some of the big clusters, indicating possible sub-groups. Overall, the results show that patients are clustered according to symptom and diagnostic variables rather than causal agents. Exposure variables such as recent infection, sick person contact and animal contact have been identified as potential risk factors. Conclusions It is in general assumed and is a common practice to group encephalitis cases according to disease etiology. However, our results indicate that patients are clustered with respect to mainly symptom and diagnostic variables rather than causal agents.
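    Hierarchical clustering of binary symptom records with the simple matching measure can be sketched as follows. For binary vectors the simple-matching distance (1 minus the share of matching positions) equals the Hamming distance, so SciPy's `hamming` metric applies directly; the patient profiles below are simulated, not the England data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
# Hypothetical binary patient records (1 = feature present): two planted groups.
profile_a = np.array([1, 1, 1, 0, 0, 0, 1, 0])   # fever/behaviour-change type
profile_b = np.array([0, 0, 1, 1, 1, 1, 0, 1])   # seizure/lethargy type
patients = np.vstack([
    np.abs(profile_a - (rng.random((20, 8)) < 0.1)),  # noisy copies of profile A
    np.abs(profile_b - (rng.random((20, 8)) < 0.1)),  # noisy copies of profile B
]).astype(int)

# Simple-matching distance for binary data = Hamming distance.
dist = pdist(patients, metric="hamming")
tree = linkage(dist, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
# The two planted symptom profiles should separate into the two clusters.
print(labels)
```

    A cluster heatmap, as used in the record, is simply this linkage applied to both the patient rows and the variable columns, with the reordered binary matrix displayed as an image.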

  18. Factor Analysis of the DePaul Symptom Questionnaire: Identifying Core Domains.

    Science.gov (United States)

    Jason, Leonard A; Sunnquist, Madison; Brown, Abigail; Furst, Jacob; Cid, Marjoe; Farietta, Jillianna; Kot, Bobby; Bloomer, Craig; Nicholson, Laura; Williams, Yolonda; Jantke, Rachel; Newton, Julia L; Strand, Elin Bolle

    2015-09-01

    The present study attempted to identify critical symptom domains of individuals with Myalgic Encephalomyelitis (ME) and chronic fatigue syndrome (CFS). Using patient and control samples collected in the United States, Great Britain, and Norway, exploratory factor analysis (EFA) was used to establish the underlying factor structure of ME and CFS symptoms. The EFA suggested a four-factor solution: post-exertional malaise, cognitive dysfunction, sleep difficulties, and a combined factor consisting of neuroendocrine, autonomic, and immune dysfunction symptoms. The use of empirical methods could help better understand the fundamental symptom domains of this illness.
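    An exploratory factor analysis of this kind can be sketched with scikit-learn's `FactorAnalysis` and varimax rotation; the simulated symptom scores and the two-factor structure are illustrative assumptions, not the DePaul data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 300
# Hypothetical symptom scores driven by two latent factors
# (e.g. post-exertional malaise and sleep difficulties).
f1 = rng.normal(size=n)
f2 = rng.normal(size=n)
X = np.column_stack([
    0.9 * f1 + 0.3 * rng.normal(size=n),
    0.8 * f1 + 0.3 * rng.normal(size=n),
    0.9 * f2 + 0.3 * rng.normal(size=n),
    0.8 * f2 + 0.3 * rng.normal(size=n),
])

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)
# After varimax rotation each item should load mainly on one factor;
# symptom domains are read off the columns of the loading matrix.
print(np.round(fa.components_, 2))
```

    In a real EFA the number of factors would itself be chosen from the data (scree plot, parallel analysis) rather than fixed in advance as it is in this sketch.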

  19. Error Analysis for the Airborne Direct Georeferencing Technique

    Science.gov (United States)

    Elsharkawy, Ahmed S.; Habib, Ayman F.

    2016-10-01

    Direct Georeferencing was shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct Georeferencing without ground control relies on an extrapolation process only, particular focus has to be laid on the overall system calibration procedure. The accuracy performance of integrated GPS/inertial systems for direct Georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration, including the GPS/inertial component as well as the imaging sensor itself; remaining errors in the system calibration will therefore significantly decrease the quality of object point determination. This research paper presents an error analysis for the airborne direct Georeferencing technique, in which integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, compared with GPS/INS-supported AT, through the implementation of a certain amount of error on the EOP and boresight parameters, and studies the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known through a careful camera calibration procedure, and 37 ground control points are known through a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, i.e. the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, firstly, DG and GPS-supported AT have similar accuracy and, compared with the conventional aerial photography method, the two technologies reduce the dependence on ground control (used only for quality control purposes). Secondly, in DG the overall system calibration, including the GPS/inertial component as well as the imaging sensor itself, must be correct, since remaining calibration errors propagate directly into the ground coordinates.

  20. An Analysis of Nondestructive Evaluation Techniques for Polymer Matrix Composite Sandwich Materials

    Science.gov (United States)

    Cosgriff, Laura M.; Roberts, Gary D.; Binienda, Wieslaw K.; Zheng, Diahua; Averbeck, Timothy; Roth, Donald J.; Jeanneau, Philippe

    2006-01-01

    Structural sandwich materials composed of triaxially braided polymer matrix composite face sheets sandwiching a foam core are being utilized for applications including aerospace components and recreational equipment. Since full-scale components are being made from these sandwich materials, it is necessary to develop proper inspection practices for their manufacture and in-field use. Specifically, nondestructive evaluation (NDE) techniques need to be investigated for the analysis of components made from these materials. Hockey blades made from sandwich materials and a flat sandwich sample were examined with multiple NDE techniques, including thermographic, radiographic, and shearographic methods, to investigate damage induced in the blades and flat panel components. Hockey blades used during actual play and a flat polymer matrix composite sandwich sample with damage inserted into the foam core were investigated with each technique. NDE images from the samples are presented and discussed. Structural elements within each blade were observed with radiographic imaging. Damaged regions and some structural elements of the hockey blades were identified with thermographic imaging. Structural elements, damaged regions, and other material variations were detected in the hockey blades with shearography. Each technique's advantages and disadvantages were considered in making recommendations for the inspection of components made from these types of materials.

  1. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    1999-01-01

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study t

  2. Reliability and Sensitivity Analysis of Transonic Flutter Using Improved Line Sampling Technique

    Institute of Scientific and Technical Information of China (English)

    Song Shufang; Lu Zhenzhou; Zhang Weiwei; Ye Zhengyin

    2009-01-01

    The improved line sampling (LS) technique, an effective numerical simulation method, is employed to analyze the probabilistic characteristics and reliability sensitivity of flutter with random structural parameters in transonic flow. The improved LS technique is a novel methodology for reliability and sensitivity analysis of high-dimensionality, low-probability problems with implicit limit state functions, and it does not require any approximating surrogate of the implicit limit state equation. The improved LS is used to estimate the flutter reliability and sensitivity of a two-dimensional wing, in which some structural properties, such as frequency, gravity-center parameters and mass ratio, are considered as random variables. A computational fluid dynamics (CFD) based unsteady aerodynamic reduced order model (ROM) method is used to construct the aerodynamic state equations. Coupling the structural state equations with the aerodynamic state equations, the safety margin of flutter is defined using the critical flutter velocity. The results show that the improved LS technique can effectively decrease the computational cost of the random uncertainty analysis of flutter. The reliability sensitivity, defined as the partial derivative of the failure probability with respect to the distribution parameter of a random variable, can help to identify the important parameters and guide the structural optimization design.
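    The basic line sampling estimator (without the paper's refinements, and for an explicit toy limit state rather than the implicit flutter margin) can be sketched as follows:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def line_sampling_pf(g, alpha, dim, n_lines=200, seed=0):
    """Line sampling estimate of a failure probability in standard normal space.
    alpha: unit vector pointing toward the failure domain; g < 0 means failure."""
    rng = np.random.default_rng(seed)
    partials = []
    for _ in range(n_lines):
        z = rng.normal(size=dim)
        z_perp = z - (z @ alpha) * alpha          # project out the important direction
        # Distance along the important direction to the limit state g = 0.
        c = brentq(lambda t: g(z_perp + t * alpha), -10.0, 10.0)
        partials.append(norm.cdf(-c))             # exact 1-D probability per line
    return float(np.mean(partials))

# Toy linear safety margin g(x) = 3 - x1 - x2; the exact failure probability
# is Phi(-3 / sqrt(2)) because the design point lies at distance 3/sqrt(2).
w = np.array([1.0, 1.0])
g = lambda x: 3.0 - w @ x
alpha = w / np.linalg.norm(w)                     # important direction from the gradient
pf = line_sampling_pf(g, alpha, dim=2)
print(pf, norm.cdf(-3 / np.sqrt(2)))              # estimate vs. exact value
```

    For a linear limit state with the exact important direction, every line yields the same partial probability, so the estimator has zero variance; that near-zero variance for almost-linear problems is what makes line sampling attractive for low failure probabilities compared with crude Monte Carlo.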

  3. Noise Reduction Analysis of Radar Rainfall Using Chaotic Dynamics and Filtering Techniques

    Directory of Open Access Journals (Sweden)

    Soojun Kim

    2014-01-01

    Full Text Available The aim of this study is to evaluate filtering techniques that can remove the noise involved in a time series. For this, the Logistic series, which is a chaotic series, and a radar rainfall series are used for the evaluation of a low-pass filter (LF) and a Kalman filter (KF). Noise is added to the Logistic series at a chosen noise level, and the noise-added series is filtered by LF and KF for noise reduction. The evaluation of the LF and KF techniques is performed using the correlation coefficient, the standard error, the attractor, and the BDS statistic from chaos theory. The analysis results for the Logistic series clearly showed that KF is a better tool than LF for removing the noise. We also used the radar rainfall series to evaluate the noise reduction capabilities of LF and KF. In this case, it was difficult to distinguish which filtering technique is the better way for noise reduction when typical statistics such as the correlation coefficient and standard error were used. However, when the attractor and the BDS statistic were used for evaluating LF and KF, we could clearly identify that KF is better than LF.
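    A scalar Kalman filter of the kind compared in this record can be sketched as follows; the random-walk state model, the noise variances, and the sine test signal are illustrative assumptions, not the study's configuration:

```python
import numpy as np

def kalman_1d(z, q=0.01, r=0.09):
    """Scalar random-walk Kalman filter.
    q: assumed process-noise variance (tuning), r: measurement-noise variance."""
    x, p = z[0], 1.0
    out = np.empty_like(z)
    for i, zi in enumerate(z):
        p = p + q                      # predict: state uncertainty grows
        k = p / (p + r)                # Kalman gain
        x = x + k * (zi - x)           # update with the new measurement
        p = (1 - k) * p
        out[i] = x
    return out

rng = np.random.default_rng(4)
t = np.linspace(0, 4 * np.pi, 500)
clean = np.sin(t)
noisy = clean + rng.normal(0, 0.3, t.size)   # noise added at a chosen level
# r is set to the known measurement-noise variance (0.3 ** 2).

filtered = kalman_1d(noisy)
# Filtering should pull the series back toward the clean signal.
print(np.std(noisy - clean), np.std(filtered - clean))
```

    The evaluation in the record then compares such filtered output against the original series with the correlation coefficient, the standard error, and chaos-theoretic diagnostics (attractor reconstruction and the BDS statistic).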

  4. Data management and data analysis techniques in pharmacoepidemiological studies using a pre-planned multi-database approach

    DEFF Research Database (Denmark)

    Bazelier, Marloes T; Eriksson, Irene; de Vries, Frank

    2015-01-01

    PURPOSE: To identify pharmacoepidemiological multi-database studies and to describe data management and data analysis techniques used for combining data. METHODS: Systematic literature searches were conducted in PubMed and Embase, complemented by a manual literature search. We included...... Multi-database studies are a well-powered strategy to address safety issues and have increased in popularity. To be able to correctly interpret the results of these studies, it is important to systematically report on database management and analysis techniques, including central programming and heterogeneity testing.

  5. Whole Genome Analysis of Injectional Anthrax Identifies Two Disease Clusters Spanning More Than 13 Years

    Directory of Open Access Journals (Sweden)

    Paul Keim

    2015-11-01

    Lay Person Interpretation: Injectional anthrax has been plaguing heroin drug users across Europe for more than 10 years. In order to better understand this outbreak, we assessed genomic relationships of all available injectional anthrax strains from four countries spanning a >12-year period. Very few differences were identified using genome-based analysis, but these differentiated the isolates into two distinct clusters. This strongly supports a hypothesis of at least two separate anthrax spore contamination events, perhaps during drug production. Identification of two events would not have been possible from standard epidemiological analysis. These comprehensive data will be invaluable for classifying future injectional anthrax isolates and for future geographic attribution.

  6. Identifying Chemistry Prospective Teachers' Difficulties Encountered in Practice of The Subject Area Textbook Analysis Course

    Directory of Open Access Journals (Sweden)

    Zeynep Bak Kibar

    2010-12-01

    Full Text Available Prospective teachers should already be aware of possible mistakes in textbooks and have knowledge of textbook selection procedures and criteria. This knowledge is imparted to prospective teachers in the Subject Area Textbook Analysis course. Identifying the difficulties they encounter and the skills they gain is important for implementing this course effectively. To investigate these questions, a case study was conducted with 38 student teachers from the Department of Secondary Science and Mathematics Education Chemistry Teaching Program at the Karadeniz Technical University Fatih Faculty of Education. Results suggest that prospective teachers gained knowledge of research, teaching practice, report writing, and textbook analysis. It was also determined that they had difficulties in group work, literature review, report writing, textbook analysis, and critical analysis.

  7. Gene expression meta-analysis identifies metastatic pathways and transcription factors in breast cancer

    DEFF Research Database (Denmark)

    Thomassen, Mads; Tan, Qihua; Kruse, Torben

    2008-01-01

    studies. Besides classification of outcome, these global expression patterns may reflect biological mechanisms involved in metastasis of breast cancer. Our purpose has been to investigate pathways and transcription factors involved in metastasis by use of gene expression data sets. METHODS: We have... tumors compared to non-metastasizing tumors. Meta-analysis has been used to determine overrepresentation of pathways and transcription factor targets, concordantly deregulated in metastasizing breast tumors, in several data sets. RESULTS: The major findings are upregulation of cell cycle pathways... system, angiogenesis, DNA repair and several signal transduction pathways are associated with metastasis. Finally, several transcription factors, e.g. E2F, NFY, and YY1, are identified as being involved in metastasis. CONCLUSIONS: By pathway meta-analysis many biological mechanisms beyond major...

  8. Towards a typology of business process management professionals: identifying patterns of competences through latent semantic analysis

    Science.gov (United States)

    Müller, Oliver; Schmiedel, Theresa; Gorbacheva, Elena; vom Brocke, Jan

    2016-01-01

    While researchers have analysed the organisational competences that are required for successful Business Process Management (BPM) initiatives, individual BPM competences have not yet been studied in detail. In this study, latent semantic analysis is used to examine a collection of 1507 BPM-related job advertisements in order to develop a typology of BPM professionals. This empirical analysis reveals distinct ideal types and profiles of BPM professionals on several levels of abstraction. A closer look at these ideal types and profiles confirms that BPM is a boundary-spanning field that requires interdisciplinary sets of competence that range from technical competences to business and systems competences. Based on the study's findings, it is posited that individual and organisational alignment with the identified ideal types and profiles is likely to result in high employability and organisational BPM success.
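Latent semantic analysis of such job-advertisement text can be sketched as a truncated SVD of a term-document matrix, after which documents that use related vocabulary end up close together in the latent space. The tiny "advertisements" below are invented stand-ins for the study's 1,507 real BPM ads:

```python
import numpy as np

# Toy stand-ins for job advertisements: two "technical" ads and two
# "people-oriented" ads (hypothetical vocabulary, not from the study).
docs = [
    "bpmn modelling workflow automation erp".split(),
    "bpmn modelling lean improvement workflow".split(),
    "leadership communication stakeholder change".split(),
    "strategy leadership communication alignment".split(),
]
vocab = sorted({t for d in docs for t in d})

# Term-document matrix of raw term counts (rows = terms, columns = docs).
A = np.array([[d.count(t) for d in docs] for t in vocab], dtype=float)

# LSA: truncated SVD of the term-document matrix.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # documents in the k-dim latent space

def cos(a, b):
    """Cosine similarity between two latent document vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Ads sharing vocabulary should be far more similar than unrelated ads.
print(cos(doc_vecs[0], doc_vecs[1]), cos(doc_vecs[0], doc_vecs[2]))
```

Clustering the latent document vectors (the study derives "ideal types" of BPM professionals) would be the natural next step on top of this representation.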

  9. Analysis of the Quintilii’s Villa Bronzes by Spectroscopy Techniques

    Directory of Open Access Journals (Sweden)

    Fabio Stranges

    2014-01-01

    Full Text Available The aim of this work is the characterization, with different diagnostic tests, of three fragments of bronze artefacts recovered from the Villa of the Quintilii (located south of Rome). In particular, the sample alloys were investigated by different chemical and morphological analyses. Firstly, an analysis of the alloy, carried out by electron spectroscopy, was performed to determine the bronze morphology and its elemental composition. Subsequently, a surface analysis was carried out by molecular spectroscopy to identify the alteration patinas on the surfaces (such as bronze disease). Two diagnostic techniques were used for the alloy analysis: scanning electron microscopy (SEM) coupled with EDX spectroscopy (to study the morphology and alloy composition) and Auger electron spectroscopy (AES) (to identify the oxidation state of each element). Moreover, for the study of the surface patinas, IR and Raman spectroscopies were employed. All studies were performed on the “as received” samples, covered by a thin layer of excavated soil, and on samples processed in an aqueous solution of sulphuric acid (10%) to remove patinas and alterations.

  10. Identifying past fire regimes throughout the Holocene in Ireland using new and established methods of charcoal analysis

    Science.gov (United States)

    Hawthorne, Donna; Mitchell, Fraser J. G.

    2016-04-01

    Globally, in recent years there has been an increase in the scale, intensity and level of destruction caused by wildfires. This can be seen in Ireland, where significant changes in vegetation, land use, agriculture and policy have promoted an increase in fires in the Irish landscape. This study examines wildfire throughout the Holocene, drawing on lacustrine charcoal records from seven study sites spread across Ireland to reconstruct the past fire regime recorded at each site. This work utilises new and established methods of fire history reconstruction to provide a recommended analytical procedure for statistical charcoal analysis. Digital charcoal counting was used and fire regime reconstructions were carried out with the CharAnalysis program. To verify this record, new techniques are employed: an ensemble-member strategy to remove the subjectivity associated with parameter selection, a signal-to-noise index to determine whether the charcoal record is appropriate for peak detection, and a charcoal peak screening procedure to validate the identified fire events based on bootstrapped samples. This analysis represents the first study of its kind in Ireland, examining the past record of fire on a multi-site and palaeoecological timescale, and will provide a baseline of data which can be built on in the future, when the frequency and intensity of fire are predicted to increase.

  11. The use of environmental monitoring as a technique to identify isotopic enrichment activities; O uso da monitoracao ambiental como tecnica de identificacao de atividades de enriquecimento isotopico

    Energy Technology Data Exchange (ETDEWEB)

    Buchmann, Jose Henrique

    2000-07-01

    The use of environmental monitoring as a technique to identify activities related to the nuclear fuel cycle has been proposed, by international organizations, as an additional measure to the safeguards agreements in force. The elements specific to each kind of nuclear activity, or nuclear signatures, inserted into the ecosystem by several transfer paths, can be intercepted with varying ability by different living organisms. Depending on the kind of signature of interest, identifying and quantifying the anthropogenic material requires the choice of adequate biological indicators and, mainly, the use of sophisticated techniques associated with elaborate sample treatments. This work demonstrates the technical viability of using pine needles as bioindicators of nuclear signatures associated with uranium enrichment activities. Additionally, it proposes the use of a technique widely diffused nowadays in the scientific community, high-resolution inductively coupled plasma mass spectrometry (HR-ICP-MS), to identify the signature corresponding to that kind of activity in the ecosystem. A description is also given of a methodology recently applied in analytical chemistry, based on metrological concepts of uncertainty estimation, used to calculate the uncertainties associated with the measurement results. Nitric acid solutions with a concentration of 0.3 mol.kg{sup -1}, used to wash pine needles sampled near facilities that manipulate enriched uranium and containing only 0.1 {mu}g.kg{sup -1} of uranium, exhibit a {sup 235}U:{sup 238}U isotopic abundance ratio of 0.0092{+-}0.0002, while solutions originating from samples collected at places located more than 200 km from activities related to the nuclear fuel cycle exhibit a value of 0.0074{+-}0.0002 for this abundance ratio. Similar results obtained for samples collected at different places confirm the presence of anthropogenic uranium and demonstrate the viability of using

  12. A fractal analysis of skin pigmented lesions using the novel tool of the variogram technique

    Energy Technology Data Exchange (ETDEWEB)

    Mastrolonardo, Mario [Department of Medical and Occupational Sciences, Unit of Dermatology, Azienda Ospedaliero-Universitaria 'Ospedali Riuniti' di Foggia (Italy)]. E-mail: mariomastrolonardo@libero.it; Conte, Elio [Department of Medical and Occupational Sciences, Unit of Dermatology, Azienda Ospedaliero-Universitaria 'Ospedali Riuniti' di Foggia (Italy); Department of Pharmacology and Human Physiology, TIRES-Center for Innovative Technology for Signal Detection and Processing, Bari University, 70100 Bari (Italy)]; Zbilut, Joseph P. [Department of Molecular Biophysics and Physiology, Rush University, Chicago, IL 60612 (United States)]

    2006-06-15

    The incidence of cutaneous malignant melanoma is increasing rapidly in the world [Ferlay J, Bray F, Pisani P, et al. GLOBOCAN 2000: Cancer incidence, mortality and prevalence worldwide, Version 1.0. IARC Cancer Base no. 5. Lyon: IARC Press; 2001]. Choosing the right therapy requires a method with high sensitivity and the capability to diagnose the disease at an early stage. We introduce a new diagnostic method based on non-linear methodologies. In detail, we suggest that fractal, noise and chaos dynamics are the most important components responsible for the genetic instability of melanocytes. As a consequence, we introduce the new technique of the variogram and of fractal analysis, extended to whole regions of interest of the skin, in order to obtain parameters able to identify the malignant lesion. In a preliminary analysis, satisfactory results are reached.
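The variogram technique can be sketched on a one-dimensional intensity profile: the empirical semivariogram gamma(h) = 0.5 E[(z(x+h) − z(x))²] is computed over increasing lags, and for a self-affine profile the log-log slope yields a fractal dimension. The synthetic "pigmentation profile" below is an illustrative stand-in for real lesion imagery:

```python
import numpy as np

def semivariogram(z, max_lag):
    """Empirical semivariogram gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2]."""
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

def fractal_dimension(z, max_lag=20):
    """For a self-affine profile gamma(h) ~ h^(2H); D = 2 - H for a 1-D profile."""
    lags, gamma = semivariogram(z, max_lag)
    slope, _ = np.polyfit(np.log(lags), np.log(gamma), 1)
    H = slope / 2.0
    return 2.0 - H

# Synthetic profile: Brownian motion has Hurst exponent H = 0.5, so D ~ 1.5.
rng = np.random.default_rng(1)
profile = np.cumsum(rng.normal(size=5000))
print(f"estimated fractal dimension: {fractal_dimension(profile):.2f}")
```

For lesion images the same estimator is applied along transects or in two dimensions; this sketch only shows the core variogram-to-dimension computation.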

  13. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
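Classical (Torgerson) multidimensional scaling recovers a point configuration from a matrix of pairwise dissimilarities by double-centring the squared distances and taking the top eigenvectors of the resulting Gram matrix. The four "observatories" below are hypothetical points used only to show that the embedding reproduces the input distances:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed n points in k dims from a distance matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centring matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]             # k largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Hypothetical example: 4 stations placed on a 2-D plane.
pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [3.0, 4.0]])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

X = classical_mds(D, k=2)
# The recovered configuration reproduces the original inter-point distances
# (up to rotation/reflection, which leave distances unchanged).
D_rec = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
print(np.allclose(D, D_rec))
```

In the storm-analysis setting, the dissimilarity matrix would instead be built from differences between H-component storm-time curves at the observatories.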

  14. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2012-01-01

    the CFD load to be able to be readily applied, along with analytical and experimental variations in both the temporal and spatial Fourier components of the excitation. In addition, this model is a first step in identifying response differences between transient and frequency forced response analysis techniques. The second phase assesses this difference for a much more realistic solid model of a bladed disk in order to evaluate the effect of the spatial variation in loading on blade-dominated modes. Neither research on the accuracy of the frequency response method when used in this context nor a comprehensive study of the effect of test-observed variation on blade forced response has been found in the literature, so this research is a new contribution to practical structural dynamic analysis of gas turbines. The primary excitation of the upstream nozzles interacting with the blades on the fuel pump of the J2X causes the 5th nodal diameter (5ND) modes to be excited, as explained by Tyler and Sofrin [1], so a modal analysis was first performed on the beam/plate model and the 5ND bladed-disk mode at 40,167 Hz was identified and chosen to be the one excited at resonance (see figure 1). The first forced response analysis with this model focuses on identifying differences between frequency and transient response analyses. A hypothesis going into the analysis was that the frequency response might enforce a temporal periodicity that does not really exist, and would therefore overestimate the response. As high dynamic response was a considerable source of stress in the J2X, examining this concept could potentially be beneficial for the program.

  15. Multicomponent statistical analysis to identify flow and transport processes in a highly-complex environment

    Science.gov (United States)

    Moeck, Christian; Radny, Dirk; Borer, Paul; Rothardt, Judith; Auckenthaler, Adrian; Berg, Michael; Schirmer, Mario

    2016-11-01

    A combined approach of multivariate statistical analysis, namely factor analysis (FA) and hierarchical cluster analysis (HCA), interpretation of geochemical processes, stable water isotope data and organic micropollutants was applied to assess spatial patterns of water types in a study area in Switzerland where drinking water production is close to different potential input pathways for contamination. To avoid drinking water contamination, artificial groundwater recharge of surface water into an aquifer is used to create a hydraulic barrier between the potential contamination pathways and the drinking water extraction wells. Inter-aquifer mixing in the subsurface is identified, where a large amount of artificially infiltrated surface water mixes with a smaller amount of water originating from the regional flow path in the vicinity of the drinking water extraction wells. The spatial distribution of the different water types can be estimated and a conceptual understanding of the system is developed. The results of the multivariate statistical analysis are consistent with the information gained from the isotope data and the organic micropollutant analyses. The integrated approach, using different kinds of observations, can easily be transferred to a variety of hydrological settings to synthesise and evaluate large hydrochemical datasets. Combining additional data with different information content is conceivable and enabled effective interpretation of hydrological processes. The applied approach leads to a sounder conceptual understanding of the system, which is the very basis for developing improved, sustainable water resources management practices.
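The FA/HCA pairing can be sketched on a synthetic hydrochemical table: factor loadings are extracted from the correlation matrix of standardised parameters (principal-component extraction, one common FA variant), and Ward-linkage clustering groups the samples into water types. The two simulated "water types" and the parameter names are invented for illustration:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# Synthetic table: 30 wells x 4 parameters (e.g. Ca, Mg, Cl, NO3 in mg/L).
# Two hypothetical water types: low vs high mineralisation.
type_a = rng.normal([40, 10, 15, 5],  [5, 2, 3, 1], size=(15, 4))
type_b = rng.normal([90, 30, 60, 20], [5, 2, 3, 1], size=(15, 4))
X = np.vstack([type_a, type_b])

# Factor analysis (principal-component extraction): loadings from the
# eigendecomposition of the correlation matrix of standardised data.
Z = (X - X.mean(0)) / X.std(0)
w, V = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(w)[::-1]
loadings = V[:, order] * np.sqrt(w[order])      # columns = factors
print("factor 1 loadings:", np.round(loadings[:, 0], 2))

# Hierarchical cluster analysis (Ward linkage) on the standardised samples.
labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```

With well-separated water types, the two-cluster solution recovers the two simulated groups; in practice the factor scores and cluster labels are then interpreted against geochemistry, isotopes and micropollutant data.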

  16. Identifying changes in the support networks of end-of-life carers using social network analysis.

    Science.gov (United States)

    Leonard, Rosemary; Horsfall, Debbie; Noonan, Kerrie

    2015-06-01

    End-of-life caring is often associated with reduced social networks for both the dying person and the carer. However, those adopting a community participation and development approach see the potential for the expansion and strengthening of networks. This paper uses Knox, Savage and Harvey's definitions of three generations of social network analysis to analyse the caring networks of people with a terminal illness who are being cared for at home, and identifies changes in these caring networks that occurred over the period of caring. Participatory network mapping of initial and current networks was used in nine focus groups. The analysis used key concepts from social network analysis (size, density, transitivity, betweenness and local clustering) together with qualitative analyses of the groups' reflections on the maps. The results showed an increase in the size of the networks and a strengthening of the ties between the original members. The qualitative data revealed the importance of both core and peripheral network members and their diverse contributions. The research supports the value of third-generation social network analysis and the potential for end-of-life caring to build social capital.
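The network measures named above (size, density, transitivity, betweenness, local clustering) can be computed directly with NetworkX. The two small caring networks below are hypothetical, constructed only to show how the measures change as a network grows and densifies:

```python
import networkx as nx

def network_summary(G):
    """Measures used in the paper: size, density, transitivity,
    betweenness and local clustering."""
    return {
        "size": G.number_of_nodes(),
        "density": nx.density(G),
        "transitivity": nx.transitivity(G),
        "betweenness": nx.betweenness_centrality(G),
        "local_clustering": nx.clustering(G),
    }

# Hypothetical initial caring network: carer linked to three helpers (a star).
initial = nx.Graph([("carer", "sister"), ("carer", "neighbour"), ("carer", "friend")])

# Hypothetical current network: a new member joined and existing members
# connected to each other, as the study observed over the caring period.
current = initial.copy()
current.add_edges_from([("sister", "neighbour"), ("friend", "volunteer"),
                        ("carer", "volunteer"), ("neighbour", "friend")])

for name, G in [("initial", initial), ("current", current)]:
    s = network_summary(G)
    print(name, s["size"], round(s["density"], 2), round(s["transitivity"], 2))
```

Here the size grows from 4 to 5 nodes and the density from 0.5 to 0.7, mirroring the paper's finding that networks expanded and ties strengthened rather than shrank.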

  17. Comprehensive genomic analysis of malignant pleural mesothelioma identifies recurrent mutations, gene fusions and splicing alterations.

    Science.gov (United States)

    Bueno, Raphael; Stawiski, Eric W; Goldstein, Leonard D; Durinck, Steffen; De Rienzo, Assunta; Modrusan, Zora; Gnad, Florian; Nguyen, Thong T; Jaiswal, Bijay S; Chirieac, Lucian R; Sciaranghella, Daniele; Dao, Nhien; Gustafson, Corinne E; Munir, Kiara J; Hackney, Jason A; Chaudhuri, Amitabha; Gupta, Ravi; Guillory, Joseph; Toy, Karen; Ha, Connie; Chen, Ying-Jiun; Stinson, Jeremy; Chaudhuri, Subhra; Zhang, Na; Wu, Thomas D; Sugarbaker, David J; de Sauvage, Frederic J; Richards, William G; Seshagiri, Somasekar

    2016-04-01

    We analyzed transcriptomes (n = 211), whole exomes (n = 99) and targeted exomes (n = 103) from 216 malignant pleural mesothelioma (MPM) tumors. Using RNA-seq data, we identified four distinct molecular subtypes: sarcomatoid, epithelioid, biphasic-epithelioid (biphasic-E) and biphasic-sarcomatoid (biphasic-S). Through exome analysis, we found BAP1, NF2, TP53, SETD2, DDX3X, ULK2, RYR2, CFAP45, SETDB1 and DDX51 to be significantly mutated (q-score ≥ 0.8) in MPMs. We identified recurrent mutations in several genes, including SF3B1 (∼2%; 4/216) and TRAF7 (∼2%; 5/216). SF3B1-mutant samples showed a splicing profile distinct from that of wild-type tumors. TRAF7 alterations occurred primarily in the WD40 domain and were, except in one case, mutually exclusive with NF2 alterations. We found recurrent gene fusions and splice alterations to be frequent mechanisms for inactivation of NF2, BAP1 and SETD2. Through integrated analyses, we identified alterations in Hippo, mTOR, histone methylation, RNA helicase and p53 signaling pathways in MPMs.

  18. Gene network analysis in a pediatric cohort identifies novel lung function genes.

    Directory of Open Access Journals (Sweden)

    Bruce A Ong

    Full Text Available Lung function is a heritable trait and serves as an important clinical predictor of morbidity and mortality for pulmonary conditions in adults; however, despite its importance, no studies have focused on uncovering pediatric-specific loci influencing lung function. To identify novel genetic determinants of pediatric lung function, we conducted a genome-wide association study (GWAS) of four pulmonary function traits, including FVC, FEV1, FEV1/FVC and FEF25-75%, in 1556 children. Further, we carried out gene network analyses for each trait including all SNPs with a P-value of <1.0×10−3 from the individual GWAS. The GWAS identified SNPs with notable trends towards association with the pulmonary function measures, including the previously described INTS12 locus association with FEV1 (Pmeta = 1.41×10−7). The gene network analyses identified 34 networks of genes associated with pulmonary function variables in Caucasians. Of those, the glycoprotein gene network reached genome-wide significance for all four variables (meta-analysis P-values ranging from 6.29×10−4 to 2.80×10−8). In this study, we report on specific pathways that are significantly associated with pediatric lung function at genome-wide significance. In addition, we report the first loci associated with lung function in both pediatric Caucasian and African American populations.

  19. Genome-wide association scan meta-analysis identifies three Loci influencing adiposity and fat distribution.

    Directory of Open Access Journals (Sweden)

    Cecilia M Lindgren

    2009-06-01

    Full Text Available To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS; N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for overall adiposity or height. Follow-up studies in a maximum of 70,689 individuals identified two loci strongly associated with measures of central adiposity; these map near TFAP2B (WC, P = 1.9×10−11) and MSRA (WC, P = 8.9×10−9). A third locus, near LYPLAL1, was associated with WHR in women only (P = 2.6×10−8). The variants near TFAP2B appear to influence central adiposity through an effect on overall obesity/fat-mass, whereas LYPLAL1 displays a strong female-only association with fat distribution. By focusing on anthropometric measures of central obesity and fat distribution, we have identified three loci implicated in the regulation of human adiposity.
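The core of a GWAS meta-analysis is pooling per-study effect estimates by inverse-variance weighting. The sketch below shows the standard fixed-effect computation; the SNP effect sizes and standard errors are invented numbers, not values from this study:

```python
import numpy as np

def inverse_variance_meta(betas, ses):
    """Fixed-effect inverse-variance meta-analysis of per-study effect sizes."""
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # weight = 1 / SE^2
    betas = np.asarray(betas, dtype=float)
    beta = np.sum(w * betas) / np.sum(w)          # pooled effect
    se = np.sqrt(1.0 / np.sum(w))                 # pooled standard error
    z = beta / se                                 # test statistic
    return beta, se, z

# Hypothetical per-study effects of one SNP on waist circumference from 3 GWAS.
betas = [0.12, 0.09, 0.15]
ses = [0.04, 0.05, 0.06]
beta, se, z = inverse_variance_meta(betas, ses)
print(f"pooled beta = {beta:.3f}, SE = {se:.3f}, Z = {z:.2f}")
```

Because the pooled standard error is always smaller than the smallest per-study SE, combining cohorts is what makes P-values at the 10−8 scale attainable.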

  20. Quantitative bioassay to identify antimicrobial drugs through drug interaction fingerprint analysis.

    Science.gov (United States)

    Weinstein, Zohar B; Zaman, Muhammad H

    2017-02-16

    Drug interaction analysis, which reports the extent to which the presence of one drug affects the efficacy of another, is a powerful tool for selecting potent combinatorial therapies and predicting connectivity between cellular components. Combinatorial effects of drug pairs often vary even for drugs with similar mechanisms of action; drug interaction fingerprinting may therefore be harnessed to differentiate drug identities. We developed a method to analyze drug interactions for the purpose of identifying active pharmaceutical ingredients, an essential step in assessing drug quality. Our approach identifies active pharmaceutical ingredients by comparing drug interaction fingerprint similarity metrics such as correlation and Euclidean distance. To expedite this method, we used bioluminescent E. coli in a simplified checkerboard assay to generate unique drug interaction fingerprints of antimicrobial drugs. Of 30 antibiotics studied, 29 could be identified based on their drug interaction fingerprints. We present drug interaction fingerprint analysis as a cheap, sensitive and quantitative method for substandard and counterfeit drug detection.
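The fingerprint-matching idea can be sketched numerically: each drug's checkerboard of combined effects is flattened into a fingerprint vector, and an unknown sample is assigned to the reference with the nearest fingerprint. Everything below (the drug names, Bliss-independence combination model, dose-response values and noise level) is an illustrative assumption, not the paper's assay:

```python
import numpy as np

rng = np.random.default_rng(3)

def fingerprint(drug_effect, probe_effects):
    """Simplified checkerboard: combined inhibition of a test drug with each
    probe dose under Bliss independence, plus measurement noise.
    The flattened matrix is the drug's interaction fingerprint."""
    a = drug_effect[:, None]                   # doses of the test drug
    b = probe_effects[None, :]                 # doses of the probe drug
    combined = a + b - a * b                   # Bliss-independent combination
    return (combined + rng.normal(0, 0.01, combined.shape)).ravel()

# Reference library: dose-response vectors (fraction inhibited) for 3 drugs.
library = {
    "ampicillin":    np.array([0.10, 0.40, 0.80]),
    "tetracycline":  np.array([0.20, 0.50, 0.70]),
    "ciprofloxacin": np.array([0.05, 0.60, 0.90]),
}
probes = np.array([0.1, 0.3, 0.6, 0.9])        # fixed probe-drug effects
refs = {name: fingerprint(eff, probes) for name, eff in library.items()}

# An "unknown" sample: a fresh (independently noisy) tetracycline fingerprint.
unknown = fingerprint(library["tetracycline"], probes)

# Identify it as the reference with the smallest Euclidean distance;
# Pearson correlation could be used as the similarity metric the same way.
best = min(refs, key=lambda n: np.linalg.norm(refs[n] - unknown))
print("identified as:", best)
```

A substandard or counterfeit sample would fail to match its labelled ingredient's fingerprint, which is the detection principle the abstract describes.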

  1. Pathway analysis of smoking quantity in multiple GWAS identifies cholinergic and sensory pathways.

    Directory of Open Access Journals (Sweden)

    Oscar Harari

    Full Text Available Cigarette smoking is a common addiction that increases the risk for many diseases, including lung cancer and chronic obstructive pulmonary disease. Genome-wide association studies (GWAS) have successfully identified and validated several susceptibility loci for nicotine consumption and dependence. However, the trait variance explained by these genes is only a small fraction of the estimated genetic risk. Pathway analysis complements single-marker methods by including biological knowledge in the evaluation of GWAS, under the assumption that causal variants lie in functionally related genes, enabling the evaluation of a broad range of signals. Our approach to the identification of pathways enriched for multiple genes associated with smoking quantity includes the analysis of two studies and the replication of common findings in a third dataset. This study identified pathways for the cholinergic receptors, which included SNPs known to be genome-wide significant, as well as novel pathways, such as genes involved in the sensory perception of smell, that do not contain any single SNP that achieves that stringent threshold.

  2. Genome-wide association study meta-analysis identifies seven new rheumatoid arthritis risk loci

    Science.gov (United States)

    Stahl, Eli A.; Raychaudhuri, Soumya; Remmers, Elaine F.; Xie, Gang; Eyre, Stephen; Thomson, Brian P.; Li, Yonghong; Kurreeman, Fina A. S.; Zhernakova, Alexandra; Hinks, Anne; Guiducci, Candace; Chen, Robert; Alfredsson, Lars; Amos, Christopher I.; Ardlie, Kristin G.; Barton, Anne; Bowes, John; Brouwer, Elisabeth; Burtt, Noel P.; Catanese, Joseph J.; Coblyn, Jonathan; Coenen, Marieke JH; Costenbader, Karen H.; Criswell, Lindsey A.; Crusius, J. Bart A.; Cui, Jing; de Bakker, Paul I.W.; De Jager, Phillip L.; Ding, Bo; Emery, Paul; Flynn, Edward; Harrison, Pille; Hocking, Lynne J.; Huizinga, Tom W. J.; Kastner, Daniel L.; Ke, Xiayi; Lee, Annette T.; Liu, Xiangdong; Martin, Paul; Morgan, Ann W.; Padyukov, Leonid; Posthumus, Marcel D.; Radstake, Timothy RDJ; Reid, David M.; Seielstad, Mark; Seldin, Michael F.; Shadick, Nancy A.; Steer, Sophia; Tak, Paul P.; Thomson, Wendy; van der Helm-van Mil, Annette H. M.; van der Horst-Bruinsma, Irene E.; van der Schoot, C. Ellen; van Riel, Piet LCM; Weinblatt, Michael E.; Wilson, Anthony G.; Wolbink, Gert Jan; Wordsworth, Paul; Wijmenga, Cisca; Karlson, Elizabeth W.; Toes, Rene E. M.; de Vries, Niek; Begovich, Ann B.; Worthington, Jane; Siminovitch, Katherine A.; Gregersen, Peter K.; Klareskog, Lars; Plenge, Robert M.

    2014-01-01

    To identify novel genetic risk factors for rheumatoid arthritis (RA), we conducted a genome-wide association study (GWAS) meta-analysis of 5,539 autoantibody positive RA cases and 20,169 controls of European descent, followed by replication in an independent set of 6,768 RA cases and 8,806 controls. Of 34 SNPs selected for replication, 7 novel RA risk alleles were identified at genome-wide significance (P<5×10−8) in analysis of all 41,282 samples. The associated SNPs are near genes of known immune function, including IL6ST, SPRED2, RBPJ, CCR6, IRF5, and PXK. We also refined the risk alleles at two established RA risk loci (IL2RA and CCL21) and confirmed the association at AFF3. These new associations bring the total number of confirmed RA risk loci to 31 among individuals of European ancestry. An additional 11 SNPs replicated at P<0.05, many of which are validated autoimmune risk alleles, suggesting that most represent bona fide RA risk alleles. PMID:20453842

  3. In Vivo Imaging Techniques: A New Era for Histochemical Analysis

    Science.gov (United States)

    Busato, A.; Feruglio, P. Fumene; Parnigotto, P.P.; Marzola, P.; Sbarbati, A.

    2016-01-01

    In vivo imaging techniques can be integrated with classical histochemistry to create an actual histochemistry of water. In particular, Magnetic Resonance Imaging (MRI), an imaging technique primarily used as a diagnostic tool in clinical and preclinical research, has excellent anatomical resolution, unlimited penetration depth and intrinsic soft tissue contrast. Thanks to technological developments, MRI can provide not only morphological information but also, more interestingly, functional, biophysical and molecular information. In this paper we describe the main features of several advanced imaging techniques, such as MRI microscopy, Magnetic Resonance Spectroscopy, functional MRI, Diffusion Tensor Imaging and contrast-enhanced MRI, as a useful support to classical histochemistry. PMID:28076937

  4. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.
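The "sequential similarity detection" idea mentioned above can be sketched as template matching with early abandonment: at each candidate position the absolute-difference error is accumulated and the position is discarded as soon as the running error exceeds a threshold, which is far cheaper than evaluating a full correlation everywhere. This toy implementation and its threshold value are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def ssda_match(image, template, threshold):
    """Sequential similarity detection (SSDA): scan candidate positions,
    accumulating absolute differences and abandoning a position early once
    the running error exceeds the threshold."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = np.inf, None
    flat_t = template.ravel()
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            err = 0.0
            window = image[r:r + th, c:c + tw].ravel()
            for a, b in zip(window, flat_t):
                err += abs(a - b)
                if err > threshold:            # early abandonment
                    break
            else:                              # position survived the scan
                if err < best:
                    best, best_pos = err, (r, c)
    return best_pos, best

rng = np.random.default_rng(4)
img = rng.random((20, 20))
tmpl = img[7:12, 3:8].copy()                   # plant the template at (7, 3)
pos, err = ssda_match(img, tmpl, threshold=1.0)
print(pos)                                     # -> (7, 3)
```

In the paper's registration setting the same abandonment trick makes matched filtering tractable over large aerial images, since almost all mismatched positions are rejected after a handful of pixel comparisons.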

  5. Proteomic analysis identifies interleukin 11 regulated plasma membrane proteins in human endometrial epithelial cells in vitro

    Directory of Open Access Journals (Sweden)

    Stanton Peter G

    2011-05-01

    Full Text Available Abstract Background During the peri-implantation period, the embryo adheres to an adequately prepared or receptive endometrial surface epithelium. Abnormal embryo adhesion to the endometrium results in embryo implantation failure and infertility. Endometrial epithelial cell plasma membrane proteins critical in regulating adhesion may potentially be infertility biomarkers or targets for treating infertility. Interleukin (IL) 11 regulates human endometrial epithelial cell (hEEC) adhesion. Its production is abnormal in women with infertility. The objective of the study was to identify IL11-regulated plasma membrane proteins in hEEC in vitro using a proteomic approach. Methods Using a 2D-differential in-gel electrophoresis (2D-DIGE) approach combined with LC-MS/MS mass spectrometry, we identified 20 unique plasma membrane proteins differentially regulated by IL11 in ECC-1 cells, a hEEC-derived cell line. Two IL11-regulated proteins with known roles in cell adhesion, annexin A2 (ANXA2) and flotillin-1 (FLOT1), were validated by Western blot and immunocytochemistry in hEEC lines (ECC-1 and an additional cell line, Ishikawa) and primary hEEC. Flotillin-1 was further validated by immunohistochemistry in human endometrium throughout the menstrual cycle (n = 6-8/cycle). Results 2D-DIGE analysis identified 4 spots that were significantly different between the control and IL11-treated groups. From these 4 spots, 20 proteins were identified with LC-MS/MS. Two proteins, ANXA2 and FLOT1, were chosen for further analyses and were found to be significantly up-regulated following IL11 treatment. Western blot analysis showed a 2-fold and a 2.5-fold increase of ANXA2 in the hEEC membrane fraction of ECC-1 and Ishikawa cells, respectively. Similarly, a 1.8-fold and a 2.3/2.4-fold increase was observed for FLOT1 in the hEEC membrane fraction of ECC-1 and Ishikawa cells, respectively.
    In vitro, IL11 induced stronger ANXA2 expression on the cell surface of primary h

  6. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies use this tool to better understand their customers, to design targeted sales and marketing campaigns, to predict which products customers will buy and how often, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and suitability.

  7. RAPD analysis: a rapid technique for differentiation of spoilage yeasts

    NARCIS (Netherlands)

    Baleiras Couto, M.M.; Vossen, J.M.B.M. van der; Hofstra, H.; Huis in 't Veld, J.H.J.

    1994-01-01

    Techniques for the identification of the spoilage yeasts Saccharomyces cerevisiae and members of the genus Zygosaccharomyces from food and beverage sources were evaluated. The use of identification systems based on physiological characteristics often resulted in incomplete identification or misidentification.

  8. Transcriptomic analysis using olive varieties and breeding progenies identify candidate genes involved in plant architecture

    Directory of Open Access Journals (Sweden)

    Juan José González Plaza

    2016-03-01

    Full Text Available Plant architecture is a critical trait in fruit crops that can significantly influence yield, pruning, planting density and harvesting. Little is known about how plant architecture is genetically determined in olive, where most of the existing varieties are traditional, with an architecture poorly suited for modern growing and harvesting systems. In the present study, we have carried out microarray analysis of meristematic tissue to compare expression profiles of olive varieties displaying differences in architecture, as well as seedlings from their cross pooled on the basis of shared architecture-related phenotypes. The microarray used, previously developed by our group, has already been applied to identify candidate genes involved in regulating the juvenile-to-adult transition in the shoot apex of seedlings. Varieties with distinct architecture phenotypes and individuals from segregating progenies displaying opposite architecture features were used to link phenotype to expression. Here, we identify 2,252 differentially expressed genes associated with differences in plant architecture. Microarray results were validated by quantitative RT-PCR carried out on genes with functional annotation likely related to plant architecture. Twelve of these genes were further analyzed in individual seedlings of the corresponding pool. We also examined Arabidopsis mutants in putative orthologs of these targeted candidate genes, finding altered architecture for most of them. This supports a functional conservation between species and the potential biological relevance of the candidate genes identified. This study is the first to identify genes associated with plant architecture in olive, and the results obtained could be of great help in future programs aimed at selecting phenotypes adapted to modern cultivation practices in this species.

  9. Transcriptomic Analysis Using Olive Varieties and Breeding Progenies Identifies Candidate Genes Involved in Plant Architecture.

    Science.gov (United States)

    González-Plaza, Juan J; Ortiz-Martín, Inmaculada; Muñoz-Mérida, Antonio; García-López, Carmen; Sánchez-Sevilla, José F; Luque, Francisco; Trelles, Oswaldo; Bejarano, Eduardo R; De La Rosa, Raúl; Valpuesta, Victoriano; Beuzón, Carmen R

    2016-01-01

    Plant architecture is a critical trait in fruit crops that can significantly influence yield, pruning, planting density and harvesting. Little is known about how plant architecture is genetically determined in olive, where most of the existing varieties are traditional, with an architecture poorly suited for modern growing and harvesting systems. In the present study, we have carried out microarray analysis of meristematic tissue to compare expression profiles of olive varieties displaying differences in architecture, as well as seedlings from their cross pooled on the basis of shared architecture-related phenotypes. The microarray used, previously developed by our group, has already been applied to identify candidate genes involved in regulating the juvenile-to-adult transition in the shoot apex of seedlings. Varieties with distinct architecture phenotypes and individuals from segregating progenies displaying opposite architecture features were used to link phenotype to expression. Here, we identify 2252 differentially expressed genes (DEGs) associated with differences in plant architecture. Microarray results were validated by quantitative RT-PCR carried out on genes with functional annotation likely related to plant architecture. Twelve of these genes were further analyzed in individual seedlings of the corresponding pool. We also examined Arabidopsis mutants in putative orthologs of these targeted candidate genes, finding altered architecture for most of them. This supports a functional conservation between species and the potential biological relevance of the candidate genes identified. This study is the first to identify genes associated with plant architecture in olive, and the results obtained could be of great help in future programs aimed at selecting phenotypes adapted to modern cultivation practices in this species.

  10. Platelet-Related Variants Identified by Exomechip Meta-analysis in 157,293 Individuals.

    Science.gov (United States)

    Eicher, John D; Chami, Nathalie; Kacprowski, Tim; Nomura, Akihiro; Chen, Ming-Huei; Yanek, Lisa R; Tajuddin, Salman M; Schick, Ursula M; Slater, Andrew J; Pankratz, Nathan; Polfus, Linda; Schurmann, Claudia; Giri, Ayush; Brody, Jennifer A; Lange, Leslie A; Manichaikul, Ani; Hill, W David; Pazoki, Raha; Elliot, Paul; Evangelou, Evangelos; Tzoulaki, Ioanna; Gao, He; Vergnaud, Anne-Claire; Mathias, Rasika A; Becker, Diane M; Becker, Lewis C; Burt, Amber; Crosslin, David R; Lyytikäinen, Leo-Pekka; Nikus, Kjell; Hernesniemi, Jussi; Kähönen, Mika; Raitoharju, Emma; Mononen, Nina; Raitakari, Olli T; Lehtimäki, Terho; Cushman, Mary; Zakai, Neil A; Nickerson, Deborah A; Raffield, Laura M; Quarells, Rakale; Willer, Cristen J; Peloso, Gina M; Abecasis, Goncalo R; Liu, Dajiang J; Deloukas, Panos; Samani, Nilesh J; Schunkert, Heribert; Erdmann, Jeanette; Fornage, Myriam; Richard, Melissa; Tardif, Jean-Claude; Rioux, John D; Dube, Marie-Pierre; de Denus, Simon; Lu, Yingchang; Bottinger, Erwin P; Loos, Ruth J F; Smith, Albert Vernon; Harris, Tamara B; Launer, Lenore J; Gudnason, Vilmundur; Velez Edwards, Digna R; Torstenson, Eric S; Liu, Yongmei; Tracy, Russell P; Rotter, Jerome I; Rich, Stephen S; Highland, Heather M; Boerwinkle, Eric; Li, Jin; Lange, Ethan; Wilson, James G; Mihailov, Evelin; Mägi, Reedik; Hirschhorn, Joel; Metspalu, Andres; Esko, Tõnu; Vacchi-Suzzi, Caterina; Nalls, Mike A; Zonderman, Alan B; Evans, Michele K; Engström, Gunnar; Orho-Melander, Marju; Melander, Olle; O'Donoghue, Michelle L; Waterworth, Dawn M; Wallentin, Lars; White, Harvey D; Floyd, James S; Bartz, Traci M; Rice, Kenneth M; Psaty, Bruce M; Starr, J M; Liewald, David C M; Hayward, Caroline; Deary, Ian J; Greinacher, Andreas; Völker, Uwe; Thiele, Thomas; Völzke, Henry; van Rooij, Frank J A; Uitterlinden, André G; Franco, Oscar H; Dehghan, Abbas; Edwards, Todd L; Ganesh, Santhi K; Kathiresan, Sekar; Faraday, Nauder; Auer, Paul L; Reiner, Alex P; Lettre, Guillaume; Johnson, Andrew D

    2016-07-01

    Platelet production, maintenance, and clearance are tightly controlled processes indicative of platelets' important roles in hemostasis and thrombosis. Platelets are common targets for primary and secondary prevention of several conditions. They are monitored clinically by complete blood counts, specifically with measurements of platelet count (PLT) and mean platelet volume (MPV). Identifying genetic effects on PLT and MPV can provide mechanistic insights into platelet biology and their role in disease. Therefore, we formed the Blood Cell Consortium (BCX) to perform a large-scale meta-analysis of Exomechip association results for PLT and MPV in 157,293 and 57,617 individuals, respectively. Using the low-frequency/rare coding variant-enriched Exomechip genotyping array, we sought to identify genetic variants associated with PLT and MPV. In addition to confirming 47 known PLT and 20 known MPV associations, we identified 32 PLT and 18 MPV associations not previously observed in the literature across the allele frequency spectrum, including rare large effect (FCER1A), low-frequency (IQGAP2, MAP1A, LY75), and common (ZMIZ2, SMG6, PEAR1, ARFGAP3/PACSIN2) variants. Several variants associated with PLT/MPV (PEAR1, MRVI1, PTGES3) were also associated with platelet reactivity. In concurrent BCX analyses, there was overlap of platelet-associated variants with red (MAP1A, TMPRSS6, ZMIZ2) and white (PEAR1, ZMIZ2, LY75) blood cell traits, suggesting common regulatory pathways with shared genetic architecture among these hematopoietic lineages. Our large-scale Exomechip analyses identified previously undocumented associations with platelet traits and further indicate that several complex quantitative hematological, lipid, and cardiovascular traits share genetic factors.

  11. Analysis of filtering techniques and image quality in pixel duplicated images

    Science.gov (United States)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2009-08-01

    When images undergo filtering operations, valuable information can be lost along with the intended noise or frequencies because neighboring pixels are averaged. When the image is enlarged by duplicating pixels, such filtering effects can be reduced and more information retained, which can be critical when analyzing image content automatically. Analysis of retinal images can reveal many diseases at an early stage, as long as minor changes that depart from a normal retinal scan can be identified and enhanced. In this paper, typical filtering techniques are applied to an early-stage diabetic retinopathy image that has undergone digital pixel duplication. The same techniques are applied to the original images for comparison. The effects of filtering are then demonstrated for both pixel-duplicated and original images to show the information retention capability of pixel duplication. Image quality is computed based on published metrics. Our analysis shows that pixel duplication is effective in retaining information under smoothing operations such as mean filtering in the spatial domain, as well as lowpass and highpass filtering in the frequency domain, depending on the filter window size. Blocking effects due to image compression and pixel duplication become apparent in frequency analysis.
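The retention effect described above can be sketched numerically: a one-pixel-wide feature survives a 3×3 mean filter much better after 2× pixel duplication, because the duplicated feature spans more of each averaging window. The toy image and SciPy-based filtering below are assumptions for illustration, not the paper's retinal data or metrics.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Toy image with a one-pixel-wide bright feature (a horizontal line).
img = np.zeros((8, 8))
img[4, :] = 1.0

# 2x pixel duplication (each pixel becomes a 2x2 block).
dup = np.kron(img, np.ones((2, 2)))

# 3x3 mean (smoothing) filter applied to both versions.
sm_orig = uniform_filter(img, size=3)
sm_dup = uniform_filter(dup, size=3)

# The feature's peak intensity is better preserved after duplication.
print(sm_orig.max(), sm_dup.max())
```

In the original image the 3×3 mean dilutes the line to 1/3 of its intensity, while in the duplicated image two of the three averaged rows contain the feature, preserving 2/3 of it.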

  12. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    Science.gov (United States)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high-resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods is made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.
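A minimal sketch of one common sizing approach, thresholding the SEM image and converting the defect's pixel count to a physical size, is shown below. The frame, intensity threshold, nanometer-per-pixel calibration, and square-edge sizing convention are all hypothetical illustrations, not the paper's actual measurement method.

```python
import numpy as np

# Hypothetical SEM frame: dark background, one bright "defect" blob.
nm_per_pixel = 2.5            # assumed calibration (nm per SEM pixel)
frame = np.zeros((64, 64))
frame[30:34, 30:35] = 200.0   # a 4 x 5 pixel programmed defect

mask = frame > 100.0                      # simple intensity threshold
area_nm2 = mask.sum() * nm_per_pixel**2   # pixel count -> physical area
size_nm = np.sqrt(area_nm2)               # equivalent square-edge size

print(mask.sum(), round(size_nm, 2))  # -> 20 11.18
```

Different sizing methods (equivalent-square edge, maximum extent, area-weighted diameter) can give noticeably different numbers for the same blob, which is the kind of comparison the paper examines.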

  13. Co-expression Analysis Identifies CRC and AP1 as Regulators of Arabidopsis Fatty Acid Biosynthesis

    Institute of Scientific and Technical Information of China (English)

    Xinxin Han; Linlin Yin; Hongwei Xue

    2012-01-01

    Fatty acids (FAs) play crucial roles in signal transduction and plant development; however, the regulation of FA metabolism is still poorly understood. To study the relevant regulatory network, fifty-eight FA biosynthesis genes, including de novo synthases, desaturases and elongases, were selected as "guide genes" to construct the co-expression network. Calculating the correlation of all Arabidopsis thaliana (L.) genes with each guide gene using the Arabidopsis co-expression data mining tools (ACT) identified 797 candidate FA-correlated genes. Gene ontology (GO) analysis of these co-expressed genes showed that they are tightly correlated with photosynthesis and carbohydrate metabolism, and function in many processes. Interestingly, 63 transcription factors (TFs) were identified as candidate FA biosynthesis regulators, and 8 TF families are enriched. Two TF genes, CRC and AP1, both correlating with 8 FA guide genes, were further characterized. Analyses of the ap1 and crc mutants showed altered total FA composition of mature seeds. The contents of palmitoleic acid, stearic acid, arachidic acid and eicosadienoic acid are decreased, whereas that of oleic acid is increased in ap1 and crc seeds, which is consistent with qRT-PCR analysis revealing suppressed expression of the corresponding guide genes. In addition, yeast one-hybrid analysis and electrophoretic mobility shift assays (EMSA) revealed that CRC can bind to the promoter regions of KCS7 and KCS15, indicating that CRC may directly regulate FA biosynthesis.
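The guide-gene screening step can be sketched as a simple correlation scan: candidate genes are flagged when their expression correlates strongly with a guide gene across samples. The toy expression profiles and the 0.8 cutoff below are invented for illustration; ACT's actual ranking and significance testing differ in detail.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 40
guide = rng.normal(size=n_samples)  # expression of one FA guide gene

# Five candidate genes: the first two co-expressed with the guide
# (positively and negatively), the remaining three pure noise.
genes = np.vstack([
    guide * 0.9 + rng.normal(scale=0.3, size=n_samples),
    -guide + rng.normal(scale=0.3, size=n_samples),
    rng.normal(size=(3, n_samples)),
])

# Pearson correlation of each candidate with the guide gene.
r = np.array([np.corrcoef(guide, g)[0, 1] for g in genes])
correlated = np.where(np.abs(r) > 0.8)[0]  # simple cutoff screen
print(correlated)
```

Only the two truly co-expressed genes pass the cutoff; in the study, genes passing such screens for multiple guide genes (like CRC and AP1, correlated with 8 guides) become the strongest regulator candidates.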

  14. Identifying and Tracking Individual Updraft Cores using Cluster Analysis: A TWP-ICE case study

    Science.gov (United States)

    Li, X.; Tao, W.; Collis, S. M.; Varble, A.

    2013-12-01

    Cumulus parameterizations in GCMs depend strongly on the vertical velocity structures of convective updraft cores, or plumes. To date, there has been no accurate way of identifying these cores. The majority of previous studies treat the updraft as a single grid-column entity, thus missing many intrinsic characteristics, e.g., the size, strength and spatial orientation of an individual core, its life cycle, and the time variations of the entrainment/detrainment rates associated with its life cycle. In this study, we apply an innovative algorithm based on centroid-based k-means cluster analysis to improve our understanding of convection and its associated updraft cores. Both 3-D Doppler radar retrievals and cloud-resolving model simulations of a TWP-ICE campaign case during the monsoon period are used to test and improve this algorithm. This allows more in-depth comparisons between CRM simulations and observations than were possible previously using the traditional piecewise analysis of each updraft column. The first step is to identify the strongest cores (maximum velocity >10 m/s), since they are well defined and produce definite answers when the cluster analysis algorithm is applied. The preliminary results show that the radar-retrieved updraft cores are smaller in size, with the maximum velocity located uniformly at higher levels, compared with the model simulations. Overall, the model simulations produce much stronger cores than the radar retrievals. Within the model simulations, the bulk microphysical scheme produces stronger cores than the spectral bin microphysical scheme. Planned research includes using high temporal-resolution simulations to further track the life cycle of individual updraft cores and study their characteristics.
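The centroid-based k-means step can be sketched as follows, assuming toy 2-D positions of grid points where the vertical velocity exceeds the 10 m/s threshold. The deterministic farthest-point seeding and the synthetic "cores" are illustrative assumptions, not necessarily the authors' implementation.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Minimal centroid-based k-means: deterministic farthest-point
    seeding followed by Lloyd iterations."""
    centers = [points[0]]
    for _ in range(k - 1):
        # next seed = point farthest from all current seeds
        d = np.min([np.linalg.norm(points - c, axis=1) for c in centers], axis=0)
        centers.append(points[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        dist = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)          # assign to nearest centroid
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Two synthetic "updraft cores": (x, y) positions of grid points whose
# vertical velocity exceeds the 10 m/s threshold (toy data).
rng = np.random.default_rng(2)
core_a = rng.normal(loc=(10.0, 10.0), scale=0.5, size=(30, 2))
core_b = rng.normal(loc=(30.0, 25.0), scale=0.5, size=(30, 2))
points = np.vstack([core_a, core_b])

labels, centers = kmeans(points, k=2)
print(np.sort(centers[:, 0]))
```

Grouping threshold-exceeding points into clusters, rather than treating each updraft column separately, is what lets the method recover a core's size, spatial orientation, and (with tracking across time steps) its life cycle.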

  15. Utilizing Photo-Analysis Software for the Content Identifying Method (CIM)

    Directory of Open Access Journals (Sweden)

    Nejad Nasim Sahraei

    2015-01-01

    Full Text Available Content Identifying Methodology (CIM) was developed to measure public preferences in order to reveal the common characteristics of landscapes and aspects of underlying perceptions, including individuals' reactions to content and spatial configuration; it can therefore assist with the identification of factors that influence preference. Regarding the analysis of landscape photographs through CIM, several studies have utilized image analysis software, such as Adobe Photoshop, to identify the physical contents in the scenes. This study evaluates the public's preferences for aesthetic qualities of pedestrian bridges in urban areas through a photo-questionnaire survey, in which respondents evaluated images of pedestrian bridges in urban areas. Two groups of images were evaluated as the most and least preferred scenes, corresponding to the highest and lowest mean scores, respectively. These two groups were analyzed by CIM and also evaluated based on the respondents' descriptions of each group to reveal the pattern of preferences and the factors that may affect them. Digimizer software was employed to triangulate the two approaches and to determine the role of these factors in people's preferences. This study introduces useful software for image analysis that can measure the physical contents of scenes as well as their spatial organization. According to the findings, Digimizer could be a useful tool in CIM approaches to preference studies that utilize photographs in place of the actual landscape to determine the most important factors in public preferences for pedestrian bridges in urban areas.

  16. Identifying regions of strong scattering at the core-mantle boundary from analysis of PKKP precursor energy

    Science.gov (United States)

    Rost, S.; Earle, P.S.

    2010-01-01

    We detect seismic scattering from the core-mantle boundary related to the phase PKKP (PK•KP) in data from small-aperture seismic arrays in India and Canada. The detection of these scattered waves in data from small-aperture arrays is new and allows a better characterization of the fine-scale structure of the deep Earth, especially in the southern hemisphere. Their slowness vector is determined from array processing, allowing location of the heterogeneities at the core-mantle boundary using back-projection techniques through 1D Earth models. We identify strong scattering at the core-mantle boundary (CMB) beneath the Caribbean, Patagonia and the Antarctic Peninsula, as well as beneath southern Africa. An analysis of the scattering regions relative to sources and receivers indicates that these regions represent areas of increased scattering, likely due to increased heterogeneity close to the CMB. The 1 Hz array data used in this study are most sensitive to heterogeneity with scale lengths of about 10 km. Given the small size of the scatterers, a chemical origin of the heterogeneities is likely. By comparing the locations of the fine-scale heterogeneity to geodynamical models and tomographic images, we identify different scattering mechanisms in regions related to subduction (Caribbean and Patagonia) and dense thermochemical piles (southern Africa). © 2010 Elsevier B.V.

  17. Genome-wide analysis of over 106 000 individuals identifies 9 neuroticism-associated loci.

    Science.gov (United States)

    Smith, D J; Escott-Price, V; Davies, G; Bailey, M E S; Colodro-Conde, L; Ward, J; Vedernikov, A; Marioni, R; Cullen, B; Lyall, D; Hagenaars, S P; Liewald, D C M; Luciano, M; Gale, C R; Ritchie, S J; Hayward, C; Nicholl, B; Bulik-Sullivan, B; Adams, M; Couvy-Duchesne, B; Graham, N; Mackay, D; Evans, J; Smith, B H; Porteous, D J; Medland, S E; Martin, N G; Holmans, P; McIntosh, A M; Pell, J P; Deary, I J; O'Donovan, M C

    2016-06-01

    Neuroticism is a personality trait of fundamental importance for psychological well-being and public health. It is strongly associated with major depressive disorder (MDD) and several other psychiatric conditions. Although neuroticism is heritable, attempts to identify the alleles involved in previous studies have been limited by relatively small sample sizes. Here we report a combined meta-analysis of genome-wide association study (GWAS) of neuroticism that includes 91 370 participants from the UK Biobank cohort, 6659 participants from the Generation Scotland: Scottish Family Health Study (GS:SFHS) and 8687 participants from a QIMR (Queensland Institute of Medical Research) Berghofer Medical Research Institute (QIMR) cohort. All participants were assessed using the same neuroticism instrument, the Eysenck Personality Questionnaire-Revised (EPQ-R-S) Short Form's Neuroticism scale. We found a single-nucleotide polymorphism-based heritability estimate for neuroticism of ∼15% (s.e.=0.7%). Meta-analysis identified nine novel loci associated with neuroticism. The strongest evidence for association was at a locus on chromosome 8 (P=1.5 × 10−15) spanning 4 Mb and containing at least 36 genes. Other associated loci included interesting candidate genes on chromosome 1 (GRIK3 (glutamate receptor ionotropic kainate 3)), chromosome 4 (KLHL2 (Kelch-like protein 2)), chromosome 17 (CRHR1 (corticotropin-releasing hormone receptor 1) and MAPT (microtubule-associated protein Tau)) and on chromosome 18 (CELF4 (CUGBP elav-like family member 4)). We found no evidence for genetic differences in the common allelic architecture of neuroticism by sex. By comparing our findings with those of the Psychiatric Genetics Consortia, we identified a strong genetic correlation between neuroticism and MDD and a less strong but significant genetic correlation with schizophrenia, although not with bipolar disorder. Polygenic risk scores derived from the primary UK Biobank sample captured
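The core of a combined meta-analysis like the one described, pooling per-cohort GWAS results for a variant, can be sketched with fixed-effects inverse-variance weighting. The per-cohort effect sizes and standard errors below are invented for illustration, not the study's actual summary statistics.

```python
import numpy as np

# Hypothetical per-cohort effect sizes (beta) and standard errors
# for one SNP, e.g. from three cohorts of different sizes.
betas = np.array([0.10, 0.08, 0.14])
ses = np.array([0.03, 0.05, 0.06])

w = 1.0 / ses**2                       # inverse-variance weights
beta_meta = np.sum(w * betas) / np.sum(w)
se_meta = np.sqrt(1.0 / np.sum(w))
z = beta_meta / se_meta                # combined z-score for the variant

print(round(beta_meta, 4), round(se_meta, 4), round(z, 2))
```

Larger cohorts (smaller standard errors) dominate the combined estimate, which is why pooling 91 370 UK Biobank participants with the smaller GS:SFHS and QIMR cohorts yields the power to detect loci that each study alone would miss.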

  18. Analysis of chlorocarbon compounds identified in the SAM Investigation of the Mars Science Laboratory mission

    Science.gov (United States)

    Freissinet, Caroline; Mahaffy, P.; Glavin, D.; Buch, A.; Brunner, A.; Eigenbrode, J.; Martin, M.; Miller, K.; Steele, A.; Szopa, C.; SAM; MSL science Team

    2013-10-01

    The gas chromatograph mass spectrometer (GCMS) mode of the Sample Analysis at Mars (SAM) experiment was designed for the separation and identification of the chemical components of the gases released from a solid sample or trapped from the atmosphere. Gases from solid samples are either produced by heating a cell from ambient temperature to >800-1100 °C (EGA mode) or by wet chemistry extraction and reactions (not yet employed on Mars). Prior to EGA analysis of portions of the first 3 solid samples (Rocknest, John Klein and Cumberland) collected by MSL and delivered to SAM, an internal SAM blank run was carried out with an empty quartz cup. These blank analyses are required to understand the background signal intrinsic to the GCMS and its gas manifolds and traps. Several peaks have been identified as part of the SAM background, some of them below the nmol level, which attests to the sensitivity of the instrument and the as-designed performance of the GCMS. The origin of each peak has been investigated, and two major contributors are revealed: residual vapor from one of the chemicals used for the SAM wet chemistry experiment, N-methyl-N-tert-butyldimethylsilyl-trifluoroacetamide (MTBSTFA), and the Tenax from the hydrocarbon trap. Supporting lab experiments are in progress to understand the reaction pathways of the molecules identified in the SAM background. These experiments help elucidate which molecules may be interpreted as indigenous to Mars. Of the three solid samples analyzed in 11 runs, it was possible to detect and identify several chlorinated compounds, including several chlorohydrocarbons. The chlorine is likely derived from the decomposition of martian perchlorates or other indigenous Cl-containing species, while the origin of the carbon is presently under investigation for each detected molecule. To date, a subset of these molecules have been identified in lab studies and a terrestrial contribution to the observed products is more easily explained. 
The combined results from SAM and

  19. How can the polar dome be identified in meteorological analysis model data?

    Science.gov (United States)

    Kunkel, Daniel; Bozem, Heiko; Gutmann, Robert; Hoor, Peter

    2016-04-01

    The thermal stratification of the lower atmosphere at high latitudes isolates the polar regions from lower latitudes. A transport barrier is established in the region where isentropic surfaces slope upward from near the surface to higher altitudes. This barrier is also known as the polar dome. For adiabatic flow, the transport of air masses from midlatitudes into high latitudes occurs almost entirely along isentropic surfaces. Only diabatic processes related to clouds, radiation, or turbulence can foster transport across the barrier. Such processes can be identified by the material rate of change of potential temperature, which must occur in the vicinity of the polar dome. Thus, to identify regions of exchange, it is first crucial to know where the transport barrier is located. The question then arises as to which meteorological variables may be suited to identifying the location of this transport barrier. A second question is how the shape of the polar dome changes during different periods of the year. For this we use gridded analysis model data from the European Centre for Medium-Range Weather Forecasts (ECMWF) with high spatial resolution for several periods during 2014 and 2015. In particular, we focus on periods during spring and summer, when extensive in-situ measurement campaigns took place in the high Arctic. We define four metrics to identify the location, i.e., the latitude, of the transport barrier at various altitudes, e.g., the surface or a surface of constant pressure in the lower troposphere. These metrics are based on (1) a constant value of potential temperature that intersects a given altitude, (2) the strongest gradient of potential temperature on a given altitude level, and (3) the relative difference between equivalent potential temperature and potential temperature at the surface. The last metric is based on a Lagrangian analysis for which ten-day forward and backward trajectories are calculated, starting at each grid point between 45
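Metric (2) above, the latitude of the strongest meridional potential-temperature gradient on a level, can be sketched on an idealized zonal-mean profile. The tanh-shaped transition near 70°N is an assumed toy profile, not ECMWF analysis data.

```python
import numpy as np

# Idealized zonal-mean potential temperature at one level: warm
# midlatitudes, a cold polar dome, sharp transition near 70 deg N (toy).
lats = np.linspace(40.0, 90.0, 101)                   # latitude grid, deg N
theta = 300.0 - 15.0 * np.tanh((lats - 70.0) / 3.0)   # potential temp, K

# Metric (2): the latitude of the strongest meridional theta gradient
# marks the transport barrier (the polar dome edge) on this level.
dtheta_dlat = np.gradient(theta, lats)
barrier_lat = lats[np.abs(dtheta_dlat).argmax()]
print(barrier_lat)  # -> 70.0
```

Applying such a metric level by level traces how the dome's shape varies with altitude and season, which is the second question the abstract raises.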

  20. Genome-wide analysis of over 106 000 individuals identifies 9 neuroticism-associated loci

    Science.gov (United States)

    Smith, D J; Escott-Price, V; Davies, G; Bailey, M E S; Colodro-Conde, L; Ward, J; Vedernikov, A; Marioni, R; Cullen, B; Lyall, D; Hagenaars, S P; Liewald, D C M; Luciano, M; Gale, C R; Ritchie, S J; Hayward, C; Nicholl, B; Bulik-Sullivan, B; Adams, M; Couvy-Duchesne, B; Graham, N; Mackay, D; Evans, J; Smith, B H; Porteous, D J; Medland, S E; Martin, N G; Holmans, P; McIntosh, A M; Pell, J P; Deary, I J; O'Donovan, M C

    2016-01-01

    Neuroticism is a personality trait of fundamental importance for psychological well-being and public health. It is strongly associated with major depressive disorder (MDD) and several other psychiatric conditions. Although neuroticism is heritable, attempts to identify the alleles involved in previous studies have been limited by relatively small sample sizes. Here we report a combined meta-analysis of genome-wide association study (GWAS) of neuroticism that includes 91 370 participants from the UK Biobank cohort, 6659 participants from the Generation Scotland: Scottish Family Health Study (GS:SFHS) and 8687 participants from a QIMR (Queensland Institute of Medical Research) Berghofer Medical Research Institute (QIMR) cohort. All participants were assessed using the same neuroticism instrument, the Eysenck Personality Questionnaire-Revised (EPQ-R-S) Short Form's Neuroticism scale. We found a single-nucleotide polymorphism-based heritability estimate for neuroticism of ∼15% (s.e.=0.7%). Meta-analysis identified nine novel loci associated with neuroticism. The strongest evidence for association was at a locus on chromosome 8 (P=1.5 × 10−15) spanning 4 Mb and containing at least 36 genes. Other associated loci included interesting candidate genes on chromosome 1 (GRIK3 (glutamate receptor ionotropic kainate 3)), chromosome 4 (KLHL2 (Kelch-like protein 2)), chromosome 17 (CRHR1 (corticotropin-releasing hormone receptor 1) and MAPT (microtubule-associated protein Tau)) and on chromosome 18 (CELF4 (CUGBP elav-like family member 4)). We found no evidence for genetic differences in the common allelic architecture of neuroticism by sex. By comparing our findings with those of the Psychiatric Genetics Consortia, we identified a strong genetic correlation between neuroticism and MDD and a less strong but significant genetic correlation with schizophrenia, although not with bipolar disorder. Polygenic risk scores derived from the primary UK Biobank sample captured

  1. Supervised accelerometry analysis can identify prey capture by penguins at sea.

    Science.gov (United States)

    Carroll, Gemma; Slip, David; Jonsen, Ian; Harcourt, Rob

    2014-12-15

    Determining where, when and how much animals eat is fundamental to understanding their ecology. We developed a technique to identify a prey capture signature for little penguins from accelerometry, in order to quantify food intake remotely. We categorised behaviour of captive penguins from HD video and matched this to time-series data from back-mounted accelerometers. We then trained a support vector machine (SVM) to classify the penguins' behaviour at 0.3 s intervals as either 'prey handling' or 'swimming'. We applied this model to accelerometer data collected from foraging wild penguins to identify prey capture events. We compared prey capture and non-prey capture dives to test the model predictions against foraging theory. The SVM had an accuracy of 84.95±0.26% (mean ± s.e.) and a false positive rate of 9.82±0.24% when tested on unseen captive data. For wild data, we defined three independent, consecutive prey handling observations as representing true prey capture, with a false positive rate of 0.09%. Dives with prey captures had longer duration and bottom times, were deeper, had faster ascent rates, and had more 'wiggles' and 'dashes' (proxies for prey encounter used in other studies). The mean (±s.e.) number of prey captures per foraging trip was 446.6±66.28. By recording the behaviour of captive animals on HD video and using a supervised machine learning approach, we show that accelerometry signatures can classify the behaviour of wild animals at unprecedentedly fine scales.
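The supervised classification step can be sketched with an SVM on toy accelerometer-like features, assuming that prey-handling windows differ in mean and spread from swimming windows. The features, their separation, and the train/test split below are invented stand-ins; the study's real features came from back-mounted accelerometers labeled against HD video at 0.3 s resolution.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical per-window features (e.g. summary statistics of the
# three accelerometer axes over a 0.3 s window).
swim = rng.normal(0.0, 0.2, size=(200, 3))   # 'swimming' windows
prey = rng.normal(1.5, 0.4, size=(200, 3))   # 'prey handling' windows
X = np.vstack([swim, prey])
y = np.array([0] * 200 + [1] * 200)          # 0 = swimming, 1 = prey handling

clf = SVC(kernel="rbf").fit(X[::2], y[::2])       # train on alternate windows
acc = (clf.predict(X[1::2]) == y[1::2]).mean()    # evaluate on held-out windows
print(acc)
```

Requiring several consecutive positive classifications before declaring a true prey capture, as the study does with three consecutive prey-handling windows, is a simple way to suppress isolated false positives.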

  2. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon, after leaving an imperative impact on personal computers, is evolving in smartphones with the proliferation of mobile phone technologies. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact except that the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies. PMID:26978523
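The winning classifier named above, simple logistic regression over behavioral features, can be sketched on invented stand-in data. The feature meanings (e.g. counts of network calls, SMS activity, background services) and their class separation are assumptions for illustration, not SMARTbot's actual feature set or its 99.49% result.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Toy behavioral features per app, extracted (hypothetically) from
# dynamic analysis runs: 4 features, benign vs botnet populations.
benign = rng.normal(0.0, 1.0, size=(300, 4))
botnet = rng.normal(2.0, 1.0, size=(300, 4))
X = np.vstack([benign, botnet])
y = np.array([0] * 300 + [1] * 300)   # 0 = benign, 1 = botnet

clf = LogisticRegression(max_iter=500).fit(X[::2], y[::2])  # train split
acc = (clf.predict(X[1::2]) == y[1::2]).mean()              # held-out accuracy
print(acc)
```

Because dynamic (behavioral) features are observed at run time, such a classifier can still flag botnet binaries whose program code is obfuscated, which is the framework's key claim.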

  3. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    Full Text Available The botnet phenomenon, after leaving an imperative impact on personal computers, is evolving in smartphones with the proliferation of mobile phone technologies. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact except that the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies.

  4. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, having already left its mark on personal computers. A botnet is a network of computers, laptops, mobile devices, or tablets that is remotely controlled by cybercriminals to launch various distributed, coordinated attacks, including spam email, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware, and more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the targets are specifically smartphone users. It is therefore important to uncover this security issue before it becomes widespread. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries in a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework that generates a mobile botnet learning model by applying the back-propagation method of artificial neural networks. Moreover, the framework can detect mobile botnet binaries with remarkable accuracy even when the program code is obfuscated. The results show that a classifier based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, achieving 99.49% accuracy. Furthermore, manual inspection of the botnet dataset revealed interesting trends in these applications. As an outcome of this research, a mobile botnet dataset has been compiled that can serve as a benchmark for future studies.
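
    The record above reports that a simple logistic-regression classifier gave the best botnet-app detection accuracy. As a minimal sketch of that classification step, the NumPy snippet below trains logistic regression by gradient descent on synthetic two-feature data; the feature semantics (background SMS sends, suspicious connections) are illustrative assumptions, not the authors' actual feature set.

```python
import numpy as np

def sigmoid(z):
    # Clip logits for numerical stability before exponentiating.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

rng = np.random.default_rng(0)
n = 200
# Synthetic training data: benign apps (label 0) vs botnet apps (label 1),
# with two invented behavioral features per app.
benign = rng.normal(loc=[2.0, 1.0], scale=0.5, size=(n, 2))
botnet = rng.normal(loc=[6.0, 5.0], scale=0.5, size=(n, 2))
X = np.vstack([benign, botnet])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Logistic regression fitted by batch gradient descent on the log-loss.
Xb = np.hstack([np.ones((2 * n, 1)), X])   # prepend a bias column
w = np.zeros(3)
for _ in range(2000):
    p = sigmoid(Xb @ w)
    w -= 0.05 * Xb.T @ (p - y) / len(y)    # gradient of the mean log-loss

pred = (sigmoid(Xb @ w) >= 0.5).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.3f}")
```

    In practice the features would be extracted from the dynamic-analysis logs and the model evaluated on held-out apps rather than on the training set.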

  5. Evolutionary analysis of vision genes identifies potential drivers of visual differences between giraffe and okapi

    Science.gov (United States)

    Agaba, Morris; Cavener, Douglas R.

    2017-01-01

    Background The capacity of visually oriented species to perceive and respond to visual signals is integral to their evolutionary success. Giraffes are closely related to okapi, but the two species show a broad range of phenotypic differences, including in their visual capacities. Vision studies rank the giraffe's visual acuity higher than that of all other artiodactyls, despite it sharing similar ecological determinants of vision with many of them. The extent to which the giraffe's unique visual capacity, and its difference from okapi, is reflected in changes to their vision genes is not understood. Methods The recent availability of the giraffe and okapi genomes provided an opportunity to identify their vision genes. Multiple strategies were employed to identify thirty-six candidate mammalian vision genes in the giraffe and okapi genomes. Selection pressure was quantified by a combination of branch-site tests of positive selection and clade models of selection divergence, comparing giraffe and okapi vision genes with orthologous sequences from other mammals. Results Signatures of selection were identified in key genes that could potentially underlie giraffe and okapi visual adaptations. Importantly, some genes that contribute to the optical transparency of the eye and some that are critical in the light signaling pathway show signatures of adaptive evolution or selection divergence. Comparison between giraffe and other ruminants identifies significant selection divergence in CRYAA and OPN1LW. Significant selection divergence was identified in SAG, while positive selection was detected in LUM, when okapi is compared with ruminants and other mammals. Sequence analysis of OPN1LW showed that at least one of the sites known to affect the spectral sensitivity of the red pigment is uniquely divergent between giraffe and other ruminants. Discussion By taking a systemic approach to gene function in vision, the results provide the first molecular clues associated with

  6. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    Science.gov (United States)

    Madavarapu, Sateesh Babu

    Studies of aluminosilicate materials as replacements for traditional construction materials such as ordinary Portland cement (OPC), in order to reduce the associated environmental effects, have been an important research area for the past decades. Many properties, such as strength, have already been studied, and the primary focus now is to understand the reaction mechanism and the effect of process parameters on the formed products. The aim of this research was to explore the structural changes and reaction products of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at a molecular level, but not all methods are economical and simple. To understand the mechanisms of alkali-activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used to analyze the effect of the parameters on the reaction products. For complex systems like geopolymers, deconvolution techniques help to recover the properties of a particular peak attributed to a certain molecular vibration. Time- and temperature-dependent analyses were performed on slag pastes to understand the polymerization of reactive silica in the system as time and temperature vary. For the time-dependent analysis, slag was activated with sodium and potassium silicates using two different `n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature-dependent analysis was done by curing the samples at 60°C and 80°C. Similarly, fly ash was studied by activation with alkali hydroxides and alkali silicates. Under the same curing conditions, the fly ash samples were evaluated to analyze the effects of the added silicates on alkali activation. The peak shifts in the FTIR spectra reflect changes in the structure of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentrations of silicate monomer in the
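
    Deconvolution of an overlapping FTIR band, as used above, can be sketched as a least-squares fit of a sum of Gaussian components. The band centres (950 and 1050 cm^-1) and widths below are invented for illustration and are not values from this work.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, cen, wid):
    return amp * np.exp(-((x - cen) ** 2) / (2 * wid ** 2))

def two_gaussians(x, a1, c1, w1, a2, c2, w2):
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

# Synthetic "measured" band: two overlapping components plus noise.
wavenumber = np.linspace(800, 1200, 400)
rng = np.random.default_rng(1)
spectrum = (gaussian(wavenumber, 1.0, 950, 40)
            + gaussian(wavenumber, 0.6, 1050, 30)
            + rng.normal(0, 0.01, wavenumber.size))

# Deconvolution: least-squares fit of the two-component model,
# starting from rough initial guesses for (amp, centre, width).
p0 = [0.8, 930, 50, 0.5, 1070, 25]
popt, _ = curve_fit(two_gaussians, wavenumber, spectrum, p0=p0)
print(f"fitted centres: {popt[1]:.1f}, {popt[4]:.1f}")
```

    The fitted centres, widths, and areas of each component are then tracked across curing time and temperature to follow the structural evolution.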

  7. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described, and the EPS analysis tools are surveyed.

  8. Evaluation of soil gas sampling and analysis techniques at a former petrochemical plant site.

    Science.gov (United States)

    Hers, I; Li, L; Hannam, S

    2004-07-01

    Methods for soil gas sampling and analysis are evaluated as part of a research study on soil vapour intrusion into buildings, conducted at a former petro-chemical plant site ("Chatterton site"). The evaluation process was designed to provide information on reliability and selection of appropriate methods for soil gas sampling and analysis, and was based on a literature review of data and methods, and experiments completed as part of the research study. The broader context of this work is that soil gas characterization is increasingly being used for input into risk assessment of contaminated sites, particularly when evaluating the potential intrusion of soil vapour into buildings. There are only a limited number of research studies and protocols addressing soil gas sampling and analysis. There is significant variability in soil gas probe design and sample collection and analysis methods used by practitioners. The experimental studies conducted to evaluate soil gas methods address the permeation or leakage of gases from Tedlar bags, time-dependent sorption of volatile organic compound (VOC)-vapours onto probe surfaces and sampling devices, and analytical and quality control issues for light gas and VOC analyses. Through this work, common techniques for soil gas collection and analysis are described together with implications for data quality arising from the different methods used. Some of the potential pitfalls that can affect soil gas testing are identified, and recommendations and guidance for improved protocols are provided.

  9. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
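
    The diffusion step described above can be sketched with the normal information-diffusion estimator: each crisp observation is spread over a set of monitoring points with a Gaussian kernel, turning a small sample into a smooth frequency distribution. The observations (annual fire counts) and the smoothing coefficient h below are invented for illustration; Huang gives sample-size-dependent formulas for h.

```python
import numpy as np

observations = np.array([3.0, 5.0, 6.0, 9.0])   # small sample, e.g. fires per year
u = np.linspace(0.0, 12.0, 25)                  # monitoring points
h = 1.5                                         # diffusion (smoothing) coefficient

# Diffuse each observation over the monitoring points; after row
# normalization each observation carries exactly one unit of "information".
kernel = np.exp(-((u[None, :] - observations[:, None]) ** 2) / (2.0 * h ** 2))
kernel /= kernel.sum(axis=1, keepdims=True)
q = kernel.sum(axis=0)                          # information gathered at each point
p = q / q.sum()                                 # normalized frequency estimate

risk = p[u >= 8.0].sum()                        # e.g. estimated P(count >= 8)
print(f"estimated probability of >= 8 fires: {risk:.3f}")
```

    Compared with a raw histogram of four points, the diffused estimate assigns non-zero probability between and beyond the observed values, which is the point of the technique for small samples.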

  10. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies of consumer purchasing behavior have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behavior; however, data mining methods have disadvantages as well as advantages, so it is important to select techniques appropriate to the database being mined. The objective of this paper is to understand consumer behavior and the consumer's psychological state at the time of purchase, and to determine how a suitable data mining method can improve on conventional approaches. Moreover, in an experiment, association rule mining is employed to extract rules for trusted customers from sales data in the supermarket industry.
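
    The association-rule step mentioned above can be illustrated in a few lines: mine pairwise rules from supermarket transactions and keep those meeting minimum support and confidence. The transactions and thresholds below are toy values for illustration, not the paper's data.

```python
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "eggs"},
    {"bread", "milk", "butter"},
]

def support(itemset):
    # Fraction of transactions containing every item in the itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

min_support, min_confidence = 0.4, 0.7
items = sorted({i for t in transactions for i in t})
rules = []
for a, b in combinations(items, 2):
    for lhs, rhs in (({a}, {b}), ({b}, {a})):
        s = support(lhs | rhs)
        if s >= min_support and s / support(lhs) >= min_confidence:
            rules.append((tuple(lhs), tuple(rhs), s, s / support(lhs)))

for lhs, rhs, s, c in rules:
    print(f"{lhs} -> {rhs}  support={s:.2f} confidence={c:.2f}")
```

    Real implementations (e.g. Apriori) prune the search over larger itemsets; for pairwise rules the exhaustive scan above is sufficient.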

  11. Integrative Genomic Analysis of Cholangiocarcinoma Identifies Distinct IDH-Mutant Molecular Profiles

    Directory of Open Access Journals (Sweden)

    Farshad Farshidfar

    2017-03-01

    Full Text Available Cholangiocarcinoma (CCA) is an aggressive malignancy of the bile ducts, with poor prognosis and limited treatment options. Here, we describe the integrated analysis of somatic mutations, RNA expression, copy number, and DNA methylation by The Cancer Genome Atlas of a set of predominantly intrahepatic CCA cases and propose a molecular classification scheme. We identified an IDH mutant-enriched subtype with distinct molecular features including low expression of chromatin modifiers, elevated expression of mitochondrial genes, and increased mitochondrial DNA copy number. Leveraging the multi-platform data, we observed that ARID1A exhibited DNA hypermethylation and decreased expression in the IDH mutant subtype. More broadly, we found that IDH mutations are associated with an expanded histological spectrum of liver tumors with molecular features that stratify with CCA. Our studies reveal insights into the molecular pathogenesis and heterogeneity of cholangiocarcinoma and provide classification information of potential therapeutic significance.

  12. Genomic analysis of 38 Legionella species identifies large and diverse effector repertoires.

    Science.gov (United States)

    Burstein, David; Amaro, Francisco; Zusman, Tal; Lifshitz, Ziv; Cohen, Ofir; Gilbert, Jack A; Pupko, Tal; Shuman, Howard A; Segal, Gil

    2016-02-01

    Infection by the human pathogen Legionella pneumophila relies on the translocation of ∼ 300 virulence proteins, termed effectors, which manipulate host cell processes. However, almost no information exists regarding effectors in other Legionella pathogens. Here we sequenced, assembled and characterized the genomes of 38 Legionella species and predicted their effector repertoires using a previously validated machine learning approach. This analysis identified 5,885 predicted effectors. The effector repertoires of different Legionella species were found to be largely non-overlapping, and only seven core effectors were shared by all species studied. Species-specific effectors had atypically low GC content, suggesting exogenous acquisition, possibly from the natural protozoan hosts of these species. Furthermore, we detected numerous new conserved effector domains and discovered new domain combinations, which allowed the inference of as yet undescribed effector functions. The effector collection and network of domain architectures described here can serve as a roadmap for future studies of effector function and evolution.
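
    The GC-content screen mentioned above (species-specific effectors with atypically low GC suggesting exogenous acquisition) reduces to a simple calculation. The sequences and the genome-wide average below are invented toy examples, not real Legionella data.

```python
def gc_content(seq):
    """Fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

genome_gc = 0.38                                      # assumed genome-wide average
genes = {
    "effector_A": "ATGAATTTAAATACTATTAAAGAAAATTTA",   # AT-rich: candidate acquisition
    "gene_B": "ATGGCCGGTCTGGCCGATCGCGGCAAGCTG",       # GC-rich: not flagged
}
for name, seq in genes.items():
    gc = gc_content(seq)
    flag = "atypically low GC" if gc < genome_gc - 0.10 else "not flagged"
    print(f"{name}: GC={gc:.2f} ({flag})")
```
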

  13. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

    The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration exercise, thereby bypassing the challenging task of model structure determination and identification. Parameter identification problems can thus lead to ill-calibrated models with low predictive power and large model uncertainty. Every calibration exercise should therefore be preceded by a proper model ... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring ...

  14. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2013-05-17

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  15. Proteomic Analysis of Pichindé virus Infection Identifies Differential Expression of Prothymosin-α

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2010-01-01

    Full Text Available The arenaviruses include a number of important pathogens including Lassa virus and Junin virus. Presently, the only treatment is supportive care and the antiviral Ribavirin. In the event of an epidemic, patient triage may be required to more effectively manage resources; the development of prognostic biomarker signatures, correlating with disease severity, would allow rational triage. Using a pair of arenaviruses, which cause mild or severe disease, we analyzed extracts from infected cells using SELDI mass spectrometry to characterize potential biomarker profiles. EDGE analysis was used to analyze longitudinal expression differences. Extracts from infected guinea pigs revealed protein peaks which could discriminate between mild or severe infection and between times post-infection. Tandem mass-spectrometry identified several peaks, including the transcriptional regulator prothymosin-α. Further investigation revealed differences in secretion of this peptide. These data show proof of concept that proteomic profiling of host markers could be used as prognostic markers of infectious disease.

  16. Reconstructability analysis as a tool for identifying gene-gene interactions in studies of human diseases.

    Science.gov (United States)

    Shervais, Stephen; Kramer, Patricia L; Westaway, Shawn K; Cox, Nancy J; Zwick, Martin

    2010-01-01

    There are a number of common human diseases for which the genetic component may include an epistatic interaction of multiple genes. Detecting these interactions with standard statistical tools is difficult because there may be an interaction effect, but minimal or no main effect. Reconstructability analysis (RA) uses Shannon's information theory to detect relationships between variables in categorical datasets. We applied RA to simulated data for five different models of gene-gene interaction, and found that even with heritability levels as low as 0.008, and with the inclusion of 50 non-associated genes in the dataset, we can identify the interacting gene pairs with an accuracy of ≥80%. We applied RA to a real dataset of type 2 non-insulin-dependent diabetes (NIDDM) cases and controls, and closely approximated the results of more conventional single SNP disease association studies. In addition, we replicated prior evidence for epistatic interactions between SNPs on chromosomes 2 and 15.
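
    The core information-theoretic idea behind the analysis above can be sketched as follows: a pair of variables can carry substantial joint information about an outcome even when neither carries any alone. This is not the RA software itself; it is a minimal mutual-information demonstration on a simulated XOR-like epistatic model (pure interaction, no main effect) with one non-associated gene, all invented for illustration.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)
n = 4000
g1 = rng.integers(0, 2, n)
g2 = rng.integers(0, 2, n)
g3 = rng.integers(0, 2, n)                      # non-associated gene
disease = (g1 ^ g2) | (rng.random(n) < 0.05)    # pure interaction + noise

def entropy(*cols):
    # Empirical Shannon entropy of the joint distribution of the columns.
    counts = Counter(zip(*cols))
    probs = np.array(list(counts.values())) / len(cols[0])
    return -(probs * np.log2(probs)).sum()

def mutual_info(x, y):
    return entropy(x) + entropy(y) - entropy(x, y)

pair = list(zip(g1, g2))                        # treat the pair as one variable
print(f"MI(g1; disease)      = {mutual_info(g1, disease):.3f}")
print(f"MI(g2; disease)      = {mutual_info(g2, disease):.3f}")
print(f"MI(g3; disease)      = {mutual_info(g3, disease):.3f}")
print(f"MI((g1,g2); disease) = {mutual_info(pair, disease):.3f}")
```

    The single-gene scores are near zero while the pairwise score is large, which is exactly the pattern that makes such interactions invisible to single-marker tests.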

  17. Factor analysis of 27Al MAS NMR spectra for identifying nanocrystalline phases in amorphous geopolymers.

    Science.gov (United States)

    Urbanova, Martina; Kobera, Libor; Brus, Jiri

    2013-11-01

    Nanostructured materials offer enhanced physicochemical properties because of the large interfacial area. Typically, geopolymers with specifically synthesized nanosized zeolites are a promising material for the sorption of pollutants. The structural characterization of these aluminosilicates, however, continues to be a challenge. To circumvent complications resulting from the amorphous character of the aluminosilicate matrix and from the low concentrations of nanosized crystallites, we have proposed a procedure based on factor analysis of (27)Al MAS NMR spectra. The capability of the proposed method was tested on geopolymers that exhibited various tendencies to crystallize: (i) completely amorphous systems, (ii) X-ray amorphous systems with nanocrystalline phases, and (iii) highly crystalline systems. Although the recorded (27)Al MAS NMR spectra did not show visible differences between the amorphous systems (i) and the geopolymers with the nanocrystalline phase (ii), the applied factor analysis unambiguously distinguished these materials. The samples were separated into the well-defined clusters, and the systems with the evolving crystalline phase were identified even before any crystalline fraction was detected by X-ray powder diffraction. Reliability of the proposed procedure was verified by comparing it with (29)Si MAS NMR spectra. Factor analysis of (27)Al MAS NMR spectra thus has the ability to reveal spectroscopic features corresponding to the nanocrystalline phases. Because the measurement time of (27)Al MAS NMR spectra is significantly shorter than that of (29)Si MAS NMR data, the proposed procedure is particularly suitable for the analysis of large sets of specifically synthesized geopolymers in which the formation of the limited fractions of nanocrystalline phases is desired.
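
    The clustering idea above can be illustrated with PCA, a decomposition closely related to the factor analysis the authors apply: spectra whose differences are invisible to the eye can still separate cleanly in component-score space. The synthetic "spectra" below (a shared broad band plus a weak extra band in one group) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 100, 200)
base = np.exp(-((x - 55) ** 2) / 200)          # common broad amorphous band
weak = 0.08 * np.exp(-((x - 62) ** 2) / 8)     # subtle extra band in one group

# Two groups of ten noisy spectra each; only the second has the weak band.
amorphous = np.array([base + rng.normal(0, 0.01, x.size) for _ in range(10)])
nanocryst = np.array([base + weak + rng.normal(0, 0.01, x.size) for _ in range(10)])
spectra = np.vstack([amorphous, nanocryst])

# PCA via SVD of the mean-centred data matrix; PC1 scores separate the groups.
centred = spectra - spectra.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[0]

print("group 1 PC1 scores:", np.round(scores[:10], 3))
print("group 2 PC1 scores:", np.round(scores[10:], 3))
```

    The two sets of scores form disjoint clusters even though the raw spectra look identical, which mirrors how the factor analysis flagged the evolving crystalline phase before X-ray diffraction could.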

  18. Bridging the gap between sample collection and laboratory analysis: using dried blood spots to identify human exposure to chemical agents

    Science.gov (United States)

    Hamelin, Elizabeth I.; Blake, Thomas A.; Perez, Jonas W.; Crow, Brian S.; Shaner, Rebecca L.; Coleman, Rebecca M.; Johnson, Rudolph C.

    2016-05-01

    Public health response to large scale chemical emergencies presents logistical challenges for sample collection, transport, and analysis. Diagnostic methods used to identify and determine exposure to chemical warfare agents, toxins, and poisons traditionally involve blood collection by phlebotomists, cold transport of biomedical samples, and costly sample preparation techniques. Use of dried blood spots, which consist of dried blood on an FDA-approved substrate, can increase analyte stability, decrease infection hazard for those handling samples, greatly reduce the cost of shipping/storing samples by removing the need for refrigeration and cold chain transportation, and be self-prepared by potentially exposed individuals using a simple finger prick and blood spot compatible paper. Our laboratory has developed clinical assays to detect human exposures to nerve agents through the analysis of specific protein adducts and metabolites, for which a simple extraction from a dried blood spot is sufficient for removing matrix interferents and attaining sensitivities on par with traditional sampling methods. The use of dried blood spots can bridge the gap between the laboratory and the field allowing for large scale sample collection with minimal impact on hospital resources while maintaining sensitivity, specificity, traceability, and quality requirements for both clinical and forensic applications.

  19. Using Principal Component Analysis to Identify Priority Neighbourhoods for Health Services Delivery by Ranking Socioeconomic Status

    Science.gov (United States)

    Friesen, Christine Elizabeth; Seliske, Patrick; Papadopoulos, Andrew

    2016-01-01

    Objectives. Socioeconomic status (SES) is a comprehensive indicator of health status and is useful in area-level health research and informing public health resource allocation. Principal component analysis (PCA) is a useful tool for developing SES indices to identify area-level disparities in SES within communities. While SES research in Canada has relied on census data, the voluntary nature of the 2011 National Household Survey challenges the validity of its data, especially income variables. This study sought to determine the appropriateness of replacing census income information with tax filer data in neighbourhood SES index development. Methods. Census and taxfiler data for Guelph, Ontario were retrieved for the years 2005, 2006, and 2011. Data were extracted for eleven income and non-income SES variables. PCA was employed to identify significant principal components from each dataset and weights of each contributing variable. Variable-specific factor scores were applied to standardized census and taxfiler data values to produce SES scores. Results. The substitution of taxfiler income variables for census income variables yielded SES score distributions and neighbourhood SES classifications that were similar to SES scores calculated using entirely census variables. Combining taxfiler income variables with census non-income variables also produced clearer SES level distinctions. Internal validation procedures indicated that utilizing multiple principal components produced clearer SES level distinctions than using only the first principal component. Conclusion. Identifying socioeconomic disparities between neighbourhoods is an important step in assessing the level of disadvantage of communities. The ability to replace census income information with taxfiler data to develop SES indices expands the versatility of public health research and planning in Canada, as more data sources can be explored. The apparent usefulness of PCA also contributes to the improvement
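
    The index construction described above (standardize variables, run PCA, weight by the dominant component's loadings) can be sketched as follows. The three variables, their values, and the number of neighbourhoods are invented for illustration; the study used eleven variables.

```python
import numpy as np

rng = np.random.default_rng(4)
n_hoods = 30
# Synthetic neighbourhood-level variables, correlated as SES variables tend to be.
income = rng.normal(60, 15, n_hoods)                                 # median income ($1000s)
education = 0.8 * (income - 60) / 15 + rng.normal(0, 0.6, n_hoods)   # rises with income
unemployment = -0.7 * (income - 60) / 15 + rng.normal(0, 0.7, n_hoods)  # falls with income

X = np.column_stack([income, education, unemployment])
Z = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each variable

# First principal component of the correlation matrix; its loadings are
# the variable weights for the SES score.
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
pc1 = eigvecs[:, np.argmax(eigvals)]
ses_score = Z @ pc1                             # one SES score per neighbourhood

# The sign of a principal component is arbitrary, so the ranking may need
# flipping so that higher scores mean higher SES.
ranking = np.argsort(ses_score)
print("neighbourhoods ranked by SES score:", ranking)
```

    The study's refinement of using several components rather than PC1 alone would replace the single projection with a weighted combination of component scores.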

  20. Contextual Hub Analysis Tool (CHAT): A Cytoscape app for identifying contextually relevant hubs in biological networks

    Science.gov (United States)

    Wiencko, Heather L.; Bernal-Llinares, Manuel; Bryan, Kenneth; Lynn, David J.

    2016-01-01

    Highly connected nodes (hubs) in biological networks are topologically important to the structure of the network and have also been shown to be preferentially associated with a range of phenotypes of interest. The relative importance of a hub node, however, can change depending on the biological context. Here, we report a Cytoscape app, the Contextual Hub Analysis Tool (CHAT), which enables users to easily construct and visualize a network of interactions from a gene or protein list of interest, integrate contextual information, such as gene expression or mass spectrometry data, and identify hub nodes that are more highly connected to contextual nodes (e.g. genes or proteins that are differentially expressed) than expected by chance. In a case study, we use CHAT to construct a network of genes that are differentially expressed in Dengue fever, a viral infection. CHAT was used to identify and compare contextual and degree-based hubs in this network. The top 20 degree-based hubs were enriched in pathways related to the cell cycle and cancer, which is likely due to the fact that proteins involved in these processes tend to be highly connected in general. In comparison, the top 20 contextual hubs were enriched in pathways commonly observed in a viral infection including pathways related to the immune response to viral infection. This analysis shows that such contextual hubs are considerably more biologically relevant than degree-based hubs and that analyses which rely on the identification of hubs solely based on their connectivity may be biased towards nodes that are highly connected in general rather than in the specific context of interest. Availability: CHAT is available for Cytoscape 3.0+ and can be installed via the Cytoscape App Store (http://apps.cytoscape.org/apps/chat). PMID:27853512
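
    A common way to formalize "more highly connected to contextual nodes than expected by chance" is a hypergeometric tail test on a hub's neighbourhood; CHAT's exact statistic may differ, so the sketch below is an illustration of the general idea with invented numbers.

```python
from scipy.stats import hypergeom

N = 500   # nodes in the network
n = 60    # contextual nodes (e.g. differentially expressed genes)
k = 25    # degree of the hub under test
x = 15    # hub neighbours that are contextual

# Probability of drawing at least x contextual nodes when k neighbours
# are chosen at random from a network with n contextual nodes out of N.
p_value = hypergeom.sf(x - 1, N, n, k)
print(f"P(>= {x} contextual neighbours by chance) = {p_value:.2e}")
```

    Here the expected count is k*n/N = 3 contextual neighbours, so observing 15 yields a vanishingly small p-value and the node would rank as a contextual hub.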

  1. Transcriptome Analysis Identifies Key Candidate Genes Mediating Purple Ovary Coloration in Asiatic Hybrid Lilies

    Science.gov (United States)

    Xu, Leifeng; Yang, Panpan; Yuan, Suxia; Feng, Yayan; Xu, Hua; Cao, Yuwei; Ming, Jun

    2016-01-01

    Lily tepals have a short lifespan. Once the tepals senesce, the ornamental value of the flower is lost. Some cultivars have attractive purple ovaries and fruits which greatly enhance the ornamental value of Asiatic hybrid lilies. However, little is known about the molecular mechanisms of anthocyanin biosynthesis in Asiatic hybrid lily ovaries. To investigate the transcriptional network that governs purple ovary coloration in Asiatic hybrid lilies, we obtained transcriptome data from green ovaries (S1) and purple ovaries (S2) of Asiatic “Tiny Padhye”. Comparative transcriptome analysis revealed 4228 differentially expressed genes. Differential expression analysis revealed that ten unigenes including four CHS genes, one CHI gene, one F3H gene, one F3′H gene, one DFR gene, one UFGT gene, and one 3RT gene were significantly up-regulated in purple ovaries. One MYB gene, LhMYB12-Lat, was identified as a key transcription factor determining the distribution of anthocyanins in Asiatic hybrid lily ovaries. Further qPCR results showed unigenes related to anthocyanin biosynthesis were highly expressed in purple ovaries of three purple-ovaried Asiatic hybrid lilies at stages 2 and 3, while they showed an extremely low level of expression in ovaries of three green-ovaried Asiatic hybrid lilies during all developmental stages. In addition, shading treatment significantly decreased pigment accumulation by suppressing the expression of several unigenes related to anthocyanin biosynthesis in ovaries of Asiatic “Tiny Padhye”. Lastly, a total of 15,048 Simple Sequence Repeats (SSRs) were identified in 13,710 sequences, and primer pairs for SSRs were designed. The results could further our understanding of the molecular mechanisms of anthocyanin biosynthesis in Asiatic hybrid lily ovaries. PMID:27879624

  2. Contextual Hub Analysis Tool (CHAT): A Cytoscape app for identifying contextually relevant hubs in biological networks.

    Science.gov (United States)

    Muetze, Tanja; Goenawan, Ivan H; Wiencko, Heather L; Bernal-Llinares, Manuel; Bryan, Kenneth; Lynn, David J

    2016-01-01

    Highly connected nodes (hubs) in biological networks are topologically important to the structure of the network and have also been shown to be preferentially associated with a range of phenotypes of interest. The relative importance of a hub node, however, can change depending on the biological context. Here, we report a Cytoscape app, the Contextual Hub Analysis Tool (CHAT), which enables users to easily construct and visualize a network of interactions from a gene or protein list of interest, integrate contextual information, such as gene expression or mass spectrometry data, and identify hub nodes that are more highly connected to contextual nodes (e.g. genes or proteins that are differentially expressed) than expected by chance. In a case study, we use CHAT to construct a network of genes that are differentially expressed in Dengue fever, a viral infection. CHAT was used to identify and compare contextual and degree-based hubs in this network. The top 20 degree-based hubs were enriched in pathways related to the cell cycle and cancer, which is likely due to the fact that proteins involved in these processes tend to be highly connected in general. In comparison, the top 20 contextual hubs were enriched in pathways commonly observed in a viral infection including pathways related to the immune response to viral infection. This analysis shows that such contextual hubs are considerably more biologically relevant than degree-based hubs and that analyses which rely on the identification of hubs solely based on their connectivity may be biased towards nodes that are highly connected in general rather than in the specific context of interest.

  3. Integrative omics analysis of rheumatoid arthritis identifies non-obvious therapeutic targets.

    Directory of Open Access Journals (Sweden)

    John W Whitaker

    Full Text Available Identifying novel therapeutic targets for the treatment of disease is challenging. To this end, we developed a genome-wide approach of candidate gene prioritization. We independently collated sets of genes that were implicated in rheumatoid arthritis (RA) pathogenicity through three genome-wide assays: (i) genome-wide association studies (GWAS), (ii) differential expression in RA fibroblast-like synoviocytes (FLS), and (iii) differential methylation in RA FLS. Integrated analysis of these complementary data sets identified a significant enrichment of multi-evidence genes (MEGs) within pathways relating to RA pathogenicity. One MEG is Engulfment and Cell Motility Protein-1 (ELMO1), a gene not previously considered as a therapeutic target in RA FLS. We demonstrated in RA FLS that ELMO1 is: (i) expressed, (ii) promotes cell migration and invasion, and (iii) regulates Rac1 activity. Thus, we created links between ELMO1 and RA pathogenicity, which in turn validates ELMO1 as a potential RA therapeutic target. This study illustrates the power of MEG-based approaches for therapeutic target identification.

  4. Copy number analysis identifies novel interactions between genomic loci in ovarian cancer.

    Directory of Open Access Journals (Sweden)

    Kylie L Gorringe

    Full Text Available Ovarian cancer is a heterogeneous disease displaying complex genomic alterations, and consequently, it has been difficult to determine the most relevant copy number alterations with the scale of studies to date. We obtained genome-wide copy number alteration (CNA) data from four different SNP array platforms, with a final data set of 398 ovarian tumours, mostly of the serous histological subtype. Frequent CNA aberrations targeted many thousands of genes. However, high-level amplicons and homozygous deletions enabled filtering of this list to the most relevant. The large data set enabled refinement of minimal regions and identification of rare amplicons such as at 1p34 and 20q11. We performed a novel co-occurrence analysis to assess cooperation and exclusivity of CNAs and analysed their relationship to patient outcome. Positive associations were identified between gains on 19 and 20q, gain of 20q and loss of X, and between several regions of loss, particularly 17q. We found weak correlations of CNA at genomic loci such as 19q12 with clinical outcome. We also assessed genomic instability measures and found a correlation of the number of higher amplitude gains with poorer overall survival. By assembling the largest collection of ovarian copy number data to date, we have been able to identify the most frequent aberrations and their interactions.

  5. Copy number analysis identifies novel interactions between genomic loci in ovarian cancer.

    Science.gov (United States)

    Gorringe, Kylie L; George, Joshy; Anglesio, Michael S; Ramakrishna, Manasa; Etemadmoghadam, Dariush; Cowin, Prue; Sridhar, Anita; Williams, Louise H; Boyle, Samantha E; Yanaihara, Nozomu; Okamoto, Aikou; Urashima, Mitsuyoshi; Smyth, Gordon K; Campbell, Ian G; Bowtell, David D L

    2010-09-10

    Ovarian cancer is a heterogeneous disease displaying complex genomic alterations, and consequently, it has been difficult to determine the most relevant copy number alterations with the scale of studies to date. We obtained genome-wide copy number alteration (CNA) data from four different SNP array platforms, with a final data set of 398 ovarian tumours, mostly of the serous histological subtype. Frequent CNA aberrations targeted many thousands of genes. However, high-level amplicons and homozygous deletions enabled filtering of this list to the most relevant. The large data set enabled refinement of minimal regions and identification of rare amplicons such as at 1p34 and 20q11. We performed a novel co-occurrence analysis to assess cooperation and exclusivity of CNAs and analysed their relationship to patient outcome. Positive associations were identified between gains on 19 and 20q, gain of 20q and loss of X, and between several regions of loss, particularly 17q. We found weak correlations of CNA at genomic loci such as 19q12 with clinical outcome. We also assessed genomic instability measures and found a correlation of the number of higher amplitude gains with poorer overall survival. By assembling the largest collection of ovarian copy number data to date, we have been able to identify the most frequent aberrations and their interactions.
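The co-occurrence/exclusivity analysis described above can be sketched, in simplified form, as a 2×2 contingency test on binary aberration calls across tumours. This is an illustrative reconstruction on synthetic data, not the study's actual pipeline, and the locus labels in the toy example are placeholders:

```python
import numpy as np
from scipy.stats import fisher_exact

def cna_cooccurrence(calls_a, calls_b):
    """Fisher's exact test for two binary CNA calls across tumours.

    An odds ratio > 1 suggests co-occurrence; < 1 suggests mutual
    exclusivity (a simplified stand-in for the study's analysis).
    """
    a = np.asarray(calls_a, dtype=bool)
    b = np.asarray(calls_b, dtype=bool)
    table = [[int(np.sum(a & b)), int(np.sum(a & ~b))],
             [int(np.sum(~a & b)), int(np.sum(~a & ~b))]]
    return fisher_exact(table)

# Toy cohort of 200 tumours in which two gains tend to occur together.
rng = np.random.default_rng(0)
shared = rng.random(200) < 0.4             # common driver event
gain_a = shared | (rng.random(200) < 0.1)  # placeholder "gain on 19"
gain_b = shared | (rng.random(200) < 0.1)  # placeholder "gain on 20q"
odds, p = cna_cooccurrence(gain_a, gain_b)
```

With many locus pairs, the resulting p-values would need multiple-testing correction before calling an association.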

  6. Deep Proteome Analysis Identifies Age-Related Processes in C. elegans.

    Science.gov (United States)

    Narayan, Vikram; Ly, Tony; Pourkarimi, Ehsan; Murillo, Alejandro Brenes; Gartner, Anton; Lamond, Angus I; Kenyon, Cynthia

    2016-08-01

    Effective network analysis of protein data requires high-quality proteomic datasets. Here, we report a near doubling in coverage of the C. elegans adult proteome, identifying >11,000 proteins in total with ∼9,400 proteins reproducibly detected in three biological replicates. Using quantitative mass spectrometry, we identify proteins whose abundances vary with age, revealing a concerted downregulation of proteins involved in specific metabolic pathways and upregulation of cellular stress responses with advancing age. Among these are ∼30 peroxisomal proteins, including the PRX-5/PEX5 import protein. Functional experiments confirm that protein import into the peroxisome is compromised in vivo in old animals. We also studied the behavior of the set of age-variant proteins in chronologically age-matched, long-lived daf-2 insulin/IGF-1-pathway mutants. Unexpectedly, the levels of many of these age-variant proteins did not scale with extended lifespan. This indicates that, despite their youthful appearance and extended lifespans, not all aspects of aging are reset in these long-lived mutants.

  7. Supervised multivariate analysis of sequence groups to identify specificity determining residues

    Directory of Open Access Journals (Sweden)

    Higgins Desmond G

    2007-04-01

    Full Text Available Abstract Background Proteins that evolve from a common ancestor can change functionality over time, and it is important to be able to identify residues that cause this change. In this paper we show how a supervised multivariate statistical method, Between Group Analysis (BGA), can be used to identify these residues from families of proteins with different substrate specificities using multiple sequence alignments. Results We demonstrate the usefulness of this method on three different test cases. Two of these test cases, the Lactate/Malate dehydrogenase family and Nucleotidyl Cyclases, consist of two functional groups. The other family, Serine Proteases, consists of three groups. BGA was used to analyse and visualise these three families using two different encoding schemes for the amino acids. Conclusion The overall combination of methods in this paper is powerful and flexible while being computationally very fast and simple. BGA is especially useful because it can be used to analyse any number of functional classes. In the examples used in this paper, we have used only 2 or 3 classes for demonstration purposes, but any number can be used and visualised.

  8. Phylogenetic analysis of rubella viruses identified in Uganda, 2003-2012.

    Science.gov (United States)

    Namuwulya, Prossy; Abernathy, Emily; Bukenya, Henry; Bwogi, Josephine; Tushabe, Phionah; Birungi, Molly; Seguya, Ronald; Kabaliisa, Theopista; Alibu, Vincent P; Kayondo, Jonathan K; Rivailler, Pierre; Icenogle, Joseph; Bakamutumaho, Barnabas

    2014-12-01

    Molecular data on rubella viruses are limited in Uganda despite the importance of congenital rubella syndrome (CRS). Routine rubella vaccination, while not administered currently in Uganda, is expected to begin by 2015. The World Health Organization recommends that countries without rubella vaccination programs assess the burden of rubella and CRS before starting a routine vaccination program. Uganda is already involved in integrated case-based surveillance, including laboratory testing to confirm measles and rubella, but molecular epidemiologic aspects of rubella circulation have so far not been documented in Uganda. Twenty throat swab or oral fluid samples collected from 12 districts during routine rash and fever surveillance between 2003 and 2012 were identified as rubella virus RNA positive and PCR products encompassing the region used for genotyping were sequenced. Phylogenetic analysis of the 20 sequences identified 19 genotype 1G viruses and 1 genotype 1E virus. Genotype-specific trees showed that the Uganda viruses belonged to specific clusters for both genotypes 1G and 1E and grouped with similar sequences from neighboring countries. Genotype 1G was predominant in Uganda. More epidemiological and molecular epidemiological data are required to determine if genotype 1E is also endemic in Uganda. The information obtained in this study will assist the immunization program in monitoring changes in circulating genotypes.

  9. Analysis of Pigeon (Columba) Ovary Transcriptomes to Identify Genes Involved in Blue Light Regulation.

    Science.gov (United States)

    Wang, Ying; Ding, Jia-Tong; Yang, Hai-Ming; Yan, Zheng-Jie; Cao, Wei; Li, Yang-Bai

    2015-01-01

    Monochromatic light is widely applied to promote poultry reproductive performance, yet little is currently known regarding the mechanism by which light wavelengths affect pigeon reproduction. Recently, high-throughput sequencing technologies have been used to provide genomic information for solving this problem. In this study, we employed Illumina Hiseq 2000 to identify differentially expressed genes in ovary tissue from pigeons under blue and white light conditions and de novo transcriptome assembly to construct a comprehensive sequence database containing information on the mechanisms of follicle development. A total of 157,774 unigenes (mean length: 790 bp) were obtained by the Trinity program, and 35.83% of these unigenes were matched to genes in a non-redundant protein database. Gene description, gene ontology, and the clustering of orthologous group terms were performed to annotate the transcriptome assembly. Differentially expressed genes between blue and white light conditions included those related to oocyte maturation, hormone biosynthesis, and circadian rhythm. Furthermore, 17,574 SSRs and 533,887 potential SNPs were identified in this transcriptome assembly. This work is the first transcriptome analysis of the Columba ovary using Illumina technology, and the resulting transcriptome and differentially expressed gene data can facilitate further investigations into the molecular mechanism of the effect of blue light on follicle development and reproduction in pigeons and other bird species.

  10. Analysis of Pigeon (Columba) Ovary Transcriptomes to Identify Genes Involved in Blue Light Regulation.

    Directory of Open Access Journals (Sweden)

    Ying Wang

    Full Text Available Monochromatic light is widely applied to promote poultry reproductive performance, yet little is currently known regarding the mechanism by which light wavelengths affect pigeon reproduction. Recently, high-throughput sequencing technologies have been used to provide genomic information for solving this problem. In this study, we employed Illumina Hiseq 2000 to identify differentially expressed genes in ovary tissue from pigeons under blue and white light conditions and de novo transcriptome assembly to construct a comprehensive sequence database containing information on the mechanisms of follicle development. A total of 157,774 unigenes (mean length: 790 bp) were obtained by the Trinity program, and 35.83% of these unigenes were matched to genes in a non-redundant protein database. Gene description, gene ontology, and the clustering of orthologous group terms were performed to annotate the transcriptome assembly. Differentially expressed genes between blue and white light conditions included those related to oocyte maturation, hormone biosynthesis, and circadian rhythm. Furthermore, 17,574 SSRs and 533,887 potential SNPs were identified in this transcriptome assembly. This work is the first transcriptome analysis of the Columba ovary using Illumina technology, and the resulting transcriptome and differentially expressed gene data can facilitate further investigations into the molecular mechanism of the effect of blue light on follicle development and reproduction in pigeons and other bird species.

  11. Multivariate calibration techniques applied to NIRA (near infrared reflectance analysis) and FTIR (Fourier transform infrared) data

    Science.gov (United States)

    Long, C. L.

    1991-02-01

    Multivariate calibration techniques can reduce the time required for routine testing and can provide new methods of analysis. Multivariate calibration is commonly used with near infrared reflectance analysis (NIRA) and Fourier transform infrared (FTIR) spectroscopy. Two feasibility studies were performed to determine the capability of NIRA, using multivariate calibration techniques, to perform analyses on the types of samples that are routinely analyzed at this laboratory. The first study performed included a variety of samples and indicated that NIRA would be well-suited to perform analyses on selected materials properties such as water content and hydroxyl number on polyol samples, epoxy content on epoxy resins, water content of desiccants, and the amine values of various amine cure agents. A second study was performed to assess the capability of NIRA to perform quantitative analysis of hydroxyl numbers and water contents of hydroxyl-containing materials. Hydroxyl number and water content were selected for determination because these tests are frequently run on polyol materials and the hydroxyl number determination is time consuming. This study pointed out the necessity of obtaining calibration standards identical to the samples being analyzed for each type of polyol or other material being analyzed. Multivariate calibration techniques are frequently used with FTIR data to determine the composition of a large variety of complex mixtures. A literature search indicated many applications of multivariate calibration to FTIR data. Areas identified where quantitation by FTIR would provide a new capability are quantitation of components in epoxy and silicone resins, polychlorinated biphenyls (PCBs) in oils, and additives to polymers.
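As a rough illustration of the multivariate calibration idea described above, the following sketch fits an inverse-least-squares model relating synthetic spectra at a few selected wavelengths to a property of interest, then predicts that property for new samples. All band positions, wavelength choices, and concentrations are invented for the example; as the study emphasizes, a real NIRA calibration needs measured standards matched to the material being analyzed.

```python
import numpy as np

# Synthetic "spectra": linear mixtures of two invented absorption bands
# plus instrument noise. The property of interest is component 1's level
# (a stand-in for, e.g., a scaled hydroxyl number).
rng = np.random.default_rng(1)
wl = np.linspace(0.0, 1.0, 50)                 # arbitrary wavelength axis
band1 = np.exp(-((wl - 0.3) ** 2) / 0.01)
band2 = np.exp(-((wl - 0.7) ** 2) / 0.02)

def make_spectra(n):
    c1 = rng.uniform(0.0, 1.0, n)
    c2 = rng.uniform(0.0, 1.0, n)
    X = np.outer(c1, band1) + np.outer(c2, band2)
    return X + rng.normal(0.0, 0.005, X.shape), c1

X_cal, y_cal = make_spectra(40)                # calibration standards
X_new, y_new = make_spectra(10)                # "unknown" samples

# Inverse least squares on a few wavelengths near band 1 (plus intercept).
sel = [10, 13, 15, 17, 20]
A = np.column_stack([np.ones(len(X_cal)), X_cal[:, sel]])
coef, *_ = np.linalg.lstsq(A, y_cal, rcond=None)
y_pred = np.column_stack([np.ones(len(X_new)), X_new[:, sel]]) @ coef
rmsep = np.sqrt(np.mean((y_pred - y_new) ** 2))  # prediction error
```

Methods such as PLS or principal component regression replace the wavelength selection step but follow the same calibrate-then-predict pattern.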

  12. Tape Stripping Technique for Stratum Corneum Protein Analysis

    DEFF Research Database (Denmark)

    Clausen, Maja-Lisa; Slotved, H-C; Krogfelt, Karen A

    2016-01-01

    The aim of this study was to investigate the amount of protein in the stratum corneum of atopic dermatitis (AD) patients and healthy controls using the tape stripping technique, and to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy

  13. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay proves, through various examples chosen from his poems, that his aestheticism was evident in his versification techniques. His poetic theory and practice set an immortal example for the development of English poetry.

  14. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10(8) Pa using particle diameters of 1.7 mu m. This increases the efficiency, the resolution and the speed of the separation. Four aque...

  15. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with the infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
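The half-max width estimate mentioned above can be illustrated with a simplified sketch: locate the two points where a 1-D contrast profile crosses half its peak value and take their separation as the indication width. The profile here is a synthetic Gaussian, and the function is an assumption about the general idea, not the paper's exact algorithm:

```python
import numpy as np

def half_max_width(x, contrast):
    """Width between the two half-maximum crossings of a 1-D contrast
    profile, with linear interpolation between samples."""
    c = np.asarray(contrast, dtype=float)
    half = c.max() / 2.0
    above = np.where(c >= half)[0]
    i0, i1 = above[0], above[-1]

    def crossing(i_lo, i_hi):
        # Interpolate the x position where c crosses `half`.
        x0, x1, c0, c1 = x[i_lo], x[i_hi], c[i_lo], c[i_hi]
        return x0 + (half - c0) * (x1 - x0) / (c1 - c0)

    left = x[i0] if i0 == 0 else crossing(i0 - 1, i0)
    right = x[i1] if i1 == len(c) - 1 else crossing(i1, i1 + 1)
    return right - left

# Synthetic contrast peak: a unit Gaussian, whose full width at half
# maximum is 2*sqrt(2*ln 2), roughly 2.355.
x = np.linspace(-5.0, 5.0, 501)
profile = np.exp(-x ** 2 / 2.0)
w = half_max_width(x, profile)
```

Real flash-thermography contrast curves are noisier, so smoothing would typically precede the crossing search.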

  16. Integrative Functional Genomics Analysis of Sustained Polyploidy Phenotypes in Breast Cancer Cells Identifies an Oncogenic Profile for GINS2

    Directory of Open Access Journals (Sweden)

    Juha K. Rantala

    2010-11-01

    Full Text Available Aneuploidy is among the most obvious differences between normal and cancer cells. However, mechanisms contributing to development and maintenance of aneuploid cell growth are diverse and incompletely understood. Functional genomics analyses have shown that aneuploidy in cancer cells is correlated with diffuse gene expression signatures and aneuploidy can arise by a variety of mechanisms, including cytokinesis failures, DNA endoreplication, and possibly through polyploid intermediate states. To identify molecular processes contributing to development of aneuploidy, we used a cell spot microarray technique to identify genes inducing polyploidy and/or allowing maintenance of polyploid cell growth in breast cancer cells. Of 5760 human genes screened, 177 were found to induce severe DNA content alterations on prolonged transient silencing. Association with response to DNA damage stimulus and DNA repair was found to be the most enriched cellular processes among the candidate genes. Functional validation analysis of these genes highlighted GINS2 as the highest ranking candidate inducing polyploidy, accumulation of endogenous DNA damage, and impairing cell proliferation on inhibition. The cell growth inhibition and induction of polyploidy by suppression of GINS2 was verified in a panel of breast cancer cell lines. Bioinformatic analysis of published gene expression and DNA copy number studies of clinical breast tumors suggested GINS2 to be associated with the aggressive characteristics of a subgroup of breast cancers in vivo. In addition, nuclear GINS2 protein levels distinguished actively proliferating cancer cells suggesting potential use of GINS2 staining as a biomarker of cell proliferation as well as a potential therapeutic target.

  17. Comparative Transcriptome Analysis Identifies CCDC80 as a Novel Gene Associated with Pulmonary Arterial Hypertension

    Directory of Open Access Journals (Sweden)

    Yuhei Nishimura

    2016-06-01

    Full Text Available Pulmonary arterial hypertension (PAH) is a heterogeneous disorder associated with a progressive increase in pulmonary artery resistance and pressure. Although various therapies have been developed, the 5-year survival rate of PAH patients remains low. There is thus an important need to identify novel genes that are commonly dysregulated in PAH of various etiologies and could be used as biomarkers and/or therapeutic targets. In this study, we performed comparative transcriptome analysis of five mammalian PAH datasets downloaded from a public database. We identified 228 differentially expressed genes (DEGs) from a rat PAH model caused by inhibition of vascular endothelial growth factor receptor under hypoxic conditions, 379 DEGs from a mouse PAH model associated with systemic sclerosis, 850 DEGs from a mouse PAH model associated with schistosomiasis, 1598 DEGs from one cohort of human PAH patients, and 4260 DEGs from a second cohort of human PAH patients. Gene-by-gene comparison identified four genes that were differentially upregulated or downregulated in parallel in all five sets of DEGs. Expression of coiled-coil domain containing 80 (CCDC80) and anterior gradient 2 genes was significantly increased in the five datasets, whereas expression of SMAD family member 6 and granzyme A was significantly decreased. Weighted gene co-expression network analysis revealed a connection between CCDC80 and collagen type I alpha 1 (COL1A1) expression. To validate the function of CCDC80 in vivo, we knocked out ccdc80 in zebrafish using the clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9 system. In vivo imaging of zebrafish expressing a fluorescent protein in endothelial cells showed that ccdc80 deletion significantly increased the diameter of the ventral artery, a vessel supplying blood to the gills. We also demonstrated that expression of col1a1 and endothelin-1 mRNA was significantly decreased in the ccdc80-knockout zebrafish. Finally, we

  18. Identifying patterns in treatment response profiles in acute bipolar mania: a cluster analysis approach

    Directory of Open Access Journals (Sweden)

    Houston John P

    2008-07-01

    Full Text Available Abstract Background Patients with acute mania respond differentially to treatment and, in many cases, fail to obtain or sustain symptom remission. The objective of this exploratory analysis was to characterize response in bipolar disorder by identifying groups of patients with similar manic symptom response profiles. Methods Patients (n = 222) were selected from a randomized, double-blind study of treatment with olanzapine or divalproex in bipolar I disorder, manic or mixed episode, with or without psychotic features. Hierarchical clustering based on Ward's distance was used to identify groups of patients based on Young-Mania Rating Scale (YMRS) total scores at each of 5 assessments over 7 weeks. Logistic regression was used to identify baseline predictors for clusters of interest. Results Four distinct clusters of patients were identified: Cluster 1 (n = 64): patients did not maintain a response (YMRS total scores ≤ 12); Cluster 2 (n = 92): patients responded rapidly (within less than a week) and response was maintained; Cluster 3 (n = 36): patients responded rapidly but relapsed soon afterwards (YMRS ≥ 15); Cluster 4 (n = 30): patients responded slowly (≥ 2 weeks) and response was maintained. Predictive models using baseline variables found YMRS Item 10 (Appearance) and psychosis to be significant predictors for Clusters 1 and 4 vs. Clusters 2 and 3, but none of the baseline characteristics allowed discriminating between Clusters 1 vs. 4. Experiencing a mixed episode at baseline predicted membership in Clusters 2 and 3 vs. Clusters 1 and 4. Treatment with divalproex, larger number of previous manic episodes, lack of disruptive-aggressive behavior, and more prominent depressive symptoms at baseline were predictors for Cluster 3 vs. 2. Conclusion Distinct treatment response profiles can be predicted by clinical features at baseline. The presence of these features as potential risk factors for relapse in patients who have responded to treatment
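The clustering step described in the Methods can be sketched with synthetic score trajectories: rows are patients, columns are repeated total-score assessments, and Ward's minimum-variance linkage groups similar response profiles. The three trajectory shapes below are invented stand-ins for rapid-sustained, slow-sustained, and relapse profiles, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic YMRS-like trajectories (5 assessments per patient).
rng = np.random.default_rng(2)
rapid = 30 - 25 * (1 - np.exp(-np.arange(5)))    # fast, sustained drop
slow = 30 - np.array([0, 4, 10, 18, 24])         # slow but sustained
relapse = np.array([30, 10, 8, 18, 26])          # respond, then relapse
profiles = np.vstack([rapid + rng.normal(0, 1, (40, 5)),
                      slow + rng.normal(0, 1, (40, 5)),
                      relapse + rng.normal(0, 1, (40, 5))])

Z = linkage(profiles, method="ward")             # Ward's minimum variance
labels = fcluster(Z, t=3, criterion="maxclust")  # cut tree into 3 clusters
```

In practice the number of clusters is chosen by inspecting the dendrogram rather than fixed in advance.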

  19. Functional analysis of TPM domain containing Rv2345 of Mycobacterium tuberculosis identifies its phosphatase activity.

    Science.gov (United States)

    Sinha, Avni; Eniyan, Kandasamy; Sinha, Swati; Lynn, Andrew Michael; Bajpai, Urmi

    2015-07-01

    Mycobacterium tuberculosis (Mtb) is the causal agent of tuberculosis, the world's second largest infectious disease. With the rise of multi-drug-resistant strains of M. tuberculosis, a serious challenge lies ahead in treating the disease. The availability of the complete genome sequence of Mtb has improved the scope for identifying new proteins that would not only further our understanding of the biology of the organism but could also serve to discover new drug targets. In this study, Rv2345, a hypothetical membrane protein of M. tuberculosis H37Rv that is reported to be a putative ortholog of the ZipA cell division protein, has been assigned a function through functional annotation using bioinformatics tools followed by experimental validation. Sequence analysis showed Rv2345 to have a TPM domain at its N-terminal region and predicted it to have phosphatase activity. The TPM domain-containing region of Rv2345 was cloned and expressed using the pET28a vector in Escherichia coli and purified by nickel affinity chromatography. The purified TPM domain was tested in vitro, and our results confirmed it to have phosphatase activity. The enzyme activity was first checked and optimized with pNPP as substrate, followed by ATP, which was also found to be used as a substrate by the purified protein. Hence, sequence analysis followed by in vitro studies characterizes the TPM domain of Rv2345 as having phosphatase activity.

  20. Application of (13)C flux analysis to identify high-productivity CHO metabolic phenotypes.

    Science.gov (United States)

    Templeton, Neil; Smith, Kevin D; McAtee-Pereira, Allison G; Dorai, Haimanti; Betenbaugh, Michael J; Lang, Steven E; Young, Jamey D

    2017-01-23

    Industrial bioprocesses place high demands on the energy metabolism of host cells to meet biosynthetic requirements for maximal protein expression. Identifying metabolic phenotypes that promote high expression is therefore a major goal of the biotech industry. We conducted a series of (13)C flux analysis studies to examine the metabolic response to IgG expression during early stationary phase of CHO cell cultures grown in 3L fed-batch bioreactors. We examined eight clones expressing four different IgGs and compared with three non-expressing host-cell controls. Some clones were genetically manipulated to be apoptosis-resistant by expressing Bcl-2Δ, which correlated with increased IgG production and elevated glucose metabolism. The metabolic phenotypes of the non-expressing, IgG-expressing, and Bcl-2Δ/IgG-expressing clones were fully segregated by hierarchical clustering analysis. Lactate consumption and citric acid cycle fluxes were most strongly associated with specific IgG productivity. These studies indicate that enhanced oxidative metabolism is a characteristic of high-producing CHO cell lines.

  1. Genomic reprograming analysis of the Mesothelial to Mesenchymal Transition identifies biomarkers in peritoneal dialysis patients

    Science.gov (United States)

    Ruiz-Carpio, Vicente; Sandoval, Pilar; Aguilera, Abelardo; Albar-Vizcaíno, Patricia; Perez-Lozano, María Luisa; González-Mateo, Guadalupe T.; Acuña-Ruiz, Adrián; García-Cantalejo, Jesús; Botías, Pedro; Bajo, María Auxiliadora; Selgas, Rafael; Sánchez-Tomero, José Antonio; Passlick-Deetjen, Jutta; Piecha, Dorothea; Büchel, Janine; Steppan, Sonja; López-Cabrera, Manuel

    2017-01-01

    Peritoneal dialysis (PD) is an effective renal replacement therapy, but a significant proportion of patients suffer PD-related complications, which limit the treatment duration. Mesothelial-to-mesenchymal transition (MMT) contributes to PD-related peritoneal dysfunction. We analyzed the genetic reprograming of MMT to identify new biomarkers that may be tested in PD patients. Microarray analysis revealed a partial overlap between MMT induced in vitro and ex vivo in effluent-derived mesothelial cells, and showed that MMT is mainly a repression process, with more genes down-regulated than induced. Cellular morphology and the number of altered genes showed that MMT ex vivo could be subdivided into two stages: early/epithelioid and advanced/non-epithelioid. RT-PCR array analysis demonstrated that a number of genes differentially expressed in effluent-derived non-epithelioid cells also showed significant differential expression when comparing standard versus low-GDP PD fluids. Thrombospondin-1 (TSP1), collagen-13 (COL13), vascular endothelial growth factor A (VEGFA), and gremlin-1 (GREM1) were measured in PD effluents and, except for GREM1, showed significant differences between early and advanced stages of MMT; their expression was associated with a high peritoneal transport status. The results establish a proof of concept about the feasibility of measuring MMT-associated secreted protein levels as potential biomarkers in PD. PMID:28327551

  2. Five endometrial cancer risk loci identified through genome-wide association analysis

    Science.gov (United States)

    O’Mara, Tracy A; Painter, Jodie N; Glubb, Dylan M; Flach, Susanne; Lewis, Annabelle; French, Juliet D; Freeman-Mills, Luke; Church, David; Gorman, Maggie; Martin, Lynn; Hodgson, Shirley; Webb, Penelope M; Attia, John; Holliday, Elizabeth G; McEvoy, Mark; Scott, Rodney J; Henders, Anjali K; Martin, Nicholas G; Montgomery, Grant W; Nyholt, Dale R; Ahmed, Shahana; Healey, Catherine S; Shah, Mitul; Dennis, Joe; Fasching, Peter A; Beckmann, Matthias W; Hein, Alexander; Ekici, Arif B; Hall, Per; Czene, Kamila; Darabi, Hatef; Li, Jingmei; Dörk, Thilo; Dürst, Matthias; Hillemanns, Peter; Runnebaum, Ingo; Amant, Frederic; Schrauwen, Stefanie; Zhao, Hui; Lambrechts, Diether; Depreeuw, Jeroen; Dowdy, Sean C; Goode, Ellen L; Fridley, Brooke L; Winham, Stacey J; Njølstad, Tormund S; Salvesen, Helga B; Trovik, Jone; Werner, Henrica MJ; Ashton, Katie; Otton, Geoffrey; Proietto, Tony; Liu, Tao; Mints, Miriam; Tham, Emma; Consortium, CHIBCHA; Jun Li, Mulin; Yip, Shun H; Wang, Junwen; Bolla, Manjeet K; Michailidou, Kyriaki; Wang, Qin; Tyrer, Jonathan P; Dunlop, Malcolm; Houlston, Richard; Palles, Claire; Hopper, John L; Peto, Julian; Swerdlow, Anthony J; Burwinkel, Barbara; Brenner, Hermann; Meindl, Alfons; Brauch, Hiltrud; Lindblom, Annika; Chang-Claude, Jenny; Couch, Fergus J; Giles, Graham G; Kristensen, Vessela N; Cox, Angela; Cunningham, Julie M; Pharoah, Paul D P; Dunning, Alison M; Edwards, Stacey L; Easton, Douglas F; Tomlinson, Ian; Spurdle, Amanda B

    2016-01-01

    We conducted a meta-analysis of three endometrial cancer GWAS and two replication phases totaling 7,737 endometrial cancer cases and 37,144 controls of European ancestry. Genome-wide imputation and meta-analysis identified five novel risk loci of genome-wide significance at likely regulatory regions on chromosomes 13q22.1 (rs11841589, near KLF5), 6q22.31 (rs13328298, in LOC643623 and near HEY2 and NCOA7), 8q24.21 (rs4733613, telomeric to MYC), 15q15.1 (rs937213, in EIF2AK4, near BMF) and 14q32.33 (rs2498796, in AKT1 near SIVA1). A second independent 8q24.21 signal (rs17232730) was found. Functional studies of the 13q22.1 locus showed that rs9600103 (pairwise r2=0.98 with rs11841589) is located in a region of active chromatin that interacts with the KLF5 promoter region. The rs9600103-T endometrial cancer protective allele suppressed gene expression in vitro suggesting that regulation of KLF5 expression, a gene linked to uterine development, is implicated in tumorigenesis. These findings provide enhanced insight into the genetic and biological basis of endometrial cancer. PMID:27135401

  3. Five endometrial cancer risk loci identified through genome-wide association analysis.

    Science.gov (United States)

    Cheng, Timothy H T; Thompson, Deborah J; O'Mara, Tracy A; Painter, Jodie N; Glubb, Dylan M; Flach, Susanne; Lewis, Annabelle; French, Juliet D; Freeman-Mills, Luke; Church, David; Gorman, Maggie; Martin, Lynn; Hodgson, Shirley; Webb, Penelope M; Attia, John; Holliday, Elizabeth G; McEvoy, Mark; Scott, Rodney J; Henders, Anjali K; Martin, Nicholas G; Montgomery, Grant W; Nyholt, Dale R; Ahmed, Shahana; Healey, Catherine S; Shah, Mitul; Dennis, Joe; Fasching, Peter A; Beckmann, Matthias W; Hein, Alexander; Ekici, Arif B; Hall, Per; Czene, Kamila; Darabi, Hatef; Li, Jingmei; Dörk, Thilo; Dürst, Matthias; Hillemanns, Peter; Runnebaum, Ingo; Amant, Frederic; Schrauwen, Stefanie; Zhao, Hui; Lambrechts, Diether; Depreeuw, Jeroen; Dowdy, Sean C; Goode, Ellen L; Fridley, Brooke L; Winham, Stacey J; Njølstad, Tormund S; Salvesen, Helga B; Trovik, Jone; Werner, Henrica M J; Ashton, Katie; Otton, Geoffrey; Proietto, Tony; Liu, Tao; Mints, Miriam; Tham, Emma; Li, Mulin Jun; Yip, Shun H; Wang, Junwen; Bolla, Manjeet K; Michailidou, Kyriaki; Wang, Qin; Tyrer, Jonathan P; Dunlop, Malcolm; Houlston, Richard; Palles, Claire; Hopper, John L; Peto, Julian; Swerdlow, Anthony J; Burwinkel, Barbara; Brenner, Hermann; Meindl, Alfons; Brauch, Hiltrud; Lindblom, Annika; Chang-Claude, Jenny; Couch, Fergus J; Giles, Graham G; Kristensen, Vessela N; Cox, Angela; Cunningham, Julie M; Pharoah, Paul D P; Dunning, Alison M; Edwards, Stacey L; Easton, Douglas F; Tomlinson, Ian; Spurdle, Amanda B

    2016-06-01

    We conducted a meta-analysis of three endometrial cancer genome-wide association studies (GWAS) and two follow-up phases totaling 7,737 endometrial cancer cases and 37,144 controls of European ancestry. Genome-wide imputation and meta-analysis identified five new risk loci of genome-wide significance at likely regulatory regions on chromosomes 13q22.1 (rs11841589, near KLF5), 6q22.31 (rs13328298, in LOC643623 and near HEY2 and NCOA7), 8q24.21 (rs4733613, telomeric to MYC), 15q15.1 (rs937213, in EIF2AK4, near BMF) and 14q32.33 (rs2498796, in AKT1, near SIVA1). We also found a second independent 8q24.21 signal (rs17232730). Functional studies of the 13q22.1 locus showed that rs9600103 (pairwise r(2) = 0.98 with rs11841589) is located in a region of active chromatin that interacts with the KLF5 promoter region. The rs9600103[T] allele that is protective in endometrial cancer suppressed gene expression in vitro, suggesting that regulation of the expression of KLF5, a gene linked to uterine development, is implicated in tumorigenesis. These findings provide enhanced insight into the genetic and biological basis of endometrial cancer.

  4. Functional gene group analysis identifies synaptic gene groups as risk factor for schizophrenia.

    Science.gov (United States)

    Lips, E S; Cornelisse, L N; Toonen, R F; Min, J L; Hultman, C M; Holmans, P A; O'Donovan, M C; Purcell, S M; Smit, A B; Verhage, M; Sullivan, P F; Visscher, P M; Posthuma, D

    2012-10-01

    Schizophrenia is a highly heritable disorder with a polygenic pattern of inheritance and a population prevalence of ~1%. Previous studies have implicated synaptic dysfunction in schizophrenia. We tested the accumulated association of genetic variants in expert-curated synaptic gene groups with schizophrenia in 4673 cases and 4965 healthy controls, using functional gene group analysis. Identifying groups of genes with similar cellular function rather than genes in isolation may have clinical implications for finding additional drug targets. We found that a group of 1026 synaptic genes was significantly associated with the risk of schizophrenia (P=7.6 × 10⁻¹¹) and more strongly associated than 100 randomly drawn, matched control groups of genetic variants (P<0.01). Subsequent analysis of synaptic subgroups suggested that the strongest association signals are derived from three synaptic gene groups: intracellular signal transduction (P=2.0 × 10⁻⁴), excitability (P=9.0 × 10⁻⁴) and cell adhesion and trans-synaptic signaling (P=2.4 × 10⁻³). These results are consistent with a role of synaptic dysfunction in schizophrenia and imply that impaired intracellular signal transduction in synapses, synaptic excitability and cell adhesion and trans-synaptic signaling play a role in the pathology of schizophrenia.
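
    The matched-control comparison described above (the synaptic set beating 100 randomly drawn control groups) can be sketched as an empirical permutation test. The per-gene association scores below are hypothetical illustrations, not the study's data:

```python
import random

def gene_set_empirical_p(set_scores, genome_scores, n_draws=1000, seed=0):
    """Empirical p-value: how often does a randomly drawn control set of
    the same size match or beat the summed association score of the
    target gene set?"""
    rng = random.Random(seed)
    observed = sum(set_scores)
    k = len(set_scores)
    exceed = sum(
        1
        for _ in range(n_draws)
        if sum(rng.sample(genome_scores, k)) >= observed
    )
    return (exceed + 1) / (n_draws + 1)  # add-one correction avoids p = 0
```

    The real procedure additionally matches control groups on confounders such as gene size and variant density; this sketch draws them uniformly.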

  5. Genetic modifier loci of mouse Mfrp(rd6) identified by quantitative trait locus analysis.

    Science.gov (United States)

    Won, Jungyeon; Charette, Jeremy R; Philip, Vivek M; Stearns, Timothy M; Zhang, Weidong; Naggert, Jürgen K; Krebs, Mark P; Nishina, Patsy M

    2014-01-01

    The identification of genes that modify pathological ocular phenotypes in mouse models may improve our understanding of disease mechanisms and lead to new treatment strategies. Here, we identify modifier loci affecting photoreceptor cell loss in homozygous Mfrp(rd6) mice, which exhibit a slowly progressive photoreceptor degeneration. A cohort of 63 F2 homozygous Mfrp(rd6) mice from a (B6.C3Ga-Mfrp(rd6)/J × CAST/EiJ) F1 intercross exhibited a variable number of cell bodies in the retinal outer nuclear layer at 20 weeks of age. Mice were genotyped with a panel of single nucleotide polymorphism markers, and genotypes were correlated with phenotype by quantitative trait locus (QTL) analysis to map modifier loci. A genome-wide scan revealed a statistically significant, protective candidate locus on CAST/EiJ Chromosome 1 and suggestive modifier loci on Chromosomes 6 and 11. Multiple regression analysis of a three-QTL model indicated that the modifier loci on Chromosomes 1 and 6 together account for 26% of the observed phenotypic variation, while the modifier locus on Chromosome 11 explains only an additional 4%. Our findings indicate that the severity of the Mfrp(rd6) retinal degenerative phenotype in mice depends on the strain genetic background and that a significant modifier locus on CAST/EiJ Chromosome 1 protects against Mfrp(rd6)-associated photoreceptor loss.
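
    The variance-explained figures above (26% and 4%) come from regression models of phenotype on QTL genotype. A toy single-locus version computes R² from a simple linear regression of photoreceptor counts on genotype dosage; the data below are hypothetical, not from the cross:

```python
def variance_explained(genotypes, phenotypes):
    """R^2 from a simple linear regression of phenotype on genotype
    dosage (0/1/2): the fraction of phenotypic variance the QTL explains."""
    n = len(genotypes)
    mx = sum(genotypes) / n
    my = sum(phenotypes) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(genotypes, phenotypes))
    sxx = sum((x - mx) ** 2 for x in genotypes)
    syy = sum((y - my) ** 2 for y in phenotypes)
    return sxy * sxy / (sxx * syy)

# hypothetical F2 animals: genotype dosage vs. outer-nuclear-layer cell count
r2 = variance_explained([0, 0, 1, 1, 2, 2], [10, 11, 13, 14, 16, 17])
```

    Multi-QTL models, as used in the paper, extend this to a multiple regression whose joint R² is partitioned among loci.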

  6. Identifying differences in the experience of (in)authenticity: a latent class analysis approach.

    Science.gov (United States)

    Lenton, Alison P; Slabu, Letitia; Bruder, Martin; Sedikides, Constantine

    2014-01-01

    Generally, psychologists consider state authenticity - that is, the subjective sense of being one's true self - to be a unitary and unidimensional construct, such that (a) the phenomenological experience of authenticity is thought to be similar no matter its trigger, and (b) inauthenticity is thought to be simply the opposing pole (on the same underlying construct) of authenticity. Using latent class analysis, we put this conceptualization to the test. In order to avoid over-reliance on a Western conceptualization of authenticity, we used a cross-cultural sample (N = 543), comprising participants from Western, South-Asian, East-Asian, and South-East Asian cultures. Participants provided either a narrative in which they described when they felt most like being themselves or one in which they described when they felt least like being themselves. The analysis identified six distinct classes of experiences: two authenticity classes ("everyday" and "extraordinary"), three inauthenticity classes ("self-conscious," "deflated," and "extraordinary"), and a class representing convergence between authenticity and inauthenticity. The classes were phenomenologically distinct, especially with respect to negative affect, private and public self-consciousness, and self-esteem. Furthermore, relatively more interdependent cultures were less likely to report experiences of extraordinary (in)authenticity than relatively more independent cultures. Understanding the many facets of (in)authenticity may enable researchers to connect different findings and explain why the attainment of authenticity can be difficult.
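
    Latent class analysis assigns each narrative a posterior probability of membership in each class and selects the number of classes by fit criteria. As a toy stand-in (not the authors' software), here is a two-class latent class model on binary indicator items, fitted by expectation-maximization:

```python
import math
import random

def lca_em(data, n_classes=2, n_iter=200, seed=0):
    """Minimal EM for a latent class model on binary items: class k has
    weight pi[k] and per-item endorsement probabilities theta[k][j]."""
    rng = random.Random(seed)
    n, m = len(data), len(data[0])
    pi = [1.0 / n_classes] * n_classes
    theta = [[rng.uniform(0.25, 0.75) for _ in range(m)]
             for _ in range(n_classes)]
    for _ in range(n_iter):
        # E-step: posterior class membership for each respondent
        resp = []
        for row in data:
            lik = [pi[k] * math.prod(theta[k][j] if row[j] else 1 - theta[k][j]
                                     for j in range(m))
                   for k in range(n_classes)]
            s = sum(lik)
            resp.append([l / s for l in lik])
        # M-step: re-estimate class weights and item probabilities
        for k in range(n_classes):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / n
            theta[k] = [sum(r[k] * row[j] for r, row in zip(resp, data)) / nk
                        for j in range(m)]
    return pi, theta
```

    On clearly separated data, the recovered item probabilities split into a high-endorsement class and a low-endorsement class; in practice, models with different numbers of classes are compared by BIC or a bootstrap likelihood-ratio test.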

  7. Transcriptome bioinformatic analysis identifies potential therapeutic mechanism of pentylenetetrazole in down syndrome

    Directory of Open Access Journals (Sweden)

    Sharma Abhay

    2010-10-01

    Background: Pentylenetetrazole (PTZ) has recently been found to ameliorate cognitive impairment in rodent models of Down syndrome (DS). The mechanism underlying PTZ's therapeutic effect in DS is, however, not clear. Microarray profiling has previously reported differential expression, both up- and downregulation, of genes in DS. Given this, transcriptomic data related to PTZ treatment, if available, could be used to understand the drug's therapeutic mechanism in DS. No such mammalian data, however, exist. Nevertheless, a Drosophila model inspired by PTZ-induced kindling plasticity in rodents has recently been described. Microarray profiling has shown PTZ's downregulatory effect on gene expression in fly heads. Methods: In a comparative transcriptomics approach, I have analyzed the available microarray data in order to identify the potential therapeutic mechanism of PTZ in DS. The analysis used summary data of up- and downregulated genes reported in human DS studies and of downregulated genes reported in the Drosophila model. Results: I find that the transcriptomic correlate of chronic PTZ in Drosophila counteracts that of DS. Genes downregulated by PTZ significantly overrepresent genes upregulated in DS and underrepresent genes downregulated in DS. Further, the genes common to the PTZ-downregulated and DS-upregulated sets show enrichment for the MAP kinase pathway. Conclusion: My analysis suggests that downregulation of the MAP kinase pathway may mediate the therapeutic effect of PTZ in DS. Existing evidence implicating the MAP kinase pathway in DS supports this observation.
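
    Over- and under-representation claims of this kind are typically backed by a one-sided hypergeometric (Fisher) test on the overlap between two gene sets. A minimal sketch with hypothetical counts (not the study's figures):

```python
from math import comb

def hypergeom_overlap_p(universe, n_a, n_b, overlap):
    """One-sided hypergeometric p-value: probability of sharing at least
    `overlap` genes between a set of n_a and a set of n_b genes drawn
    at random from `universe` genes."""
    return sum(
        comb(n_a, k) * comb(universe - n_a, n_b - k)
        for k in range(overlap, min(n_a, n_b) + 1)
    ) / comb(universe, n_b)

# hypothetical counts: 10,000 genes total, 300 PTZ-downregulated,
# 400 DS-upregulated, 40 genes shared between the two sets
p = hypergeom_overlap_p(10000, 300, 400, 40)
```

    The expected overlap by chance here is 300 × 400 / 10,000 = 12 genes, so an observed overlap of 40 is far out in the tail.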

  8. Transcriptome analysis of Solanum melongena L. (eggplant) fruit to identify putative allergens and their epitopes.

    Science.gov (United States)

    Ramesh, Kumar Ramagoni; Hemalatha, R; Vijayendra, Chary Anchoju; Arshi, Uz Zaman Syed; Dushyant, Singh Baghel; Dinesh, Kumar Bharadwaj

    2016-01-15

    Eggplant is the third most important Solanaceae crop after tomato and potato, particularly in India and China. A transcriptome analysis of eggplant fruit was performed to study genes of medicinal importance and genes underlying allergenicity. The Illumina HiSeq 2000 system generated 89,763,638 raw reads (~18 Gb) from eggplant. High-quality reads (59,039,694) obtained after trimming were assembled into a non-redundant set of 149,224 transcripts. Of the 80,482 annotated sequences of eggplant fruit (BLASTx results against the nr green-plant database), 40,752 transcripts showed significant similarity with predicted proteins of Solanum tuberosum (51%), followed by Solanum lycopersicum (34%) and other sequenced plant genomes. In a BLASTx top-hit analysis against existing allergens, a total of 1986 homologous allergen sequences were found, with >37% similarity to 48 different allergens in the database. From the 48 putative allergens, 526 linear B-cell epitopes were identified using the BepiPred linear epitope prediction tool. Transcript sequences generated from this study can be used to map epitopes of monoclonal antibodies and polyclonal sera from patients. With the support of this whole-transcriptome catalogue of eggplant fruit, a complete list of genes can be predicted, from which protein secondary structures may be modeled.
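
    Selecting putative allergens from BLASTx output reduces to filtering tabular hits by an identity threshold and keeping each transcript's best hit. A sketch assuming BLAST outfmt-6-style rows (qseqid, sseqid, pident, ...); the transcript and allergen names are hypothetical:

```python
def putative_allergens(blast_rows, min_identity=37.0):
    """Filter BLAST tabular (outfmt 6 style) hits against an allergen
    database, keeping each transcript's best hit above the identity
    threshold. Assumed column order: qseqid, sseqid, pident, ..."""
    best = {}
    for row in blast_rows:
        fields = row.split("\t")
        qseqid, sseqid, pident = fields[0], fields[1], float(fields[2])
        # keep only hits above the threshold, and only the best per query
        if pident > min_identity and pident > best.get(qseqid, ("", 0.0))[1]:
            best[qseqid] = (sseqid, pident)
    return best
```

    Transcripts surviving this filter would then be passed to an epitope predictor such as BepiPred, as in the study.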

  9. Genome-wide meta-analysis identifies multiple novel associations and ethnic heterogeneity of psoriasis susceptibility.

    Science.gov (United States)

    Yin, Xianyong; Low, Hui Qi; Wang, Ling; Li, Yonghong; Ellinghaus, Eva; Han, Jiali; Estivill, Xavier; Sun, Liangdan; Zuo, Xianbo; Shen, Changbing; Zhu, Caihong; Zhang, Anping; Sanchez, Fabio; Padyukov, Leonid; Catanese, Joseph J; Krueger, Gerald G; Duffin, Kristina Callis; Mucha, Sören; Weichenthal, Michael; Weidinger, Stephan; Lieb, Wolfgang; Foo, Jia Nee; Li, Yi; Sim, Karseng; Liany, Herty; Irwan, Ishak; Teo, Yikying; Theng, Colin T S; Gupta, Rashmi; Bowcock, Anne; De Jager, Philip L; Qureshi, Abrar A; de Bakker, Paul I W; Seielstad, Mark; Liao, Wilson; Ståhle, Mona; Franke, Andre; Zhang, Xuejun; Liu, Jianjun

    2015-04-23

    Psoriasis is a common inflammatory skin disease with complex genetics and different degrees of prevalence across ethnic populations. Here we present the largest trans-ethnic genome-wide meta-analysis (GWMA) of psoriasis in 15,369 cases and 19,517 controls of Caucasian and Chinese ancestries. We identify four novel associations at LOC144817, COG6, RUNX1 and TP63, as well as three novel secondary associations within IFIH1 and IL12B. Fine-mapping analysis of the MHC region demonstrates an important role for all three HLA class I genes and a complex and heterogeneous pattern of HLA associations between Caucasian and Chinese populations. Further, trans-ethnic comparison suggests population-specific effects or allelic heterogeneity for 11 loci. These population-specific effects contribute significantly to the ethnic diversity of psoriasis prevalence. This study not only provides novel biological insights into the involvement of immune and keratinocyte development mechanisms, but also demonstrates a complex and heterogeneous genetic architecture of psoriasis susceptibility across ethnic populations.

  10. Multi-tissue microarray analysis identifies a molecular signature of regeneration.

    Directory of Open Access Journals (Sweden)

    Sarah E Mercer

    The inability to functionally repair tissues that are lost as a consequence of disease or injury remains a significant challenge for regenerative medicine. The molecular and cellular processes involved in complete restoration of tissue architecture and function are expected to be complex and remain largely unknown. Unlike humans, certain salamanders can completely regenerate injured tissues and lost appendages without scar formation. A parsimonious hypothesis would predict that all of these regenerative activities are regulated, at least in part, by a common set of genes. To test this hypothesis and identify genes that might control conserved regenerative processes, we performed a comprehensive microarray analysis of the early regenerative response in five regeneration-competent tissues from the newt Notophthalmus viridescens. Consistent with this hypothesis, we established a molecular signature for regeneration that consists of common genes or gene family members that exhibit dynamic differential regulation during regeneration in multiple tissue types. These genes include members of the matrix metalloproteinase family and its regulators, extracellular matrix components, genes involved in controlling cytoskeleton dynamics, and a variety of immune response factors. Gene Ontology term enrichment analysis validated and supported their functional activities in conserved regenerative processes. Surprisingly, dendrogram clustering and RadViz classification also revealed that each regenerative tissue had its own unique temporal expression profile, pointing to an inherent tissue-specific regenerative gene program. These new findings demand a reconsideration of how we conceptualize regenerative processes and how we devise new strategies for regenerative medicine.
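
    The "common signature" logic, genes dynamically regulated in every regeneration-competent tissue, is at bottom a set intersection across per-tissue differentially expressed gene lists. The tissue and gene names below are illustrative only:

```python
def regeneration_signature(tissue_degs):
    """Genes differentially regulated in every regenerating tissue:
    the intersection of per-tissue differentially expressed gene sets."""
    sets = [set(genes) for genes in tissue_degs.values()]
    return set.intersection(*sets)

# hypothetical per-tissue differentially expressed gene lists
tissues = {
    "forelimb": {"MMP9", "MMP3", "TIMP1", "ACTB"},
    "hindlimb": {"MMP9", "MMP3", "TIMP1", "TNC"},
    "tail":     {"MMP9", "TIMP1", "TNC", "ACTB"},
}
shared = regeneration_signature(tissues)
```

    In practice the signature is defined more loosely (shared gene family members rather than identical genes, as the abstract notes), but the intersection captures the core idea.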

  11. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    Science.gov (United States)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.
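
    A quantitative screen of this kind scores each insertion line by the deviation of its bristle-number mean from the isogenic control. A sketch using a Welch-style two-sample t-statistic; the per-fly counts below are hypothetical:

```python
import math
import statistics

def line_effect(control, mutant):
    """Welch two-sample t-statistic: deviation of a P-element insertion
    line's mean bristle number from its isogenic control line."""
    v1, v2 = statistics.variance(control), statistics.variance(mutant)
    se = math.sqrt(v1 / len(control) + v2 / len(mutant))
    return (statistics.mean(mutant) - statistics.mean(control)) / se

# hypothetical bristle counts per fly for one insertion line
t = line_effect([18, 19, 18, 20, 19], [16, 15, 17, 16, 16])
```

    Lines whose statistic exceeds a multiple-testing-corrected threshold would be flagged as candidate neural-development mutants for follow-up.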

  12. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    This report applies the HAZOP (hazard and operability) technique to software and system-of-systems interface analysis. Table 3 of the report lists HAZOP guide words for software or system interface analysis (for example, "Subtle Incorrect" denotes an output whose value is wrong but cannot be detected), and Table 4 presents an example system-of-systems architecture table. The "Plan HAZOP" step establishes the analysis goals, definitions, worksheets, schedule, and process.

  13. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...
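
    The matrix formulation of FMEA mentioned here represents direct fault propagation between components as a Boolean adjacency matrix; its transitive closure lists every end effect of each failure mode, which is what the decision tables summarize. A sketch using Warshall's algorithm, with a hypothetical three-component layout:

```python
def propagate_failures(adj):
    """Boolean reachability (Warshall's algorithm) over a fault-propagation
    matrix: adj[i][j] is True if a failure in component i directly
    propagates to component j. The closure gives all indirect effects."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    reach[i][j] = reach[i][j] or reach[k][j]
    return reach

# hypothetical chain: sensor (0) -> controller (1) -> actuator (2)
adj = [[False, True, False],
       [False, False, True],
       [False, False, False]]
effects = propagate_failures(adj)
```

    Row i of the closure is, in effect, the "end effects" column of an FMEA worksheet for component i's failure mode.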

  14. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  15. A meta-analysis to identify animal and management factors influencing gestating sow efficiency.

    Science.gov (United States)

    Douglas, S L; Szyszka, O; Stoddart, K; Edwards, S A; Kyriazakis, I

    2014-12-01

    A meta-analysis on the effects of management and animal-based factors on the reproductive efficiency of gestating sows can provide information on single-factor and interaction effects that may not have been detected in individual studies. This study analyzed the effects of such factors on the number of piglets born alive per litter (BA), piglet birth weight (BiW) and weaning weight (WW), and number of piglets born alive per kilogram of sow feed intake during gestation (BA/FI). A total of 51 papers and 7 data sources were identified for the meta-analysis, out of which 23 papers and 5 sets of production data were useable (a total of 121 treatments). The information gathered included the dependent variables as well as information regarding animal, management, and feed characteristics. While a number of factors were individually significant, the multivariate models identified significant effects only of 1) floor type (P=0.003), sow BW at the end of gestation (P=0.002), and housing (stalls vs. loose; P=0.004) on BA; as floor type and housing were confounded, they were included in 2 separate models. The BA was higher on solid (12.1) in comparison to partly slatted (11.4) and fully slatted floors (10.2); 2) sow gestation environment (P=0.017) and gestation feed allowance (P=0.046) on BiW, with BiW of pigs higher for sows kept outdoors rather than indoors (1.75 versus 1.49 kg); 3) parity number (P=0.003) and feed intake during gestation (P=0.017) on WW; in addition there w