WorldWideScience

Sample records for analysis techniques identifies

  1. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others]

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  2. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems.
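
    The selective searching idea above can be sketched in miniature. This is a hedged illustration only: the toy "contribution" function, the component count, and all GA parameters below are hypothetical stand-ins for the structural limit-state analysis the authors actually use.

```python
# Minimal sketch of selective searching for dominant failure modes with a
# genetic algorithm. The surrogate below is invented for illustration.
import random

random.seed(0)

N_COMP = 8  # hypothetical number of structural components

def mode_contribution(mode):
    """Toy surrogate for a failure mode's contribution to system failure.
    A mode is a tuple of 0/1 flags (1 = component fails). A real analysis
    would evaluate a limit-state function and failure probability instead."""
    # Pretend components 2 and 5 dominate; extra failed components add little.
    score = 0.6 * mode[2] + 0.3 * mode[5]
    return score / (1.0 + 0.1 * sum(mode))

def genetic_search(pop_size=20, generations=40, p_mut=0.1):
    """Evolve a population of candidate failure modes toward dominant ones."""
    pop = [tuple(random.randint(0, 1) for _ in range(N_COMP))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mode_contribution, reverse=True)
        parents = pop[: pop_size // 2]          # selection: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_COMP)   # one-point crossover
            child = list(a[:cut] + b[cut:])
            for i in range(N_COMP):             # bit-flip mutation
                if random.random() < p_mut:
                    child[i] ^= 1
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=mode_contribution)

best = genetic_search()
```

    In the paper, the fitness would rank modes by their failure-probability contribution and the search would stop once further modes add negligibly to the system estimate.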

  3. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    Science.gov (United States)

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

Clustering on Social Learning Networks has not yet been explored widely, especially when the network focuses on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…

  4. Identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis by using the Delphi Technique

    Science.gov (United States)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Ng, E. G.

    2018-02-01

This paper explains the process carried out in identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis. The research was initially part of a larger research exercise to identify the significance of NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of NDCDB from the technical relevance standpoint. Three statements describing the relevant features of NDCDB for spatial analysis were established after three rounds of consensus building. They highlighted the NDCDB’s characteristics such as its spatial accuracy, functions, and criteria as a facilitating tool for spatial analysis. By recognising the relevant features of NDCDB for spatial analysis in this study, practical applications of NDCDB for various analyses and purposes can be widely implemented.

  5. Image analysis technique as a tool to identify morphological changes in Trametes versicolor pellets according to exopolysaccharide or laccase production.

    Science.gov (United States)

    Tavares, Ana P M; Silva, Rui P; Amaral, António L; Ferreira, Eugénio C; Xavier, Ana M R B

    2014-02-01

An image analysis technique was applied to identify morphological changes of pellets from the white-rot fungus Trametes versicolor in agitated submerged cultures during the production of exopolysaccharide (EPS) or ligninolytic enzymes. Batch tests with four different experimental conditions were carried out. Two different culture media were used, namely yeast medium or Trametes defined medium, and the addition of ligninolytic inducers such as xylidine or pulp and paper industrial effluent was evaluated. Laccase activity, EPS production, and final biomass contents were determined for the batch assays, and pellet morphology was assessed by image analysis techniques. The data obtained allowed the choice of metabolic pathway to be established according to the experimental conditions: either laccase production in the Trametes defined medium, or EPS production in the rich yeast medium experiments. Furthermore, the image processing and analysis methodology allowed for a better comprehension of the physiological phenomena with respect to the corresponding morphological stages of the pellets.

  6. Using thermal analysis techniques for identifying the flash point temperatures of some lubricant and base oils

    Directory of Open Access Journals (Sweden)

    Aksam Abdelkhalik

    2018-03-01

Full Text Available The flash point (FP) temperatures of some lubricant and base oils were measured according to ASTM D92 and ASTM D93. In addition, the thermal stability of the oils was studied using differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) under a nitrogen atmosphere. The DSC results showed that the FP temperatures, for each oil, fell within the first decomposition step, and the temperature at the peak of the first decomposition step was usually higher than the FP temperature. The TGA results indicated that the temperature at which 17.5% weight loss took place (T17.5%) was nearly identical to the FP temperature (±10 °C) measured according to ASTM D92. The deviation between FP and T17.5% was in the range from −0.8% to 3.6%. Keywords: Flash point, TGA, DSC
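
    Reading T17.5% off a TGA curve reduces to interpolating where cumulative weight loss crosses 17.5%. The sketch below uses a synthetic, made-up TGA trace purely for illustration; it is not data from the paper.

```python
# Sketch: locating the temperature at 17.5% weight loss (T17.5%) on a TGA
# curve by linear interpolation between measured points.
def temperature_at_loss(temps_c, mass_pct, target_loss=17.5):
    """Interpolate the temperature at which weight loss reaches target_loss.
    temps_c: increasing temperatures (deg C); mass_pct: residual mass in %,
    starting at 100."""
    points = list(zip(temps_c, mass_pct))
    for (t0, m0), (t1, m1) in zip(points, points[1:]):
        loss0, loss1 = 100.0 - m0, 100.0 - m1
        if loss0 <= target_loss <= loss1:
            frac = (target_loss - loss0) / (loss1 - loss0)
            return t0 + frac * (t1 - t0)
    raise ValueError("target loss not reached on this curve")

# Synthetic TGA trace: 10% loss at 200 degC, 30% loss at 260 degC.
temps = [100.0, 200.0, 260.0, 400.0]
mass = [100.0, 90.0, 70.0, 20.0]
t_flash_est = temperature_at_loss(temps, mass)
```

    On this synthetic trace the crossing falls between 200 °C and 260 °C; the paper's finding is that, for real oils, this temperature tracks the ASTM D92 flash point to within about ±10 °C.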

  7. Neutron activation analysis techniques for identifying elemental status in Alzheimer's disease

    International Nuclear Information System (INIS)

    Ward, N.I.; Mason, J.A.

    1986-01-01

Brain tissue (hippocampus and cerebral cortex) from Alzheimer's disease and control individuals sampled from Eastern Canada and the United Kingdom was analyzed for Ag, Al, As, B, Br, Ca, Cd, Co, Cr, Cs, Cu, Fe, Hg, I, K, La, Mg, Mn, Mo, Ni, Rb, S, Sb, Sc, Se, Si, Sn, Sr, Ti, V and Zn. Neutron activation analysis (thermal and prompt gamma-ray) methods were used. Very highly significant differences (S**: probability less than 0.005) for both study areas were shown between Alzheimer's disease (AD) and control (C) individuals: AD > C for Al, Br, Ca and S, and AD < C for Se, V and Zn. The aluminium content of brain tissue ranged from 3.605 to 21.738 μg/g d.w. (AD) and 0.379 to 4.768 μg/g d.w. (C). No statistical evidence of aluminium accumulation with age was noted. Possible zinc deficiency (especially in hippocampal tissue) was observed in Alzheimer's disease patients, with zinc ranges of 31.42 to 57.91 μg/g d.w. (AD) and 37.31 to 87.10 μg/g d.w. (C). (author)

  8. Application of Principal Component Analysis to NIR Spectra of Phyllosilicates: A Technique for Identifying Phyllosilicates on Mars

    Science.gov (United States)

    Rampe, E. B.; Lanza, N. L.

    2012-01-01

Orbital near-infrared (NIR) reflectance spectra of the martian surface from the OMEGA and CRISM instruments have identified a variety of phyllosilicates in Noachian terrains. The types of phyllosilicates present on Mars have important implications for the aqueous environments in which they formed, and, thus, for recognizing locales that may have been habitable. Current identifications of phyllosilicates from martian NIR data are based on the positions of spectral absorptions relative to laboratory data of well-characterized samples and from spectral ratios; however, some phyllosilicates can be difficult to distinguish from one another with these methods (e.g. illite vs. muscovite). Here we employ a multivariate statistical technique, principal component analysis (PCA), to differentiate between spectrally similar phyllosilicate minerals. PCA is commonly used in a variety of industries (pharmaceutical, agricultural, viticultural) to discriminate between samples. Previous work using PCA to analyze raw NIR reflectance data from mineral mixtures has shown that this is a viable technique for identifying mineral types, abundances, and particle sizes. Here, we evaluate PCA of second-derivative NIR reflectance data as a method for classifying phyllosilicates and test whether this method can be used to identify phyllosilicates on Mars.
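
    The pipeline described (second-derivative preprocessing, then PCA) can be sketched as follows. Everything here is a hedged illustration: the "spectra" are synthetic Gaussian absorption bands, not OMEGA/CRISM or laboratory phyllosilicate data, and the second derivative is approximated by twice differencing.

```python
# Sketch: PCA of second-derivative NIR spectra to separate two mineral
# classes. Synthetic data; real work would use library spectra.
import numpy as np

rng = np.random.default_rng(0)

# 20 synthetic "spectra" from 2 mineral classes over 50 wavelength channels.
x = np.linspace(0, 1, 50)
class_a = np.exp(-((x - 0.3) ** 2) / 0.005)   # absorption band near 0.3
class_b = np.exp(-((x - 0.6) ** 2) / 0.005)   # absorption band near 0.6
spectra = np.vstack(
    [class_a + 0.01 * rng.standard_normal(50) for _ in range(10)]
    + [class_b + 0.01 * rng.standard_normal(50) for _ in range(10)])

# Second-derivative approximation: twice-differenced spectra suppress
# baseline offsets and slopes, leaving band shape information.
d2 = np.diff(spectra, n=2, axis=1)

# PCA via SVD of the mean-centred matrix.
centred = d2 - d2.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T        # first two principal-component scores

# With well-separated bands the two classes land on opposite sides of PC1.
pc1_a, pc1_b = scores[:10, 0].mean(), scores[10:, 0].mean()
```

    In practice one would inspect the score plot (PC1 vs. PC2) and look for clustering of spectrally similar phyllosilicates such as illite and muscovite.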

  9. MALDI-TOF and SELDI-TOF analysis: “tandem” techniques to identify potential biomarker in fibromyalgia

    Directory of Open Access Journals (Sweden)

    A. Lucacchini

    2011-11-01

Full Text Available Fibromyalgia (FM) is characterized by the presence of chronic widespread pain throughout the musculoskeletal system and diffuse tenderness. Unfortunately, no laboratory tests have been appropriately validated for FM and correlated with its subsets and activity. The aim of this study was to apply a proteomic technique to the saliva of FM patients: Surface Enhanced Laser Desorption/Ionization Time-of-Flight (SELDI-TOF). For this study, 57 FM patients and 35 healthy control (HC) subjects were enrolled. The proteomic analysis of saliva was carried out using SELDI-TOF. The analysis was performed using different chip arrays with different binding characteristics. The statistical analysis was performed using cluster analysis, and the difference between the two groups was assessed using Student’s t-test. Spectra analysis highlighted the presence of several peaks differently expressed in FM patients compared with controls. The preliminary results obtained by SELDI-TOF analysis were compared with those obtained in our previous study performed on whole saliva of FM patients using electrophoresis. The m/z of two peaks, increased in FM patients, seem to overlap well with the molecular weights of calgranulin A and C and Rho GDP-dissociation inhibitor 2, which we had found up-regulated in our previous study. These preliminary results show the possibility of identifying potential salivary biomarkers through salivary proteomic analysis with MALDI-TOF and SELDI-TOF in FM patients. The peaks observed allow us to focus on particular pathogenic aspects of FM: the oxidative stress which characterizes this condition, the involvement of proteins related to cytoskeletal arrangements, and central sensitization.

  10. Identifying Major Techniques of Persuasion.

    Science.gov (United States)

    Makosky, Vivian Parker

    1985-01-01

    The purpose of this class exercise is to increase undergraduate psychology students' awareness of common persuasion techniques used in advertising, including the appeal to or creation of needs, social and prestige suggestion, and the use of emotionally loaded words and images. Television commercials and magazine advertisements are used as…

  11. Application of gene network analysis techniques identifies AXIN1/PDIA2 and endoglin haplotypes associated with bicuspid aortic valve.

    Directory of Open Access Journals (Sweden)

    Eric C Wooten

    2010-01-01

Full Text Available Bicuspid Aortic Valve (BAV) is a highly heritable congenital heart defect. The low frequency of BAV (1% of the general population) limits our ability to perform genome-wide association studies. We present the application of four a priori SNP selection techniques, reducing the multiple-testing penalty by restricting analysis to SNPs relevant to BAV in a genome-wide SNP dataset from a cohort of 68 BAV probands and 830 control subjects. Two knowledge-based approaches, CANDID and STRING, were used to systematically identify BAV genes, and their SNPs, from the published literature, microarray expression studies and a genome scan. We additionally tested Functionally Interpolating SNPs (fitSNPs) present on the array; the fourth approach consisted of SNPs selected by Random Forests, a machine learning approach. These approaches reduced the multiple-testing penalty by lowering the fraction of the genome probed to 0.19% of the total, while increasing the likelihood of studying SNPs within relevant BAV genes and pathways. Three loci were identified by CANDID, STRING, and fitSNPs. A haplotype within the AXIN1-PDIA2 locus (p-value 2.926x10^-6) and a haplotype within the Endoglin gene (p-value 5.881x10^-4) were found to be strongly associated with BAV. The Random Forests approach identified a SNP on chromosome 3 in association with BAV (p-value 5.061x10^-6). The results presented here support an important role for genetic variants in BAV and provide support for additional studies in well-powered cohorts. Further, these studies demonstrate that leveraging existing expression and genomic data in the context of GWAS studies can identify biologically relevant genes and pathways associated with a congenital heart defect.

  12. Comparison of quartz crystallographic preferred orientations identified with optical fabric analysis, electron backscatter and neutron diffraction techniques.

    Science.gov (United States)

    Hunter, N J R; Wilson, C J L; Luzin, V

    2017-02-01

    Three techniques are used to measure crystallographic preferred orientations (CPO) in a naturally deformed quartz mylonite: transmitted light cross-polarized microscopy using an automated fabric analyser, electron backscatter diffraction (EBSD) and neutron diffraction. Pole figure densities attributable to crystal-plastic deformation are variably recognizable across the techniques, particularly between fabric analyser and diffraction instruments. Although fabric analyser techniques offer rapid acquisition with minimal sample preparation, difficulties may exist when gathering orientation data parallel with the incident beam. Overall, we have found that EBSD and fabric analyser techniques are best suited for studying CPO distributions at the grain scale, where individual orientations can be linked to their source grain or nearest neighbours. Neutron diffraction serves as the best qualitative and quantitative means of estimating the bulk CPO, due to its three-dimensional data acquisition, greater sample area coverage, and larger sample size. However, a number of sampling methods can be applied to FA and EBSD data to make similar approximations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  13. Nuclear techniques to identify allergenic metals in orthodontic brackets

    International Nuclear Information System (INIS)

    Zenobio, E.G.; Zenobio, M.A.F.; Menezes, M.A.B.C.

    2009-01-01

The present study determines the elemental alloy composition of ten commercial brands of brackets, especially with regard to Ni, Cr, and Co, confirmed allergenic elements. The nuclear techniques applied in the analyses were X-ray fluorescence (XRF) - Centre National de la Recherche Scientifique (National Center of Scientific Research), France - X-ray energy spectrometry (XRES), and Instrumental Neutron Activation Analysis (INAA) - CDTN/CNEN, Brazil. The XRES and XRF techniques identified Cr in the 10 samples analyzed and Ni in eight samples. The INAA technique identified the presence of Cr (14% to 19%) and Co (42% to 2400 ppm) in all samples. The semi-quantitative analysis performed by XRF also identified Co in two samples. The techniques were effective in the identification of metals in orthodontic brackets. The elements identified in this study can be considered one of the main reasons for the allergic processes among the patients studied. This finding suggests that patients should be tested for allergy and allergic sensitivity to metals prior to the prescription of an orthodontic device. (author)

  14. To what extent can behaviour change techniques be identified within an adaptable implementation package for primary care? A prospective directed content analysis.

    Science.gov (United States)

    Glidewell, Liz; Willis, Thomas A; Petty, Duncan; Lawton, Rebecca; McEachan, Rosemary R C; Ingleson, Emma; Heudtlass, Peter; Davies, Andrew; Jamieson, Tony; Hunter, Cheryl; Hartley, Suzanne; Gray-Burrows, Kara; Clamp, Susan; Carder, Paul; Alderson, Sarah; Farrin, Amanda J; Foy, Robbie

    2018-02-17

Interpreting evaluations of complex interventions can be difficult without sufficient description of key intervention content. We aimed to develop an implementation package for primary care which could be delivered using typically available resources and could be adapted to target determinants of behaviour for each of four quality indicators: diabetes control, blood pressure control, anticoagulation for atrial fibrillation and risky prescribing. We describe the development and prospective verification of behaviour change techniques (BCTs) embedded within the adaptable implementation packages. We used an overlapping, multi-staged process. We identified evidence-based candidate delivery mechanisms, mainly audit and feedback, educational outreach, and computerised prompts and reminders. We drew upon interviews with primary care professionals using the Theoretical Domains Framework to explore likely determinants of adherence to quality indicators. We linked determinants to candidate BCTs. With input from stakeholder panels, we prioritised likely determinants and intervention content prior to piloting the implementation packages. Our content analysis assessed the extent to which embedded BCTs could be identified within the packages and compared them across the delivery mechanisms and four quality indicators. Each implementation package included at least 27 out of 30 potentially applicable BCTs, representing 15 of 16 BCT categories. Whilst 23 BCTs were shared across all four implementation packages (e.g. BCTs relating to feedback and comparing behaviour), some BCTs were unique to certain delivery mechanisms (e.g. 'graded tasks' and 'problem solving' for educational outreach). BCTs addressing the determinants 'environmental context' and 'social and professional roles' (e.g. 'restructuring the social and physical environment' and 'adding objects to the environment') were indicator specific. We found it challenging to operationalise BCTs targeting 'environmental context

  15. An evaluation of object-oriented image analysis techniques to identify motorized vehicle effects in semi-arid to arid ecosystems of the American West

    Science.gov (United States)

    Mladinich, C.

    2010-01-01

Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification, rather than pixel-based techniques, have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid. Copyright © 2010 by Bellwether Publishing, Ltd. All rights reserved.

  16. Identifying fly puparia by clearing technique: application to forensic entomology.

    Science.gov (United States)

    Sukontason, Kabkaew L; Ngern-Klun, Radchadawan; Sripakdee, Duanghatai; Sukontason, Kom

    2007-10-01

In forensic investigations, immature stages of the fly (egg, larva, or puparium) can be used as entomological evidence at death scenes, not only to estimate the postmortem interval (PMI), analyze toxic substances, and determine the manner of death, but also to indicate the movement of a corpse in homicide cases. Of these immature stages, puparia represent the longest developmental time, which makes them particularly useful. However, in order for forensic entomologists to use puparia effectively, it is crucial that they are able to accurately identify the species of fly found on a corpse. Typically, puparia are similar in general appearance, being coarctate and light brown to dark brown in color, which makes identification difficult. In this study, we report on a clearing technique used to pale the integument of fly puparia, thereby allowing observation of the anterior end (second to fourth segments) and the profile of the posterior spiracle, which are important clues for identification. We used puparia of the blowfly, Chrysomya megacephala (F.), as the model species in this experiment. With daily immersion in a 20% potassium hydroxide solution and mounting in a clearing medium (Permount®, New Jersey), the profile of the posterior spiracle could be clearly examined under a light microscope beginning on the fifth day after pupation, and the number of papillae in the anterior spiracle could be counted easily starting from the ninth day. Comparison of morphological features of C. megacephala puparia with those of other blowflies (Chrysomya nigripes [Aubertin], Chrysomya rufifacies [Macquart], Chrysomya villeneuvi [Patton], Lucilia cuprina [Wiedemann], and Hemipyrellia ligurriens [Wiedemann]) and a housefly (Musca domestica L.) revealed that the anterior ends and the profiles of the posterior spiracles had markedly distinguishing characteristics. Morphometric analysis of the length and width of puparia, along with the length of the gaps between the posterior spiracles

  17. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.

  18. Identifying irradiated flour by photo-stimulated luminescence technique

    International Nuclear Information System (INIS)

    Ros Anita Ahmad Ramli; Muhammad Samudi Yasir; Zainon Othman; Wan Saffiey Wan Abdullah

    2013-01-01

Full-text: The photo-stimulated luminescence technique is recommended by the European Committee for Standardization for the detection of food irradiation (EN 13751:2009). This study used the luminescence technique to identify gamma-irradiated samples of five types of flour (corn flour, tapioca flour, wheat flour, glutinous rice flour and rice flour) at three different dose levels in the range 0.2 - 1 kGy. The signal level is compared with two thresholds (700 and 5000 counts/60 s). The majority of irradiated samples produced a strong signal above the upper threshold (5000 counts/60 s). All the control samples gave negative screening results, while signals below the lower threshold (700 counts/60 s) suggest that the sample has not been irradiated. A few samples showed signal levels between the two thresholds (intermediate signals), suggesting that further investigation is needed. The reported procedure was also tested over 60 days, confirming the applicability and feasibility of the proposed methods. (author)

  19. Image Techniques for Identifying Sea-Ice Parameters

    Directory of Open Access Journals (Sweden)

    Qin Zhang

    2014-10-01

Full Text Available The estimation of ice forces is critical to Dynamic Positioning (DP) operations in Arctic waters. Ice conditions are important for the analysis of ice-structure interaction in an ice field. To monitor sea-ice conditions, cameras are used as field observation sensors on mobile sensor platforms in the Arctic. Various image processing techniques, such as Otsu thresholding, k-means clustering, distance transform, Gradient Vector Flow (GVF) Snake, and mathematical morphology, are then applied to obtain ice concentration, ice types, and floe size distribution from sea-ice images to ensure safe operations of structures in ice-covered regions. These techniques yield acceptable results, and their effectiveness is demonstrated in case studies.
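
    Of the techniques listed, Otsu thresholding is the simplest route to an ice-concentration estimate: bright pixels are taken as ice, dark pixels as open water. The sketch below implements the classic Otsu criterion on a synthetic image; the pixel values and the "ice is bright" assumption are illustrative, not the authors' processing chain.

```python
# Sketch: Otsu thresholding of a grey-scale sea-ice image, then ice
# concentration as the fraction of above-threshold (bright) pixels.
import numpy as np

def otsu_threshold(gray):
    """Return the 8-bit threshold maximising between-class variance (Otsu)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    cum_p = np.cumsum(prob)                      # class-0 weight up to t
    cum_mean = np.cumsum(prob * np.arange(256))  # partial mean up to t
    total_mean = cum_mean[-1]
    best_t, best_var = 0, -1.0
    for t in range(1, 255):
        w0, w1 = cum_p[t], 1.0 - cum_p[t]
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum_mean[t] / w0
        mu1 = (total_mean - cum_mean[t]) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic frame: left half dark open water, right half bright ice.
rng = np.random.default_rng(1)
water = rng.integers(10, 60, size=(64, 32))
ice = rng.integers(180, 250, size=(64, 32))
image = np.hstack([water, ice]).astype(np.uint8)

t = otsu_threshold(image)
ice_concentration = (image > t).mean()   # fraction of pixels classed as ice
```

    The other listed steps (k-means, GVF snakes, morphology) would refine this binary map into individual floes and a floe-size distribution.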

  20. Identifying Patients with Colon Neoplasias with Gas Discharge Visualization Technique.

    Science.gov (United States)

    Yakovleva, Ekaterina G; Buntseva, Olga A; Belonosov, Sergei S; Fedorov, Eugenii D; Korotkov, Konstantin; Zarubina, Tatiana V

    2015-11-01

To perform an initial assessment of the potential of using the gas discharge visualization (GDV) technique to identify patients with colon neoplasias. The GDV camera (also known as the electrophotonic imaging camera) was used to assess the participants. Colonoscopy was performed on all 78 participants, followed by a GDV scan. The control group consisted of 22 people. An endoscopic examination identified colon tumors in the remaining 56 participants. Participant ages ranged from 45 to 86 years (mean, 64.6 ± 1.2 years). The study analyzed GDV images of each patient's fingers, presenting a whole-body view as well as separate sectors corresponding to the organs in question. There were significant differences between the control group and the patients with colon tumors. The dynamics of the parameters were examined as the level of tumor dysplasia (neoplasia) varied. The values of the following parameters decreased in the control group as compared to the patients with cancerous polyps: normalized luminescence area, internal noise, contour radius, and average luminescence intensity. The values of the following parameters increased in the control group: radius of the inscribed circle, contour line length, area of luminescence, contour line fractality, contour line entropy, and form coefficients. This pilot study demonstrated a statistical difference between the GDV parameters of patients with colon tumors and those of the control group. These findings warrant a more in-depth study of the potential of the GDV technique in screening programs.

  1. A technique to identify some typical radio frequency interference using support vector machine

    Science.gov (United States)

    Wang, Yuanchao; Li, Mingtao; Li, Dawei; Zheng, Jianhua

    2017-07-01

In this paper, we present a technique to automatically identify some typical radio frequency interference in pulsar surveys using a support vector machine (SVM). The technique has been tested on candidates. In these experiments, to obtain features for the SVM, we use principal component analysis for mosaic plots, with a classification accuracy of 96.9%, and mathematical morphology operations for smog plots and horizontal-stripe plots, with a classification accuracy of 86%. The technique is simple, highly accurate and useful.
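
    The classification step can be sketched with a minimal linear SVM trained by sub-gradient descent on the hinge loss. This is a hedged toy: the two-dimensional synthetic features stand in for the PCA and morphological features the paper extracts from diagnostic plots, and the training scheme is a generic one, not the authors' implementation.

```python
# Sketch: linear SVM (hinge loss, sub-gradient descent) separating "RFI"
# from "pulsar-like" feature vectors. Features and labels are synthetic.
import numpy as np

rng = np.random.default_rng(2)

# Two synthetic clusters: label +1 ("RFI") around (2, 2), -1 around (-2, -2).
x = np.vstack([rng.normal(2, 0.5, (50, 2)), rng.normal(-2, 0.5, (50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])

w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.1            # regularisation strength, learning rate
for epoch in range(200):
    for i in rng.permutation(len(y)):
        margin = y[i] * (x[i] @ w + b)
        if margin < 1:          # inside the margin: hinge-loss step
            w += lr * (y[i] * x[i] - lam * w)
            b += lr * y[i]
        else:                   # correctly classified: shrink weights only
            w -= lr * lam * w

pred = np.sign(x @ w + b)
accuracy = (pred == y).mean()
```

    On real survey data the feature extraction matters far more than the classifier details, which is why the paper reports separate accuracies per plot type.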

  2. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  3. Thermoluminescence and photostimulated luminescence techniques to identify irradiated foods

    Energy Technology Data Exchange (ETDEWEB)

    Schreiber, G.A. [BgVV - Federal Inst. for Health Protection of Consumers and Veterinary Medicine, Berlin (Germany). Food Irradiation Lab.

    1996-12-31

Since the publication of the first report of increased thermoluminescence (TL) in irradiated spices, 10 years have passed. At that time, this effect was observed when spices were heated in a TL reader. Meanwhile, the light sources within the foods have been identified and TL methods applied to various foods for identification of irradiation treatment. The methods have been approved by the Ministry of Agriculture, Fisheries and Food (MAFF) of the United Kingdom and included in the Official Collection of Methods according to article 35 of the German Foods Act (LMBG). A draft European Standard has been formulated for approval by the member states of the European Committee for Standardization (CEN) and, most importantly, the method is already routinely used by food control authorities to identify irradiation treatment of foods. The strength of TL is its radiation specificity. It counts among the most reliable methods and has the potential to become the most sensitive physical method for the detection of irradiated foods. Research is still going on to speed up performance, to find new applications and to decrease detection limits. Most recently, a new luminescence technique, photostimulated luminescence, has been introduced, which can be rapidly performed and which seems to have the potential to be applied to a similarly broad range of foods as TL. (author).

  4. [Applying DNA barcoding technique to identify menthae haplocalycis herba].

    Science.gov (United States)

    Pang, Xiaohui; Xu, Haibin; Han, Jianping; Song, Jingyuan

    2012-04-01

To identify Menthae Haplocalycis Herba and its closely related species using the DNA barcoding technique. Total genomic DNA was isolated from Mentha canadensis and its closely related species. Nuclear DNA ITS2 sequences were amplified, and purified PCR products were sequenced. Sequence assembly and consensus sequence generation were performed using CodonCode Aligner V3.0. The Kimura 2-Parameter (K2P) distances were calculated using the software MEGA 5.0. Identification analyses were performed using BLAST1, Nearest Distance and neighbor-joining (NJ) methods. The intra-specific genetic distances of M. canadensis ranged from 0 to 0.006, which were lower than the inter-specific genetic distances between M. canadensis and its closely related species (0.071-0.231). All three methods showed that ITS2 could discriminate M. canadensis from its closely related species correctly. The ITS2 region is an efficient barcode for the identification of Menthae Haplocalycis Herba, which provides a scientific basis for fast and accurate identification of the herb.
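
    The K2P distance used above has a closed form: with P the fraction of transitions and Q the fraction of transversions between two aligned sequences, d = -0.5·ln(1 - 2P - Q) - 0.25·ln(1 - 2Q). The sketch below implements it on short made-up sequences; these are not real ITS2 barcodes.

```python
# Sketch: Kimura 2-parameter (K2P) distance between two aligned sequences.
import math

PURINES = {"A", "G"}  # A<->G and C<->T substitutions are transitions

def k2p_distance(seq1, seq2):
    """K2P distance: d = -0.5*ln(1 - 2P - Q) - 0.25*ln(1 - 2Q),
    where P and Q are the transition and transversion fractions."""
    assert len(seq1) == len(seq2), "sequences must be aligned"
    transitions = transversions = 0
    for a, b in zip(seq1, seq2):
        if a == b:
            continue
        if (a in PURINES) == (b in PURINES):  # same base class: transition
            transitions += 1
        else:                                  # purine<->pyrimidine
            transversions += 1
    n = len(seq1)
    p, q = transitions / n, transversions / n
    return -0.5 * math.log(1 - 2 * p - q) - 0.25 * math.log(1 - 2 * q)

d_same = k2p_distance("ACGTACGTAC", "ACGTACGTAC")  # identical sequences
d_diff = k2p_distance("ACGTACGTAC", "ACGTACGTAT")  # one C->T transition
```

    Barcoding identification then reduces to checking that intra-specific distances (here 0-0.006) stay well below inter-specific ones (0.071-0.231).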

  5. Hurdles run technique analysis in the 400m hurdles

    OpenAIRE

    Drtina, Martin

    2010-01-01

    Hurdles run technique analysis in the 400m hurdles. Thesis objectives: The main objective is to compare the hurdles run technique at race tempo on the 400 m hurdles track for the selected probands. The tasks are to identify kinematic parameters separately for each proband and to identify weaknesses in their technique. Method: The analysis of the hurdles run technique was done using 3D kinematic analysis. The observed space-time events were recorded on two digital cameras. The records were transferred to a suitable...

  6. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  7. A review of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation, whereas the simplest approach requires varying parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
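The one-at-a-time approach described as the simplest can be sketched in a few lines. This is an illustrative sketch; the example model and the ±10% perturbation are assumptions, not taken from the review:

```python
def oat_sensitivity(model, base, delta=0.1):
    """Central-difference, one-at-a-time sensitivity: perturb each
    parameter by +/- delta (fractional) while holding the rest fixed."""
    effects = {}
    for name, val in base.items():
        hi = dict(base, **{name: val * (1 + delta)})
        lo = dict(base, **{name: val * (1 - delta)})
        effects[name] = (model(hi) - model(lo)) / (2 * delta * val)
    return effects

# Example model: y = 3a + b^2, so dy/da = 3 and dy/db = 2b.
sens = oat_sensitivity(lambda p: 3 * p["a"] + p["b"] ** 2,
                       {"a": 2.0, "b": 3.0})
```

Ranking parameters by the magnitude of these effects identifies the most influential inputs, which is the core question the review's techniques address with varying sophistication.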

  8. A New Technique to Identify Arbitrarily Shaped Noise Sources

    Directory of Open Access Journals (Sweden)

    Roberto A. Tenenbaum

    2006-01-01

    Full Text Available Acoustic intensity is one of the available tools for evaluating sound radiation from vibrating bodies. Active intensity may, in some situations, not give a faithful insight into how much energy is in fact carried into the far field. A new parameter, the supersonic acoustic intensity, was therefore proposed, which takes into account only the intensity generated by components having a smaller wavenumber than the acoustic one. However, that method is only effective for simple sources, such as plane plates, cylinders and spheres. This work presents a new technique, based on the Boundary Element Method and the Singular Value Decomposition, to compute the supersonic acoustic intensity for arbitrarily shaped sources. The technique is based on the Kirchhoff-Helmholtz equation in a discretized approach, leading to a radiation operator that relates the normal velocity on the source's surface mesh to the pressure at grid points located in the field. The singular value decomposition is then applied to the radiation operator and a cutoff criterion is used to remove non-propagating components. Some numerical examples are presented.
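The paper's operator is built from a boundary-element discretization and an SVD, but the underlying idea, discarding components whose spatial wavenumber exceeds the acoustic wavenumber, can be illustrated in 1-D with a plain DFT. This is a hypothetical sketch of the wavenumber-cutoff principle, not the authors' method:

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * j / n) for k in range(n)) / n
            for j in range(n)]

def supersonic_filter(v, dx, k_acoustic):
    """Zero all spatial-frequency components with |k| > k_acoustic:
    only 'supersonic' (low-wavenumber) components radiate to the far field."""
    n = len(v)
    V = dft(v)
    for m in range(n):
        k = 2 * math.pi * (m if m <= n // 2 else m - n) / (n * dx)
        if abs(k) > k_acoustic:
            V[m] = 0
    return [z.real for z in idft(V)]
```

A surface velocity profile containing one radiating (low-k) and one evanescent (high-k) component keeps only the former after filtering, which is the effect the SVD cutoff achieves on the discretized radiation operator.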

  9. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check the stress analysis technique based on 3D models, also making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods will be made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database with the idealized model obtained using ANSYS, working directly on documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (performed at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  10. Identifying subgroups of patients using latent class analysis

    DEFF Research Database (Denmark)

    Nielsen, Anne Mølgaard; Kent, Peter; Hestbæk, Lise

    2017-01-01

    BACKGROUND: Heterogeneity in patients with low back pain (LBP) is well recognised and different approaches to subgrouping have been proposed. Latent Class Analysis (LCA) is a statistical technique that is increasingly being used to identify subgroups based on patient characteristics. However...

  11. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  12. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    First and second order Rayleigh and Raman scatter is a common problem when fitting Parallel Factor Analysis (PARAFAC) to fluorescence excitation-emission data (EEM). The scatter does not contain any relevant chemical information and does not conform to the low-rank trilinear model. The scatter...

  13. Type 1 diabetes: identifying and evaluating patient injection technique.

    Science.gov (United States)

    Spray, Jennifer

    A diagnosis of type 1 diabetes is life changing, both physically and psychologically. This transformation requires a solid rapport between the patient and the diabetes specialist team to ensure the condition is managed successfully. Nevertheless, all general ward nurses should be aware of issues surrounding insulin administration, and thus participate in opportunistic identification, evaluation and empowerment of such patients when hospitalized. Patients who may be mismanaging their condition, irrespective of the length of diagnosis, would then be identified and referred appropriately to the specialist nurse before unnecessary complications arise. It is, however, evident that such measures are overlooked as a result of other constraints. This article explores how the ward is an ideal environment for identifying and evaluating the practical, physical and psychological components of patient insulin administration, through a direct observational approach. Discussion surrounding contributory barriers pertaining to its neglect, proactive implications for practice that could potentially overcome such issues, along with the underpinning pathophysiology, are addressed. Nurses will thus gain a greater perspective concerning the significance of routinely evaluating the competencies of patients' insulin administration within the ward environment.

  14. Algorithms Design Techniques and Analysis

    CERN Document Server

    Alsuwaiyel, M H

    1999-01-01

    Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on one's own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires an understanding of various algorithm design techniques, how and when to use them to formulate solutions, and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm desi

  15. Data Analysis Techniques for Ligo Detector Characterization

    Science.gov (United States)

    Valdes Sanchez, Guillermo A.

    Gravitational-wave astronomy is a branch of astronomy which aims to use gravitational waves to collect observational data about astronomical objects and events such as black holes, neutron stars, supernovae, and processes including those of the early universe shortly after the Big Bang. Einstein first predicted gravitational waves in the early twentieth century, but it was not until September 14, 2015, that the Laser Interferometer Gravitational-Wave Observatory (LIGO) directly observed the first gravitational waves in history. LIGO consists of two twin detectors, one in Livingston, Louisiana and another in Hanford, Washington. Instrumental and sporadic noises limit the sensitivity of the detectors. Scientists conduct data quality studies to distinguish a gravitational-wave signal from the noise, and new techniques are continuously developed to identify, mitigate, and veto unwanted noise. This work presents the application of data analysis techniques, such as the Hilbert-Huang transform (HHT) and Kalman filtering (KF), in LIGO detector characterization. We investigated the application of HHT to characterize the gravitational-wave signal of the first detection, demonstrated the capability of HHT to identify noise originating from light scattered by perturbed surfaces, and estimated thermo-optical aberration using KF. We paid particular attention to the scattering application, for which a tool was developed to identify the disturbed surfaces that originate scattering noise. The results considerably reduced the time needed to search for the scattering surface and helped LIGO commissioners to mitigate the noise.
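As an illustration of the KF idea (a generic scalar textbook sketch, not LIGO's thermo-optical estimator), a slowly drifting quantity observed through noise can be tracked with a few lines:

```python
def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Scalar Kalman filter: random-walk state, noisy direct observations.
    q = process-noise variance, r = measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                   # predict: uncertainty grows with the random walk
        k = p / (p + r)          # Kalman gain balances prediction vs measurement
        x += k * (z - x)         # update with the innovation
        p *= (1 - k)
        estimates.append(x)
    return estimates
```

Fed a stream of noisy readings, the estimate converges toward the underlying state while the gain settles at a value set by the ratio of process to measurement noise.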

  16. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and the interpersonal style.

  17. Lidar point density analysis: implications for identifying water bodies

    Science.gov (United States)

    Worstell, Bruce B.; Poppenga, Sandra K.; Evans, Gayla A.; Prince, Sandra

    2014-01-01

    Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns on water bodies, leaving void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated that a 5-meter radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions to flatten water features and to force channel flow in the downstream direction also are presented.
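A simplified version of the density screen can be sketched with a square grid instead of the study's 5-meter circular moving window; the cell size and return threshold below echo the study's values, but the code itself is an illustrative assumption:

```python
import math
from collections import Counter

def lidar_void_cells(points, bounds, cell=5.0, min_returns=23):
    """Flag grid cells with too few lidar returns as candidate water:
    near-infrared pulses are absorbed by water, leaving void regions."""
    xmin, ymin, xmax, ymax = bounds
    counts = Counter((int((x - xmin) // cell), int((y - ymin) // cell))
                     for x, y in points)
    nx = int(math.ceil((xmax - xmin) / cell))
    ny = int(math.ceil((ymax - ymin) / cell))
    return [(i, j) for i in range(nx) for j in range(ny)
            if counts[(i, j)] < min_returns]
```

Contiguous flagged cells would then be merged into candidate water polygons, which is the delineation step the study performs on the moving-window output.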

  18. Análise comparativa de fragmentos identificáveis de forrageiras, pela técnica micro-histológica Comparative analysis of identifiable fragments of forages, by the microhistological technique

    Directory of Open Access Journals (Sweden)

    Maristela de Oliveira Bauer

    2005-12-01

    Full Text Available The objective of this work was to verify, using the microhistological technique, differences among forage species in the percentage of identifiable fragments as a function of the digestive process and the season of the year. Fresh, recently expanded leaf laminae, corresponding to the last and penultimate positions on the tiller, of the species Melinis minutiflora Pal. de Beauv (molassesgrass), Hyparrhenia rufa (Nees) Stapf. (jaraguagrass), Brachiaria decumbens Stapf. (signalgrass), Imperata brasiliensis Trin. (sapegrass), Medicago sativa L. (alfalfa) and Schinus terebenthifolius Raddi (aroeira), sampled in the rainy and dry seasons, were digested in vitro and prepared according to the microhistological technique. The species showed marked differences in the percentage of identifiable fragments, and digestion changed these percentages by around 10%; the sampling period did not influence the percentage of identifiable fragments for most species; the presence of pigments and the adhesion of the epidermis to the cells of the internal leaf tissues hindered fragment identification; and digestion improved the visualization of fragments of sapegrass, jaraguagrass and aroeira, but impaired that of signalgrass and, especially, alfalfa.

  19. Identifying the sources of produced water in the oil field by isotopic techniques

    International Nuclear Information System (INIS)

    Nguyen Minh Quy; Hoang Long; Le Thi Thu Huong; Luong Van Huan; Vo Thi Tuong Hanh

    2014-01-01

    The objective of this study is to identify the sources of the formation water in the Southwest Su-Tu-Den (STD SW) basement reservoir. To achieve the objective, isotopic techniques along with geochemical analysis for chloride, bromide and strontium dissolved in the water were applied. The isotopic techniques used in this study were the determination of the water stable isotope signatures (δ²H and δ¹⁸O) and of the ⁸⁷Sr/⁸⁶Sr ratio of strontium in rock cutting samples and that dissolved in the formation water. The obtained results showed that the stable isotope compositions of water in the Lower Miocene were -3‰ and -23‰ for δ¹⁸O and δ²H, respectively, indicating the primeval nature of seawater in the reservoir. Meanwhile, the isotopic composition of water in the basement was clustered in a range of alternated freshwater, with δ¹⁸O and δ²H being -(3-4)‰ and -(54-60)‰, respectively. The strontium isotope ratio for water in the Lower Miocene reservoir was lower compared to that for water in the basement, confirming the different natures of the water in the two reservoirs. The obtained results confirm the applicability of the techniques, and it is recommended that studies on identification of the flow path of the formation water in the STD SW basement reservoir be continued. (author)

  20. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are also referred to. (Author)

  1. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is improved plant equipment monitoring capability. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized
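The amplitude-demodulation step can be illustrated with a coherent (synchronous) detector: mix the measured current with the line-frequency carrier and low-pass the result to recover the load-induced envelope. This is a textbook sketch under that assumption, not the ORNL patented circuitry:

```python
import math

def demodulate_am(signal, fs, f_carrier):
    """Coherent AM demodulation: mix with the carrier, then low-pass with a
    one-carrier-period moving average to recover the amplitude envelope."""
    mixed = [2 * s * math.cos(2 * math.pi * f_carrier * i / fs)
             for i, s in enumerate(signal)]
    w = max(1, round(fs / f_carrier))  # window = one carrier period
    return [sum(mixed[i:i + w]) / w for i in range(len(mixed) - w + 1)]
```

For a motor drawing line current whose amplitude is modulated by a varying load, the output approximates the modulating waveform, which can then be examined for the time and frequency signatures of surging, rotating stall, and similar faults.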

  2. Identifiable Data Files - Medicare Provider Analysis and ...

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Provider Analysis and Review (MEDPAR) File contains data from claims for services provided to beneficiaries admitted to Medicare certified inpatient...

  3. Development of communications analysis techniques

    Science.gov (United States)

    Shelton, R. D.

    1972-01-01

    Major results from the frequency analysis of system program (FASP) are reported. The FASP procedure was designed to analyze or design linear dynamic systems, but can be used to solve any problem that can be described by a system of linear time invariant differential equations. The program also shows plots of performance changes as design parameters are adjusted. Experimental results on narrowband FM distortion are also reported.

  4. Identifying clinical course patterns in SMS data using cluster analysis.

    Science.gov (United States)

    Kent, Peter; Kongsted, Alice

    2012-07-02

    are alternative ways of managing SMS data and many different methods of cluster analysis. More research is needed, especially head-to-head studies, to identify which technique is best to use under what circumstances.

  5. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  6. Adhesive polypeptides of Staphylococcus aureus identified using a novel secretion library technique in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Holm Liisa

    2011-05-01

    Full Text Available Abstract Background Bacterial adhesive proteins, called adhesins, are frequently the decisive factor in initiation of a bacterial infection. Characterization of such molecules is crucial for the understanding of bacterial pathogenesis, design of vaccines and development of antibacterial drugs. Because adhesins are frequently difficult to express, their characterization has often been hampered. Alternative expression methods developed for the analysis of adhesins, e.g. surface display techniques, suffer from various drawbacks, and reports on high-level extracellular secretion of heterologous proteins in Gram-negative bacteria are scarce. These expression techniques are currently a field of active research. The purpose of the current study was to construct a convenient, new technique for identification of unknown bacterial adhesive polypeptides directly from the growth medium of the Escherichia coli host and to identify novel proteinaceous adhesins of the model organism Staphylococcus aureus. Results Randomly fragmented chromosomal DNA of S. aureus was cloned into a unique restriction site of our expression vector, which facilitates secretion of foreign FLAG-tagged polypeptides into the growth medium of E. coli ΔfliCΔfliD, to generate a library of 1663 clones expressing FLAG-tagged polypeptides. Sequence and bioinformatics analyses showed that in our example, the library covered approximately 32% of the S. aureus proteome. Polypeptides from the growth medium of the library clones were screened for binding to a selection of S. aureus target molecules, and adhesive fragments of known staphylococcal adhesins (e.g. coagulase and fibronectin-binding protein A) as well as polypeptides of novel function (e.g. a universal stress protein and phosphoribosylamino-imidazole carboxylase ATPase subunit) were detected. The results were further validated using purified His-tagged recombinant proteins of the corresponding fragments in enzyme-linked immunoassay and

  7. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Full Text Available Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPGs' greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by (1) the regularity in the release time of client commands, (2) the trend and magnitude of traffic burstiness in multiple time scales, and (3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
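The first traffic feature, regularity in command release times, reduces to a simple statistic: bots issue commands at near-constant intervals, so the coefficient of variation (CV) of inter-arrival gaps is low. A hedged sketch of this idea follows; the threshold is an illustrative assumption, not a value from the paper:

```python
import statistics

def inter_arrival_cv(timestamps):
    """Coefficient of variation of the gaps between consecutive commands."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.pstdev(gaps) / statistics.mean(gaps)

def looks_like_bot(timestamps, cv_threshold=0.1):
    # Near-zero CV = metronomic command release, typical of autoplaying clients;
    # human play produces bursty, irregular gaps and hence a high CV.
    return inter_arrival_cv(timestamps) < cv_threshold
```

In the paper this kind of feature is combined with burstiness and network-sensitivity measures in ensemble schemes, precisely because any single statistic is easier for bot developers to evade.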

  8. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available for use to highlight major inefficiencies and risks that are present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper will highlight how PPA was developed and show the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  9. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Data analysis techniques for gravitational wave observations. S V Dhurandhar, Inter-University Centre for Astronomy and Astrophysics, Post Bag 4, Ganeshkhind, Pune 411 007, India ... The performance of some of these techniques on real data obtained will be discussed. Finally, some results on ...

  10. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  11. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  12. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  13. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, the subjective probability distributions assigned to the input parameters and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends developing now a method for evaluating the derivative of dose with respect to parameter value and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)

  14. Identifying radiotherapy target volumes in brain cancer by image analysis.

    Science.gov (United States)

    Cheng, Kun; Montgomery, Dean; Feng, Yang; Steel, Robin; Liao, Hanqing; McLaren, Duncan B; Erridge, Sara C; McLaughlin, Stephen; Nailon, William H

    2015-10-01

    To establish the optimal radiotherapy fields for treating brain cancer patients, the tumour volume is often outlined on magnetic resonance (MR) images, where the tumour is clearly visible, and mapped onto computerised tomography images used for radiotherapy planning. This process requires considerable clinical experience and is time-consuming, which will continue to increase as more complex image sequences are used in this process. Here, the potential of image analysis techniques for automatically identifying the radiation target volume on MR images, and thereby assisting clinicians with this difficult task, was investigated. A gradient-based level set approach was applied to the MR images of five patients with grades II, III and IV malignant cerebral glioma. The relationship between the target volumes produced by image analysis and those produced by a radiation oncologist was also investigated. The contours produced by image analysis were compared with the contours produced by an oncologist and used for treatment. In 93% of cases, the Dice similarity coefficient was found to be between 60 and 80%. This feasibility study demonstrates that image analysis has the potential for automatic outlining in the management of brain cancer patients; however, more testing and validation on a much larger patient cohort are required.
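The Dice similarity coefficient used to compare the automatic and manual contours is simple to compute; a minimal sketch on toy binary masks (not the study's MR data):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient of two binary masks: 2|A and B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 1-D "contours" with partial overlap, in the 60-80% range the study reports.
auto = np.zeros(100, dtype=bool); auto[20:60] = True      # level-set contour
manual = np.zeros(100, dtype=bool); manual[30:70] = True  # oncologist contour
print(round(dice_coefficient(auto, manual), 2))           # -> 0.75
```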

  15. Spatial analysis to identify disparities in Philippine public school facilities

    Directory of Open Access Journals (Sweden)

    Ligaya Leah Figueroa

    2016-01-01

    Full Text Available This paper addresses the issues that affect school building conditions as a case study of the Philippines. Geographic information systems were utilized to investigate the allocation of public school resources and the extent of disparity in education facilities among 75 Philippine provinces. Four clusters of the provinces were identified by applying spatial statistics and regionalization techniques to the public school data. Overall, the building conditions are of high quality in the northern provinces. The greater region of the capital is overcrowded but well maintained. The eastern seaboard region and the southern provinces have poor conditions due to frequent natural calamities and the prolonged civil unrest, respectively. Since the spatial analysis result shows that the school building requirements are largely unmet, some recommendations are proposed so that they can be implemented by the government in order to improve the school facilities and mitigate the existing disparities among the four clusters of the Philippines.

  16. An automated technique to identify potential inappropriate traditional Chinese medicine (TCM) prescriptions.

    Science.gov (United States)

    Yang, Hsuan-Chia; Iqbal, Usman; Nguyen, Phung Anh; Lin, Shen-Hsien; Huang, Chih-Wei; Jian, Wen-Shan; Li, Yu-Chuan

    2016-04-01

    Medication errors such as potential inappropriate prescriptions would induce serious adverse drug events to patients. Information technology has the ability to prevent medication errors; however, the pharmacology of traditional Chinese medicine (TCM) is not as clear as in western medicine. The aim of this study was to apply the appropriateness of prescription (AOP) model to identify potential inappropriate TCM prescriptions. We used the association rule of mining techniques to analyze 14.5 million prescriptions from the Taiwan National Health Insurance Research Database. The disease and TCM (DTCM) and traditional Chinese medicine-traditional Chinese medicine (TCMM) associations are computed by their co-occurrence, and the associations' strength was measured as Q-values, which are often referred to as interestingness or lift values. By considering the number of Q-values, the AOP model was applied to identify the inappropriate prescriptions. Afterwards, three traditional Chinese physicians evaluated 1920 prescriptions and validated the detected outcomes from the AOP model. Out of 1920 prescriptions, the system showed a positive predictive value of 97.1% and a negative predictive value of 19.5% compared with the experts' evaluations. The sensitivity analysis indicated that the negative predictive value could improve up to 27.5% when the model's threshold changed to 0.4. We successfully applied the AOP model to automatically identify potential inappropriate TCM prescriptions. This model could be a potential TCM clinical decision support system in order to improve drug safety and quality of care. Copyright © 2016 John Wiley & Sons, Ltd.
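Co-occurrence association strength of this kind can be sketched with the standard lift measure over toy prescription data; the item names are invented, and the paper's exact Q-value definition is assumed to be lift-like for this illustration:

```python
from collections import Counter
from itertools import combinations

# Hypothetical prescription records: each is a set of TCM items.
prescriptions = [
    {"ginseng", "licorice"}, {"ginseng", "licorice"}, {"ginseng", "licorice"},
    {"ginseng", "ephedra"}, {"licorice", "ephedra"}, {"ginseng"},
]
n = len(prescriptions)

item_count = Counter()
pair_count = Counter()
for p in prescriptions:
    item_count.update(p)                          # single-item frequencies
    pair_count.update(combinations(sorted(p), 2)) # pair co-occurrence frequencies

def lift(a, b):
    """Lift (interestingness) of a pair: P(a, b) / (P(a) * P(b))."""
    key = tuple(sorted((a, b)))
    return (pair_count[key] / n) / ((item_count[a] / n) * (item_count[b] / n))

print(round(lift("ginseng", "licorice"), 3))  # -> 0.9
```

A lift well above 1 marks a commonly combined pair; rarely co-prescribed pairs with low lift are the candidates an AOP-style model would flag for review.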

  17. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  18. Use of discriminant analysis to identify propensity for purchasing properties

    Directory of Open Access Journals (Sweden)

    Ricardo Floriani

    2015-03-01

    Full Text Available Properties usually represent a milestone for people and families due to their high value compared with family income. The objective of this study is to propose a discrimination model, built by discriminant analysis, that classifies people (according to independent variables) as potential buyers of properties, as well as to identify the intended use of such property: housing, leisure activities such as a cottage or beach house, and/or investment. Thus, the following research question is proposed: What characteristics best describe the profile of people who intend to acquire properties? The study is justified by its economic relevance to the real estate industry, as well as to the players in the real estate market who may develop products based on the profile of potential customers. As the statistical technique, discriminant analysis was applied to the data gathered by questionnaire, which was sent via e-mail. Three hundred and thirty-four responses were gathered. Based on this study, it was observed that it is possible to identify the intention to acquire properties, as well as the purpose for acquiring them, whether for housing or investment.
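A two-class discriminant analysis of this kind can be sketched with Fisher's linear discriminant; the respondent features, class means, and decision rule below are invented for illustration, not the study's survey data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical respondent features (e.g. income, age) for buyers vs non-buyers.
buyers = rng.normal([6.0, 40.0], [1.0, 5.0], size=(100, 2))
non_buyers = rng.normal([3.0, 30.0], [1.0, 5.0], size=(100, 2))

# Fisher's linear discriminant: w = Sw^-1 (m1 - m0), with Sw the pooled scatter.
m1, m0 = buyers.mean(axis=0), non_buyers.mean(axis=0)
sw = np.cov(buyers, rowvar=False) + np.cov(non_buyers, rowvar=False)
w = np.linalg.solve(sw, m1 - m0)
threshold = w @ (m1 + m0) / 2.0  # midpoint between the projected class means

def classify(x):
    """Return True if x is classified as a potential buyer."""
    return (np.asarray(x) @ w) > threshold

print(classify([6.2, 42.0]), classify([2.5, 28.0]))  # -> True False
```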

  19. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis
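The kinematic relation for the massless interface markers, dx/dt = u(x, t), can be sketched with a simple explicit Euler step; the velocity field below is a hypothetical stand-in with a known exact solution, not the paper's boundary layer flow:

```python
import numpy as np

def advect_markers(markers, velocity, dt, steps):
    """Advect massless markers with the local fluid velocity (explicit Euler)."""
    x = np.array(markers, dtype=float)
    for _ in range(steps):
        x = x + dt * velocity(x)   # dx/dt = u(x)
    return x

# Illustrative field u(x) = 0.5 x, whose exact solution is x0 * exp(0.5 t).
velocity = lambda x: 0.5 * x
final = advect_markers([1.0, 2.0], velocity, dt=0.001, steps=1000)  # t = 1
print(np.round(final, 2))
```

With dt = 0.001 the Euler positions agree with x0 * exp(0.5) to about three decimal places, which is the kind of check one would run before applying the marker scheme to a real flow field.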

  20. Proteomic Analysis of the Soybean Symbiosome Identifies New Symbiotic Proteins*

    Science.gov (United States)

    Clarke, Victoria C.; Loughlin, Patrick C.; Gavrin, Aleksandr; Chen, Chi; Brear, Ella M.; Day, David A.; Smith, Penelope M.C.

    2015-01-01

    Legumes form a symbiosis with rhizobia in which the plant provides an energy source to the rhizobia bacteria that it uses to fix atmospheric nitrogen. This nitrogen is provided to the legume plant, allowing it to grow without the addition of nitrogen fertilizer. As part of the symbiosis, the bacteria in the infected cells of a new root organ, the nodule, are surrounded by a plant-derived membrane, the symbiosome membrane, which becomes the interface between the symbionts. Fractions containing the symbiosome membrane (SM) and material from the lumen of the symbiosome (peribacteroid space or PBS) were isolated from soybean root nodules and analyzed using nongel proteomic techniques. Bicarbonate stripping and chloroform-methanol extraction of isolated SM were used to reduce complexity of the samples and enrich for hydrophobic integral membrane proteins. One hundred and ninety-seven proteins were identified as components of the SM, with an additional fifteen proteins identified from peripheral membrane and PBS protein fractions. Proteins involved in a range of cellular processes such as metabolism, protein folding and degradation, membrane trafficking, and solute transport were identified. These included a number of proteins previously localized to the SM, such as aquaglyceroporin nodulin 26, sulfate transporters, remorin, and Rab7 homologs. Among the proteome were a number of putative transporters for compounds such as sulfate, calcium, hydrogen ions, peptide/dicarboxylate, and nitrate, as well as transporters for which the substrate is not easy to predict. Analysis of the promoter activity for six genes encoding putative SM proteins showed nodule specific expression, with five showing expression only in infected cells. Localization of two proteins was confirmed using GFP-fusion experiments. The data have been deposited to the ProteomeXchange with identifier PXD001132. This proteome will provide a rich resource for the study of the legume-rhizobium symbiosis.

  1. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
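The mass attenuation coefficient follows from a transmission measurement via the Beer-Lambert law, I = I0 * exp(-(mu/rho) * rho * t); the intensities and foil dimensions below are invented, not the paper's measured values:

```python
import math

def mass_attenuation(i0, i, density_g_cm3, thickness_cm):
    """Mass attenuation coefficient mu/rho (cm^2/g) from a transmission
    measurement, inverting I = I0 * exp(-(mu/rho) * rho * t)."""
    return math.log(i0 / i) / (density_g_cm3 * thickness_cm)

# Hypothetical numbers for a gold-alloy foil (pure-gold density assumed).
mu_rho = mass_attenuation(i0=10000.0, i=2500.0,
                          density_g_cm3=19.3, thickness_cm=0.01)
print(round(mu_rho, 3))
```

A calibration curve like the one described would plot such measured mu/rho values against the known gold fractions of the reference alloys, after which an unknown sample's gold content is read off from its measured attenuation.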

  2. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  3. Fourier Spectroscopy: A Simple Analysis Technique

    Science.gov (United States)

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
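The point-by-point integration the article describes amounts to evaluating the cosine Fourier transform of the interferogram at each wavenumber; a numerical sketch on a synthetic single-line interferogram (parameters invented):

```python
import numpy as np

# Synthetic interferogram of a single spectral line: a cosine in optical
# path difference x produces a peak at the line's wavenumber.
x = np.linspace(-1.0, 1.0, 401)          # optical path difference (cm)
sigma0 = 10.0                            # line wavenumber (cm^-1)
interferogram = np.cos(2 * np.pi * sigma0 * x)
dx = x[1] - x[0]

def spectrum_at(sigma):
    """Point-by-point cosine transform: integrate I(x) * cos(2*pi*sigma*x) dx."""
    return np.sum(interferogram * np.cos(2 * np.pi * sigma * x)) * dx

sigmas = np.arange(0.0, 20.5, 0.5)
spectrum = np.array([spectrum_at(s) for s in sigmas])
print(sigmas[np.argmax(spectrum)])       # recovers the line position
```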

  4. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure. Among sample preparation methods, microextraction techniques are dominant. Metabolomic studies also require application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    Magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from a reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e., the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data varies widely depending on characteristics such as plant condition, plant type, etc. In the existing motion analysis, applying a single analysis technique to all plant conditions and plants limits analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates the wide variation in CRDM operational data and improves analysis accuracy. (author)
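The closed/opened response time being quantified is the interval between a coil switching event and a characteristic point in the measured trace; a minimal threshold-crossing sketch on a synthetic coil current (this is the measurement concept only, not MHI's algorithm or data):

```python
import numpy as np

# Hypothetical coil-current trace sampled at 1 kHz: the coil is energized at
# t = 10 ms and the current rises exponentially toward its steady value.
fs = 1000.0                      # sampling rate (Hz)
t = np.arange(0, 0.1, 1.0 / fs)  # time axis (s)
energize_t = 0.010
current = np.where(t >= energize_t,
                   1.0 - np.exp(-(t - energize_t) / 0.008), 0.0)

def response_time(trace, times, t_energize, threshold=0.6):
    """Interval between coil energizing and the first sample at which the
    trace exceeds a pickup threshold (taken as the armature-closed point)."""
    idx = np.argmax(trace >= threshold)   # index of first True
    return times[idx] - t_energize

print(round(response_time(current, t, energize_t) * 1000, 1), "ms")  # -> 8.0 ms
```

In practice the difficulty is that the characteristic points shift with plant condition, which is what motivates replacing a single fixed rule with a learned model.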

  6. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…

  7. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade and now predictions from this type analysis are important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, as well as eliminates logic errors and errors of oversight in this part of the analysis. Automated analysis then affords the analyst a powerful tool to allow his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
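Once a fault tree is constructed, its quantitative evaluation for independent basic events reduces to combining probabilities through the gates; a minimal sketch with illustrative event probabilities (the events and numbers are invented):

```python
# AND gates multiply event probabilities; OR gates combine via the
# complement rule 1 - prod(1 - p). Independence is assumed throughout,
# so common mode failures are deliberately out of scope here.

def and_gate(*probs):
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Top event: pump fails AND (power fails OR valve sticks).
p_top = and_gate(0.01, or_gate(0.02, 0.03))
print(f"{p_top:.6f}")
```

Automated fault tree codes do essentially this arithmetic over minimal cut sets; the hard parts the abstract points to, common mode failures and human error, break the independence assumption baked into these two gate functions.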

  8. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blanks and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of ²³⁵U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.

  9. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  10. ANALYSIS OF COMPUTER AIDED PROCESS PLANNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Salim A. Saleh

    2013-05-01

    Full Text Available Computer Aided Process Planning (CAPP) has been recognized as playing a key role in Computer Integrated Manufacturing (CIM). It is used as a bridge linking CAD and CAM systems, in order to enable the full integration needed for CIM. The benefits of CAPP in the real industrial environment are still to be achieved. Due to different manufacturing applications, many different CAPP systems have been developed. The development of CAPP techniques needs a summarized classification and a descriptive analysis. This paper presents the most important and widely known techniques of the available CAPP systems, which are based on the variant, generative or semi-generative methods, and a descriptive analysis of their application possibilities.

  11. An analysis of induction motor testing techniques

    International Nuclear Information System (INIS)

    Soergel, S.

    1996-01-01

    There are two main failure mechanisms in induction motors: bearing related and stator related. The Electric Power Research Institute (EPRI) conducted a study, completed in 1985, which found that nearly 37% of all failures were attributed to stator problems. Another data source for motor failures is the Nuclear Plant Reliability Data System (NPRDS). This database reveals that approximately 55% of all motors were identified as being degraded before failure occurred. Of these, approximately 35% were due to electrical faults. These are the faults which this paper will attempt to identify through testing techniques. This paper is a discussion of the current techniques used to predict incipient failure of induction motors. In the past, the main tests were those to assess the integrity of the ground insulation. However, most insulation failures are believed to involve turn or strand insulation, which makes traditional tests alone inadequate for condition assessment. Furthermore, these tests have several limitations which need consideration when interpreting the results. This paper will concentrate on predictive maintenance techniques which detect electrical problems. It will present appropriate methods and tests, and discuss the strengths and weaknesses of each

  12. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  13. Dynamic speckle analysis using multivariate techniques

    International Nuclear Information System (INIS)

    López-Alonso, José M; Alda, Javier; Rabal, Héctor; Grumel, Eduardo; Trivi, Marcelo

    2015-01-01

    In this work we use principal components analysis to characterize dynamic speckle patterns. This analysis quantitatively identifies different dynamics that could be associated with physical phenomena occurring in the sample. We also compute the contribution explained by each principal component, or by a group of them. The method is applied to the paint drying process over a hidden topography. It can be used for fast screening and identification of different dynamics in biological or industrial samples by means of dynamic speckle interferometry. (paper)
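The principal component decomposition of a speckle sequence can be sketched by treating each frame as a row of a data matrix and applying SVD to the mean-centred stack; the synthetic "drying" stack below is illustrative, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stack of 50 speckle frames (32x32 pixels): a slow global
# drying trend plus fast local noise, mimicking two distinct dynamics.
n_frames, h, w = 50, 32, 32
trend = np.linspace(0.0, 1.0, n_frames)[:, None]            # slow dynamic
frames = (trend * rng.random((1, h * w))
          + 0.05 * rng.normal(size=(n_frames, h * w)))      # fast noise

# PCA of the frame stack via SVD of the mean-centred data matrix.
centred = frames - frames.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)    # variance fraction of each component

print(round(float(explained[0]), 2))
```

The leading component captures the slow trend while the residual components spread the fast noise, which is the separation of dynamics the abstract refers to.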

  14. Accelerometer Data Analysis and Presentation Techniques

    Science.gov (United States)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
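Interval root-mean-square acceleration, one of the time-domain measures listed, partitions the record into consecutive windows and takes the RMS of each; a sketch on a synthetic 100 Hz vibration (invented values, not flight data):

```python
import numpy as np

# Hypothetical accelerometer record: a 100 Hz vibration of about 1 mg RMS,
# sampled at 1 kHz for 10 s, analyzed as interval RMS over 1 s windows.
fs = 1000
t = np.arange(0, 10, 1 / fs)
accel = 1.414e-3 * np.sin(2 * np.pi * 100 * t)   # amplitude in g

def interval_rms(signal, samples_per_interval):
    """Root-mean-square acceleration over consecutive intervals."""
    n = len(signal) // samples_per_interval
    blocks = signal[: n * samples_per_interval].reshape(n, samples_per_interval)
    return np.sqrt(np.mean(blocks**2, axis=1))

rms = interval_rms(accel, fs)                    # one value per second
print(len(rms), round(float(rms[0] * 1000), 2))  # interval count, mg
```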

  15. Comparison of analysis techniques for electromyographic data.

    Science.gov (United States)

    Johnson, J C

    1978-01-01

    Electromyography has been effectively employed to estimate the stress encountered by muscles in performing a variety of functions in the static environment. Such analysis provides the basis for modification of a man-machine system in order to optimize the performances of individual tasks by reducing muscle stress. Myriad analysis methods have been proposed and employed to convert raw electromyographic data into numerical indices of stress and, more specifically, muscle work. However, the type of analysis technique applied to the data can significantly affect the outcome of the experiment. In this study, four methods of analysis are employed to simultaneously process electromyographic data from the flexor muscles of the forearm. The methods of analysis include: 1) integrated EMG (three separate time constants), 2) root mean square voltage, 3) peak height discrimination (three level), and 4) turns counting (two methods). Mechanical stress input as applied to the arm of the subjects includes static load and vibration. The results of the study indicate the comparative sensitivity of each of the techniques to changes in EMG resulting from changes in static and dynamic load on the muscle.
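Of the four methods compared, turns counting is the easiest to sketch: count slope reversals whose amplitude change exceeds a threshold. The trace and threshold below are toy values, and since the study used two turns-counting variants without specifying them here, this is one common formulation, not necessarily theirs:

```python
import numpy as np

def count_turns(signal, threshold):
    """Turns counting for EMG: count successive local extrema whose
    amplitude change from the previous extremum exceeds the threshold
    (a threshold of about 100 uV is typical in practice)."""
    d = np.diff(signal)
    extrema = [signal[0]]
    for i in range(1, len(d)):
        if d[i] * d[i - 1] < 0:          # slope reversal = local extremum
            extrema.append(signal[i])
    extrema.append(signal[-1])
    return sum(1 for a, b in zip(extrema, extrema[1:]) if abs(b - a) > threshold)

# Toy trace: four large excursions followed by sub-threshold jitter.
trace = np.array([0.0, 1.0, 0.02, 1.05, 0.0, 0.01, 0.0])
print(count_turns(trace, threshold=0.5))  # -> 4
```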

  16. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system of our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  17. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of Uranium or Plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - by electron bombardment of atoms or molecules (gas ion source) and by thermal effect (thermoionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is between 10⁵ and 10⁶ times greater. This proves that almost the entire sample is not needed for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique, referred to as ''Microfluorination'', corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  18. Forensic Analysis using Geological and Geochemical Techniques

    Science.gov (United States)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late eighteen hundreds, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS and geochemical mapping. Most of the concept studies have shown a good potential but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the UGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  19. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  20. Using text-mining techniques in electronic patient records to identify ADRs from medicine use.

    Science.gov (United States)

    Warrer, Pernille; Hansen, Ebba Holme; Juhl-Jensen, Lars; Aagaard, Lise

    2012-05-01

    This literature review included studies that use text-mining techniques in narrative documents stored in electronic patient records (EPRs) to investigate ADRs. We searched PubMed, Embase, Web of Science and International Pharmaceutical Abstracts without restrictions from origin until July 2011. We included empirically based studies on text mining of electronic patient records (EPRs) that focused on detecting ADRs, excluding those that investigated adverse events not related to medicine use. We extracted information on study populations, EPR data sources, frequencies and types of the identified ADRs, medicines associated with ADRs, text-mining algorithms used and their performance. Seven studies, all from the United States, were eligible for inclusion in the review. Studies were published from 2001, the majority between 2009 and 2010. Text-mining techniques varied over time from simple free text searching of outpatient visit notes and inpatient discharge summaries to more advanced techniques involving natural language processing (NLP) of inpatient discharge summaries. Performance appeared to increase with the use of NLP, although many ADRs were still missed. Due to differences in study design and populations, various types of ADRs were identified and thus we could not make comparisons across studies. The review underscores the feasibility and potential of text mining to investigate narrative documents in EPRs for ADRs. However, more empirical studies are needed to evaluate whether text mining of EPRs can be used systematically to collect new information about ADRs. © 2011 The Authors. British Journal of Clinical Pharmacology © 2011 The British Pharmacological Society.
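The "simple free text searching" end of the spectrum described in the review can be sketched as below: flag notes in which a drug name appears near an ADR term. The lexicons, the window size, and the note are invented for illustration.

```python
# Sketch: naive free-text co-occurrence search for drug/ADR mentions
# in a clinical note. Real systems use NLP, negation handling, etc.
import re

DRUGS = {"warfarin", "metformin"}
ADR_TERMS = {"rash", "bleeding", "nausea"}

def flag_note(note, window=10):
    """Return (drug, adr_term) pairs co-occurring within `window` tokens."""
    tokens = re.findall(r"[a-z]+", note.lower())
    hits = []
    for i, tok in enumerate(tokens):
        if tok in DRUGS:
            nearby = tokens[max(0, i - window): i + window]
            hits.extend((tok, t) for t in nearby if t in ADR_TERMS)
    return hits

note = "Patient on warfarin presented with minor bleeding from gums."
print(flag_note(note))  # → [('warfarin', 'bleeding')]
```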

  1. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Full Text Available Based on the movement process of throwing, and in order to further improve throwing technique in our country, this paper first identifies the main factors influencing shot distance through a combination of the equations of motion and geometrical analysis. It then gives the equation for the forces that throwing athletes must bear during the throwing movement, and, from a kinetic analysis of the throwing athletes' arms, derives the velocity relationships between the joints during the throw and at release. Using rotational-inertia analysis, the paper obtains the momentum relationships between the athletes' joints, and then establishes a constrained particle-dynamics equation from the Lagrange equation. The result shows that the momentum of the throw depends on the momentum of the athlete's wrist joint at release.
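For context, the standard projectile relation underlying the shot-distance factors mentioned above is (textbook result, not reproduced from the paper), for release speed v, release angle θ, release height h, and gravitational acceleration g:

```latex
% Range of a projectile released at height h with speed v and angle theta
s = \frac{v\cos\theta}{g}\left( v\sin\theta + \sqrt{v^{2}\sin^{2}\theta + 2gh} \right)
```

This makes explicit why release speed dominates (it enters quadratically) while angle and height enter through the bracketed term.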

  2. Clustering Analysis within Text Classification Techniques

    Directory of Open Access Journals (Sweden)

    Madalina ZURINI

    2011-01-01

    Full Text Available The paper presents a personal perspective on the main applications of classification in the knowledge-based society, by means of methods and techniques widely covered in the literature. Text classification is addressed in chapter two, where the main techniques used are described along with an integrated taxonomy. The transition is made through the concept of spatial representation: starting from the elementary elements of geometry and artificial-intelligence analysis, spatial representation models are presented. Using a parallel approach, the spatial dimension is introduced into the process of classification. The main clustering methods are described in an aggregated taxonomy. As an example, spam and ham words are clustered and spatially represented, and the concepts of spam word, ham word, and common and linkage word are explained in the xOy space representation.
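The clustering-plus-spatial-representation idea can be illustrated with a toy k-means over 2-D "word" coordinates. The coordinates below are invented; this is not the paper's data or algorithm, just a minimal sketch of clustering points in the xOy plane.

```python
# Toy k-means (k=2) over 2-D points standing in for spam/ham words.
import math

def kmeans(points, k=2, iters=20):
    centers = points[:k]  # naive init: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[j].append(p)
        centers = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

spamish = [(8.0, 8.5), (9.0, 7.5), (8.5, 9.0)]
hamish = [(1.0, 1.5), (0.5, 0.8), (1.2, 1.0)]
centers, clusters = kmeans(spamish + hamish)
print(sorted(len(c) for c in clusters))  # → [3, 3]
```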

  3. Using text-mining techniques in electronic patient records to identify ADRs from medicine use

    DEFF Research Database (Denmark)

    Warrer, Pernille; Hansen, Ebba Holme; Jensen, Lars Juhl

    2012-01-01

    included empirically based studies on text mining of electronic patient records (EPRs) that focused on detecting ADRs, excluding those that investigated adverse events not related to medicine use. We extracted information on study populations, EPR data sources, frequencies and types of the identified ADRs......, medicines associated with ADRs, text-mining algorithms used and their performance. Seven studies, all from the United States, were eligible for inclusion in the review. Studies were published from 2001, the majority between 2009 and 2010. Text-mining techniques varied over time from simple free text...... searching of outpatient visit notes and inpatient discharge summaries to more advanced techniques involving natural language processing (NLP) of inpatient discharge summaries. Performance appeared to increase with the use of NLP, although many ADRs were still missed. Due to differences in study design...

  4. Identifying sources of atmospheric fine particles in Havana City using Positive Matrix Factorization technique

    International Nuclear Information System (INIS)

    Pinnera, I.; Perez, G.; Ramos, M.; Guibert, R.; Aldape, F.; Flores M, J.; Martinez, M.; Molina, E.; Fernandez, A.

    2011-01-01

    In a previous study, a set of samples of fine and coarse airborne particulate matter collected in an urban area of Havana City was analyzed by the Particle-Induced X-ray Emission (PIXE) technique. The concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were consistently determined in both particle sizes. The analytical database provided by PIXE was statistically analyzed in order to determine the local pollution sources. The Positive Matrix Factorization (PMF) technique was applied to the fine-particle data in order to identify possible pollution sources. These sources were further verified by enrichment factor (EF) calculations. A general discussion of these results is presented in this work. (Author)
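The enrichment-factor check mentioned above follows a standard formula: the concentration of element X is ratioed to a crustal reference element (often Fe or Al) in the aerosol and in average crust; EF values far above 1 point to a non-crustal (anthropogenic) source. The concentrations below are invented for illustration.

```python
# Sketch: enrichment factor EF = (X/Fe)_aerosol / (X/Fe)_crust.
# EF >> 1 suggests an anthropogenic contribution for element X.

def enrichment_factor(x_air, ref_air, x_crust, ref_crust):
    return (x_air / ref_air) / (x_crust / ref_crust)

# Hypothetical Pb: 50 ng/m3 vs Fe 500 ng/m3 in air;
# crustal abundances roughly 17 ppm Pb and 56300 ppm Fe.
ef_pb = enrichment_factor(50, 500, 17, 56300)
print(round(ef_pb))  # → 331, strongly enriched
```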

  5. A three-step technique to correctly identify the trapezium without the need for fluoroscopic imaging.

    Science.gov (United States)

    Jamil, W; McMurtrie, A; Nesbitt, P; Muir, L T

    2012-12-01

    Thumb pain secondary to degenerative arthritis of the carpometacarpal joint of the thumb is a common disabling condition. The key principles of successful basal joint arthroplasty involve trapezial excision, which is required for pain relief, with or without some form of ligament reconstruction. The majority of basal joint reconstructive procedures include partial or complete trapeziectomy, with and without some types of tendon transfer and ligament reconstruction and with or without tendon interposition and/or temporary wire stabilisation. When performing a trapeziectomy, it is important to identify the trapezium correctly before excising it. Excision of the incorrect bone during trapeziectomy for basal joint arthritis of the thumb has been reported within the NHS Litigation Authority database. We describe the senior author's routinely used three-step technique to confirm the identity of the trapezium before excision. This technique has been reliably used in over 300 cases with successful excision of the trapezium without intraoperative fluoroscopy.

  6. The use of environmental monitoring as a technique to identify isotopic enrichment activities

    International Nuclear Information System (INIS)

    Buchmann, Jose Henrique

    2000-01-01

    The use of environmental monitoring as a technique to identify activities related to the nuclear fuel cycle has been proposed by international organizations as an additional measure to the safeguards agreements in force. The elements specific to each kind of nuclear activity, or nuclear signatures, inserted into the ecosystem by several transfer paths, can be intercepted with greater or lesser efficiency by different living organisms. Depending on the kind of signature of interest, identifying and quantifying the anthropogenic material requires the choice of adequate biological indicators and, above all, the use of sophisticated techniques associated with elaborate sample treatments. This work demonstrates the technical viability of using pine needles as bioindicators of nuclear signatures associated with uranium enrichment activities. Additionally, it proposes the use of a technique now widely diffused in the scientific community, High Resolution Inductively Coupled Plasma Mass Spectrometry (HR-ICP-MS), to identify the signature corresponding to that kind of activity in the ecosystem. The work also describes a methodology recently adopted in analytical chemistry, based on metrological concepts of uncertainty estimation, used to calculate the uncertainties associated with the measurement results obtained. Nitric acid solutions with a concentration of 0.3 mol kg⁻¹, used to wash pine needles sampled near facilities that handle enriched uranium and containing only 0.1 μg kg⁻¹ of uranium, exhibit a ²³⁵U:²³⁸U isotopic abundance ratio of 0.0092±0.0002, while solutions originating from samples collected at places more than 200 km from any activity related to the nuclear fuel cycle exhibit a value of 0.0074±0.0002 for this ratio. Similar results obtained for samples collected at different places confirm the presence of anthropogenic uranium and demonstrate the viability of using this technique and the

  7. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)
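Once a fault tree has been reduced to minimal cut sets, the top-event probability is commonly approximated by summing the products of basic-event probabilities over the cut sets (the rare-event approximation). The sketch below uses invented component names and failure probabilities, not data from the HIFAR analysis.

```python
# Sketch: rare-event approximation for a fault tree's top event,
# P(top) ≈ Σ over minimal cut sets of Π of basic-event probabilities.

def top_event_prob(cut_sets, p):
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for event in cs:
            prod *= p[event]
        total += prod
    return total

# Hypothetical system: redundant pumps (both must fail) OR a single valve.
p = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4}
cut_sets = [{"pump_A", "pump_B"}, {"valve"}]
print(top_event_prob(cut_sets, p))  # ≈ 5.01e-4, dominated by the valve
```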

  8. Micro-Raman spectroscopy a powerful technique to identify crocidolite and erionite fibers in tissue sections

    Science.gov (United States)

    Rinaudo, C.; Croce, A.; Allegrina, M.; Baris, I. Y.; Dogan, A.; Powers, A.; Rivera, Z.; Bertino, P.; Yang, H.; Gaudino, G.; Carbone, M.

    2013-05-01

    Exposure to mineral fibers such as asbestos and erionite is widely associated with the development of lung cancer and pleural malignant mesothelioma (MM). Pedigree and mineralogical studies indicate that genetics may influence mineral fiber carcinogenesis. Although fiber dimensions strongly affect carcinogenic potential, the chemical composition of the fiber is also relevant. Using micro-Raman spectroscopy, we show here the persistence and identification of different mineral phases directly on histopathological specimens from mice and humans. Fibers of crocidolite asbestos and of erionite from different geographic areas (Oregon, US and Cappadocia, Turkey) were injected intraperitoneally into mice. MM developed in 10/15 asbestos-treated mice after 5 months, and in 8-10/15 erionite-treated mice after 14 months. The persistence of the injected fibers was investigated in the pancreas, liver, spleen and peritoneal tissue. Chemical identification of the different phases occurred in the peritoneal cavity or at the organ borders, while fibers were only rarely localized in the parenchyma. Raman patterns allow crocidolite and erionite fibers to be recognized easily. Microscopic analysis revealed that crocidolite fibers were frequently coated with ferruginous material ("asbestos bodies"), whereas erionite fibers were always free of coatings. We also analyzed by micro-Raman spectroscopy lung tissues both from MM patients from Cappadocia, where an MM epidemic developed because of environmental exposure to erionite, and from Italian MM patients with occupational exposure to asbestos. Our findings demonstrate that micro-Raman spectroscopy is a technique able to identify mineral phases directly on histopathology specimens, such as routine tissue sections prepared for diagnostic purposes. REFERENCES A.U. Dogan, M. Dogan. Environ. Geochem. Health 2008, 30(4), 355. M. Carbone, S. Emri, A.U. Dogan, I. Steele, M. Tuncer, HI. Pass, et al. Nat. Rev. Cancer. 2007, 7(2), 147. M. Carbone, Y

  9. [The combined plunger pressure-manometer method. A technique for identifying the peridural space].

    Science.gov (United States)

    Bhate, H

    1992-04-01

    The modified combined plunger pressure and manometer method (KSMM = Kombinierte Stempeldruck-Manometer-Methode) has proved to be a satisfactory alternative to the loss of resistance technique of Dogliotti. The method was tested for practicability and successful identification of the epidural space in 200 patients (80 of them pregnant) by physicians at different stages of their training. It makes it easy for young anaesthetists who are still in training and have not had much experience to learn to identify the epidural space. With this method the experienced operator can make an important contribution to the training of young doctors in epidural anaesthesia without fear of risks and failures.

  10. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. A straightforward approach was taken by using the MathCAD program as the programming tool, which proved powerful enough for the required calculation, plotting and file transfer. (Author)
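The fringe shift yields a line-of-sight integral of the radial profile; the Abel inversion undoes that integral. A minimal numeric sketch of the forward Abel transform (the quantity the interferogram measures) is given below, checked against the analytic result for a uniform disk. This is an illustration under simple quadrature assumptions, not the authors' MathCAD implementation.

```python
# Sketch: forward Abel transform F(y) = 2 ∫_y^R f(r) r dr / sqrt(r^2 - y^2),
# i.e. the line-of-sight integral that Abel inversion recovers f(r) from.
# Midpoint quadrature avoids evaluating exactly at the r = y singularity.
import math

def abel_forward(f, y, R, n=20000):
    total = 0.0
    dr = (R - y) / n
    for i in range(n):
        r = y + (i + 0.5) * dr
        total += f(r) * r / math.sqrt(r * r - y * y) * dr
    return 2.0 * total

# Uniform disk f(r) = 1 for r <= R has the analytic result 2*sqrt(R^2 - y^2).
R = 1.0
for y in (0.0, 0.5):
    print(abel_forward(lambda r: 1.0, y, R), 2 * math.sqrt(R * R - y * y))
```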

  11. Interferogram analysis using Fourier transform techniques

    Science.gov (United States)

    Roddier, Claude; Roddier, Francois

    1987-01-01

    A method of interferogram analysis is described in which Fourier transform techniques are used to map the complex fringe visibility in several types of interferograms. Algorithms are developed for estimation of both the amplitude and the phase of the fringes (yielding the modulus and the phase of the holographically recorded object Fourier transform). The algorithms were applied to the reduction of interferometric seeing measurements (i.e., the estimation of the fringe amplitude only), and the reduction of interferometric tests (i.e., estimation of the fringe phase only). The method was used to analyze scatter-plate interferograms obtained at NOAO.
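The core of the Fourier-transform approach can be shown in one dimension: for a fringe pattern I(x) = a + b·cos(2πf₀x + φ), the complex Fourier coefficient at the carrier frequency f₀ carries the fringe amplitude b and phase φ. The function and signal below are invented for illustration, not the paper's algorithms.

```python
# Sketch: recover fringe amplitude and phase from the Fourier
# coefficient at a known integer carrier frequency f0.
import cmath
import math

def fringe_amp_phase(samples, f0):
    N = len(samples)
    c = sum(s * cmath.exp(-2j * math.pi * f0 * n / N)
            for n, s in enumerate(samples)) / N
    return 2 * abs(c), cmath.phase(c)  # amplitude b, phase phi

N, f0, b, phi = 256, 16, 0.7, 0.9
samples = [1.0 + b * math.cos(2 * math.pi * f0 * n / N + phi) for n in range(N)]
amp, ph = fringe_amp_phase(samples, f0)
print(round(amp, 3), round(ph, 3))  # → 0.7 0.9
```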

  12. Extracted image analysis: a technique for deciphering mediated portrayals.

    Science.gov (United States)

    Berg, D H; Coutts, L B

    1995-01-01

    A technique for analyzing print media that we have developed as a consequence of our interest in the portrayal of women in menstrual product advertising is reported. The technique, which we call extracted image analysis, involves a unique application of grounded theory and the concomitant heuristic use of the concept of ideal type (Weber, 1958). It provides a means of heuristically conceptualizing the answer to a variant of the "What is going on here?" question asked in analysis of print communication, that is, "Who is being portrayed/addressed here?" Extracted image analysis involves the use of grounded theory to develop ideal typologies. Because the technique re-constructs the ideal types embedded in a communication, it possesses considerable potential as a means of identifying the profiles of members of identifiable groups held by the producers of the directed messages. In addition, the analysis of such portrayals over time would be particularly well suited to extracted image analysis. A number of other possible applications are also suggested.

  13. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  14. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    Science.gov (United States)

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.
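A basic building block of the clustering and series-organization steps described above is a fingerprint similarity measure; Tanimoto similarity on bit-fingerprints is the common choice. The toy fingerprints below are invented sets of "on" bits; a real pipeline would compute e.g. ECFP fingerprints with a cheminformatics toolkit.

```python
# Sketch: Tanimoto similarity between two bit-fingerprints,
# represented here as sets of "on" bit positions.

def tanimoto(a, b):
    """|A ∩ B| / |A ∪ B|; defined as 1.0 for two empty fingerprints."""
    return len(a & b) / len(a | b) if a | b else 1.0

fp1 = {1, 4, 9, 23, 57}
fp2 = {1, 4, 9, 30, 57, 88}
print(round(tanimoto(fp1, fp2), 3))  # → 0.571
```

Compounds above a similarity threshold (often around 0.7) would be grouped into the same candidate series.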

  15. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. 
[Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. 
Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. 
[Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  16. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  17. Brain imaging analysis can identify participants under regular mental training.

    Science.gov (United States)

    Sato, João R; Kozasa, Elisa H; Russell, Tamara A; Radvany, João; Mello, Luiz E A M; Lacerda, Shirley S; Amaro, Edson

    2012-01-01

    Multivariate pattern recognition approaches have become a prominent tool in neuroimaging data analysis. These methods enable the classification of groups of participants (e.g. controls and patients) on the basis of subtly different patterns across the whole brain. This study demonstrates that these methods can be used, in combination with automated morphometric analysis of structural MRI, to determine with great accuracy whether a single subject has been engaged in regular mental training or not. The proposed approach allowed us to identify participants under regular mental training with 94.87% accuracy, suggesting new imaging applications in which participants could be identified based on their mental experience.
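The classification step can be illustrated with the simplest multivariate classifier, nearest centroid over feature vectors (e.g. regional gray-matter volumes). The data and labels below are invented; the study itself used more sophisticated pattern-recognition methods.

```python
# Toy sketch: nearest-centroid classification of feature vectors.
import math

def centroid(vectors):
    return [sum(xs) / len(xs) for xs in zip(*vectors)]

def classify(x, centroids):
    """Return the label whose centroid is closest to x."""
    return min(centroids, key=lambda lab: math.dist(x, centroids[lab]))

train = {"meditators": [[2.1, 3.0], [2.0, 3.2]],
         "controls":   [[1.0, 1.1], [0.9, 1.0]]}
cents = {lab: centroid(v) for lab, v in train.items()}
print(classify([2.05, 3.1], cents))  # → meditators
```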

  18. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in the risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis

  19. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  20. Chromatographic screening techniques in systematic toxicological analysis.

    Science.gov (United States)

    Drummer, O H

    1999-10-15

    A review of techniques used to screen biological specimens for the presence of drugs was conducted, with particular reference to systematic toxicological analysis. Extraction systems of both the liquid-liquid and solid-phase type show little apparent difference in their relative ability to extract a range of drugs according to their physico-chemical properties, although mixed-phase SPE extraction is a preferred technique for GC-based applications, and liquid-liquid extraction was preferred for HPLC-based applications. No one chromatographic system has been shown to be capable of detecting a full range of common drugs of abuse and common ethical drugs; hence, two or more assays are required for laboratories wishing to cover a reasonably comprehensive range of drugs of toxicological significance. While immunoassays are invariably used to screen for drugs of abuse, chromatographic systems relying on derivatization and able to extract both acidic and basic drugs would be capable of screening a limited range of targeted drugs. Drugs most difficult to detect in systematic toxicological analysis include LSD, psilocin, THC and its metabolites, fentanyl and its designer derivatives, some potent opiates, potent benzodiazepines and some potent neuroleptics, many of the newer anti-convulsants, the alkaloids colchicine and amanitins, aflatoxins, antineoplastics, coumarin-based anti-coagulants, and a number of cardiovascular drugs. The widespread use of LC-MS and LC-MS-MS for specific drug detection and the emergence of capillary electrophoresis linked to MS and MS-MS provide an exciting possibility for the future to increase the range of drugs detected in any one chromatographic screening system.

  1. Using Data-Driven and Process Mining Techniques for Identifying and Characterizing Problem Gamblers in New Zealand

    Directory of Open Access Journals (Sweden)

    Suriadi Suriadi

    2016-12-01

    Full Text Available This article uses data-driven techniques combined with established theory to analyse the gambling behavioural patterns of 91,000 individuals in a real-world fixed-odds gambling dataset from New Zealand. This research uniquely integrates a mixture of process mining, data mining and confirmatory statistical techniques to categorise different sub-groups of gamblers, with the explicit motivation of identifying problem gambling behaviours and reporting on the challenges and lessons learned from our case study. We demonstrate how techniques from various disciplines can be combined to gain insight into the behavioural patterns exhibited by different types of gamblers, as well as to provide assurances of the correctness of our approach and findings. Highlights of this case study are both the methodology, which demonstrates how such a combination of techniques provides a rich set of effective tools for undertaking an exploratory, open-ended data analysis project guided by the process cube concept, and the findings themselves, which indicate that the contribution problem gamblers make to total volume, expenditure, and revenue is higher than previous studies have maintained.

  2. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  3. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  4. Identifying Effective Features and Classifiers for Short Term Rainfall Forecast Using Rough Sets Maximum Frequency Weighted Feature Reduction Technique

    Directory of Open Access Journals (Sweden)

    Sudha Mohankumar

    2016-06-01

    Full Text Available Precise rainfall forecasting is a common challenge across the globe in meteorological prediction. As rainfall forecasting involves rather complex dynamic parameters, demand for novel approaches to improve forecasting accuracy has heightened. Recently, Rough Set Theory (RST) has attracted a wide variety of scientific applications and is extensively adopted in decision support systems. Although there are several weather prediction techniques in the existing literature, identifying significant inputs for modelling effective rainfall prediction is not addressed by the present mechanisms. Therefore, this investigation examined the feasibility of using rough set based feature selection and data mining methods, namely Naïve Bayes (NB), Bayesian Logistic Regression (BLR), Multi-Layer Perceptron (MLP), J48, Classification and Regression Tree (CART), Random Forest (RF), and Support Vector Machine (SVM), to forecast rainfall. Feature selection or reduction is the process of identifying a significant feature subset that characterizes the information system as completely as the full feature set. This paper introduces a novel rough set based Maximum Frequency Weighted (MFW) feature reduction technique for finding an effective feature subset for modelling an efficient rainfall forecast system. The experimental analysis and results indicate substantial improvements in prediction models when trained using the selected feature subset. The CART and J48 classifiers achieved improved accuracies of 83.42% and 89.72%, respectively. From the experimental study, relative humidity2 (a4) and solar radiation (a6) were identified as the effective parameters for modelling rainfall prediction.
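    The abstract does not spell out the MFW algorithm, but the rough-set idea it builds on can be sketched: weight each attribute by how frequently it appears in the discernibility matrix, then keep the highest-frequency attributes until every pair of objects with different decisions is discerned. The toy decision table and the helper `mfw_reduct` below are illustrative assumptions, not the authors' implementation.

```python
from itertools import combinations

def mfw_reduct(rows, decision):
    """Greedy frequency-weighted reduct over a discernibility matrix.

    rows: list of tuples of condition-attribute values
    decision: list of class labels (same length as rows)
    Returns sorted indices of a feature subset that discerns every
    pair of objects carrying different decisions.
    """
    n_attrs = len(rows[0])
    # Discernibility entries: for each pair with different decisions,
    # the set of attributes on which the two objects differ.
    entries = []
    for i, j in combinations(range(len(rows)), 2):
        if decision[i] != decision[j]:
            diff = {a for a in range(n_attrs) if rows[i][a] != rows[j][a]}
            if diff:
                entries.append(diff)
    reduct = set()
    while entries:
        # Weight each attribute by its frequency across remaining entries.
        freq = [sum(a in e for e in entries) for a in range(n_attrs)]
        best = max(range(n_attrs), key=lambda a: freq[a])
        reduct.add(best)
        entries = [e for e in entries if best not in e]
    return sorted(reduct)

# Hypothetical weather-style table: (outlook, humidity, wind) -> rain?
rows = [("sunny", "high", "weak"), ("sunny", "high", "strong"),
        ("rain", "high", "weak"), ("rain", "normal", "weak"),
        ("overcast", "normal", "strong")]
decision = ["no", "no", "yes", "yes", "yes"]
subset = mfw_reduct(rows, decision)
print(subset)  # outlook alone discerns every differing pair here
```

    In this tiny table the outlook attribute appears in every discernibility entry, so the greedy frequency weighting selects it first and it alone suffices; on real meteorological data the loop would continue until all entries are covered.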

  5. Identifying Importance-Performance Matrix Analysis (IPMA) of ...

    African Journals Online (AJOL)

    The results of the study revealed that human capital, organizational capital, technological capital and Islamic work ethics significantly influenced business performance. Then, this study explored the use of the Importance-Performance matrix analysis to identify priority factors that can be enhanced to increase business ...

  6. Identifying Students’ Misconceptions on Basic Algorithmic Concepts Through Flowchart Analysis

    NARCIS (Netherlands)

    Rahimi, E.; Barendsen, E.; Henze, I.; Dagienė, V.; Hellas, A.

    2017-01-01

    In this paper, a flowchart-based approach to identifying secondary school students’ misconceptions (in a broad sense) on basic algorithm concepts is introduced. This approach uses student-generated flowcharts as the units of analysis and examines them against plan composition and construct-based

  7. BIOELECTRICAL IMPEDANCE VECTOR ANALYSIS IDENTIFIES SARCOPENIA IN NURSING HOME RESIDENTS

    Science.gov (United States)

    Loss of muscle mass and water shifts between body compartments are contributing factors to frailty in the elderly. The body composition changes are especially pronounced in institutionalized elderly. We investigated the ability of single-frequency bioelectrical impedance analysis (BIA) to identify b...

  8. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experiments were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
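    The PSD step mentioned above can be illustrated independently of the authors' FLIR/fuzzy pipeline: the PSD of a luminosity trace follows directly from the DFT, and its peak gives the dominant oscillation frequency. The sampling rate and the synthetic signal below are assumptions made only for this sketch.

```python
import cmath, math

def psd(signal, fs):
    """Single-sided power spectral density estimate via a direct DFT.
    Returns (freqs, power) for bins 0 .. N/2."""
    n = len(signal)
    freqs, power = [], []
    for k in range(n // 2 + 1):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        freqs.append(k * fs / n)
        power.append(abs(s) ** 2 / (n * fs))
    return freqs, power

# Synthetic flame-luminosity trace: an 8 Hz thermoacoustic oscillation
# sampled at 64 Hz for one second.
fs = 64
signal = [math.sin(2 * math.pi * 8 * t / fs) for t in range(fs)]
freqs, power = psd(signal, fs)
peak = freqs[max(range(len(power)), key=lambda k: power[k])]
print(peak)  # dominant oscillation frequency in Hz
```

    A production system would use an FFT rather than this O(N²) DFT, but the PSD values and the peak-frequency readout are the same.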

  9. Identifying plant cell-surface receptors: combining 'classical' techniques with novel methods.

    Science.gov (United States)

    Uebler, Susanne; Dresselhaus, Thomas

    2014-04-01

    Cell-cell communication during development and reproduction in plants depends largely on a few phytohormones and many diverse classes of polymorphic secreted peptides. The peptide ligands are bound at the cell surface of target cells by their membranous interaction partners representing, in most cases, either receptor-like kinases or ion channels. Although knowledge of both the extracellular ligand and its corresponding receptor(s) is necessary to describe the downstream signalling pathway(s), to date only a few ligand-receptor pairs have been identified. Several methods, such as affinity purification and yeast two-hybrid screens, have been used very successfully to elucidate interactions between soluble proteins, but most of these methods cannot be applied to membrane proteins. Experimental obstacles such as low concentration and poor solubility of membrane receptors, as well as unstable transient interactions, often hamper the use of these 'classical' approaches. However, over the last few years, a lot of progress has been made to overcome these problems by combining classical techniques with new methodologies. In the present article, we review the most promising recent methods in identifying cell-surface receptor interactions, with an emphasis on success stories outside the field of plant research.

  10. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations of each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions to obtain most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  11. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  12. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  13. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  14. Practical identifiability analysis of a minimal cardiovascular system model.

    Science.gov (United States)

    Pironet, Antoine; Docherty, Paul D; Dauby, Pierre C; Chase, J Geoffrey; Desaive, Thomas

    2017-01-17

    Parameters of mathematical models of the cardiovascular system can be used to monitor cardiovascular state, such as total stressed blood volume status, vessel elastance and resistance. To do so, the model parameters have to be estimated from data collected at the patient's bedside. This work considers a seven-parameter model of the cardiovascular system and investigates whether these parameters can be uniquely determined using indices derived from measurements of arterial and venous pressures, and stroke volume. An error vector defined the residuals between the simulated and reference values of the seven clinically available haemodynamic indices. The sensitivity of this error vector to each model parameter was analysed, as well as the collinearity between parameters. To assess practical identifiability of the model parameters, profile-likelihood curves were constructed for each parameter. Four of the seven model parameters were found to be practically identifiable from the selected data. The remaining three parameters were practically non-identifiable. Among these non-identifiable parameters, one could be decreased as much as possible. The other two non-identifiable parameters were inversely correlated, which prevented their precise estimation. This work presented the practical identifiability analysis of a seven-parameter cardiovascular system model, from limited clinical data. The analysis showed that three of the seven parameters were practically non-identifiable, thus limiting the use of the model as a monitoring tool. Slight changes in the time-varying function modeling cardiac contraction and use of larger values for the reference range of venous pressure made the model fully practically identifiable. Copyright © 2017. Published by Elsevier B.V.
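    The profile-likelihood idea used in this record can be illustrated on a toy model (not the paper's seven-parameter cardiovascular model): when two parameters enter the output only through their product, fixing one and re-optimising the other leaves the fit unchanged, so the profile is flat and the parameter is practically non-identifiable. The model, data and grids below are hypothetical.

```python
def sse(params, model, data):
    """Sum of squared errors between model(x, params) and observations."""
    return sum((model(x, params) - y) ** 2 for x, y in data)

def profile(model, data, fixed_idx, fixed_vals, free_grid):
    """Profile curve for one parameter: fix it at each candidate value
    and re-optimise the remaining parameter by grid search, recording
    the best (minimal) SSE achievable at each fixed value."""
    prof = []
    for v in fixed_vals:
        best = min(sse([v, w] if fixed_idx == 0 else [w, v], model, data)
                   for w in free_grid)
        prof.append(best)
    return prof

# Data generated by y = 6*x; candidate model y = a*b*x, so only the
# product a*b = 6 is constrained by the data.
data = [(x, 6.0 * x) for x in range(1, 6)]
model = lambda x, p: p[0] * p[1] * x
grid = [0.5 * k for k in range(1, 25)]            # free parameter: 0.5 .. 12
prof_a = profile(model, data, 0, [1.0, 2.0, 3.0, 4.0, 6.0], grid)
# The profile is flat at zero: any fixed a is rescued by b = 6/a,
# so a alone is practically non-identifiable.
print(prof_a)
```

    A practically identifiable parameter would instead show a profile with a clear minimum, rising on both sides; flat or one-sided profiles are exactly what flagged three of the seven parameters in this study.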

  15. Visual Evaluation Techniques for Skill Analysis.

    Science.gov (United States)

    Brown, Eugene W.

    1982-01-01

    Visual evaluation techniques provide the kinesiologist with a method of evaluating physical skill performance. The techniques are divided into five categories: (1) vantage point; (2) movement simplification; (3) balance and stability; (4) movement relationships; and (5) range of movement. (JN)

  16. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. Also, the changes in dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system to diagnose an NPP's abnormal conditions. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  17. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. Also, the changes in dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system to diagnose an NPP's abnormal conditions. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  18. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the influence of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  19. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Kohlhof, Hendrik; Heidt, Christoph; Bähler, Alexandrine; Kohl, Sandro; Gravius, Sascha; Friedrich, Max J.; Ziebarth, Kai; Stranzinger, Enno

    2015-01-01

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used for screening as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor of patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy
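    The sulcus angle reported above is the angle at the trochlear groove nadir between the rays to the medial and lateral facet peaks. A minimal sketch of that geometry follows; the landmark coordinates are hypothetical, chosen only so the result lands near the reported mean of 149.1°.

```python
import math

def sulcus_angle(medial, nadir, lateral):
    """Angle (degrees) at the trochlear groove nadir between the rays
    to the medial and lateral facet peaks; larger angles indicate a
    shallower (more dysplastic) trochlea."""
    v1 = (medial[0] - nadir[0], medial[1] - nadir[1])
    v2 = (lateral[0] - nadir[0], lateral[1] - nadir[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# Hypothetical landmarks (mm) on a reconstructed axial plane:
# facet peaks 20 mm apart, groove about 2.8 mm deep.
angle = sulcus_angle((-10.0, 2.8), (0.0, 0.0), (10.0, 2.8))
print(round(angle, 1))
```

    With these illustrative landmarks the angle evaluates to roughly 148.7°, i.e. within one standard deviation of the study's mean of 149.1°.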

  20. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used for screening as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor of patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  1. Wavelet transform techniques and signal analysis

    International Nuclear Information System (INIS)

    Perez, R.B.; Mattingly, J.K.; Tennessee Univ., Knoxville, TN; Perez, J.S.

    1993-01-01

    Traditionally, the most widely used signal analysis tool is the Fourier transform which, by producing power spectral densities (PSDs), allows time-dependent signals to be studied in the frequency domain. However, the Fourier transform is global -- it extends over the entire time domain -- which makes it ill-suited to studying nonstationary signals that exhibit local temporal changes in frequency content. To analyze nonstationary signals, the family of transforms commonly designated as short-time Fourier transforms (STFTs), capable of identifying temporally localized changes in the signal's frequency content, were developed by employing window functions to isolate temporal regions of the signal. For example, the Gabor STFT uses a Gaussian window. However, the applicability of STFTs is limited by various inadequacies. The wavelet transform (WT), recently developed by Grossman and Morlet and explored in depth by Daubechies and Mallat, remedies the inadequacies of STFTs. Like the Fourier transform, the WT can be implemented as a discrete transform (DWT) or as a continuous (integral) transform (CWT). This paper briefly illustrates some of the potential applications of wavelet transform algorithms to signal analysis
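    As a minimal illustration of the DWT idea (using the simple Haar wavelet rather than the Grossman-Morlet or Daubechies constructions discussed above), one analysis level splits a signal into scaled averages and scaled differences; the detail coefficients localise a step change in time, which a global Fourier PSD cannot.

```python
import math

def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform: split a
    length-2n signal into n approximation (scaled pairwise averages)
    and n detail (scaled pairwise differences) coefficients."""
    s = math.sqrt(2.0)
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s
              for i in range(len(signal) // 2)]
    return approx, detail

# A step change is a simple nonstationary feature: the single nonzero
# detail coefficient pinpoints where in time the change occurs.
signal = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0, 5.0]
approx, detail = haar_dwt(signal)
print(detail)  # nonzero only at the pair containing the step
```

    Recursing the same split on the approximation coefficients yields the usual multi-level (multi-resolution) DWT.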

  2. Latent cluster analysis of ALS phenotypes identifies prognostically differing groups.

    Directory of Open Access Journals (Sweden)

    Jeban Ganesalingam

    2009-09-01

    Full Text Available Amyotrophic lateral sclerosis (ALS) is a degenerative disease predominantly affecting motor neurons and manifesting as several different phenotypes. Whether these phenotypes correspond to different underlying disease processes is unknown. We used latent cluster analysis to identify groupings of clinical variables in an objective and unbiased way to improve phenotyping for clinical and research purposes. Latent class cluster analysis was applied to a large database consisting of 1467 records of people with ALS, using discrete variables which can be readily determined at the first clinic appointment. The model was tested for clinical relevance by survival analysis of the phenotypic groupings using the Kaplan-Meier method. The best model generated five distinct phenotypic classes that strongly predicted survival (p<0.0001). Eight variables were used for the latent class analysis, but a good estimate of the classification could be obtained using just two variables: site of first symptoms (bulbar or limb) and time from symptom onset to diagnosis (p<0.00001). The five phenotypic classes identified using latent cluster analysis can predict prognosis. They could be used to stratify patients recruited into clinical trials and to generate more homogeneous disease groups for genetic, proteomic and risk factor research.
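    The Kaplan-Meier method used above to test the clusters for prognostic relevance can be sketched in a few lines: at each observed event time, the survival estimate is multiplied by the fraction of at-risk subjects who survive, with censored subjects simply leaving the risk set. The patient times below are hypothetical, not the study's data.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per subject; events: 1 = death observed,
    0 = censored. Returns (time, survival) pairs at each event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        # Deaths and total subjects leaving the risk set at this time.
        deaths = sum(1 for j in order[i:] if times[j] == t and events[j])
        n_t = sum(1 for j in order[i:] if times[j] == t)
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= n_t
        i += n_t
    return curve

# Ten hypothetical patients: months to death (event=1) or censoring (0).
times  = [2, 3, 3, 5, 6, 7, 8, 9, 10, 12]
events = [1, 1, 0, 1, 0, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
for t, s in curve:
    print(t, round(s, 3))
```

    Computing one such curve per latent class and comparing them (e.g. with a log-rank test) is how class membership is shown to predict survival.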

  3. An expert botanical feature extraction technique based on phenetic features for identifying plant species.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

    Full Text Available In this paper, we present a new method to recognise the leaf type and identify plant species using phenetic parts of the leaf: lobes, apex and base detection. Most research in this area focuses on popular features such as shape, colour, vein, and texture, which consume large amounts of computational processing and are not efficient, especially on the Acer database with its highly complex leaf structures. This paper focuses on phenetic parts of the leaf, which increases accuracy. Local maxima and local minima are detected based on the Centroid Contour Distance of every boundary point, using the north and south regions to recognise the apex and base. Digital morphology is used to measure the leaf shape and the leaf margin. The Centroid Contour Gradient is presented to extract the curvature of the leaf apex and base. We analyse 32 leaf images of tropical plants and evaluate on two different datasets, Flavia and Acer. The best accuracy obtained is 94.76% and 82.6%, respectively. Experimental results show the effectiveness of the proposed technique without considering the commonly used features with high computational cost.
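    The Centroid Contour Distance signature this method builds on can be sketched directly: compute the distance from the shape centroid to each boundary point, then read lobe tips and sinuses off the local maxima and minima of that circular signature. The synthetic four-lobed outline below is an illustrative assumption, not data from the paper.

```python
import math

def centroid_contour_distance(points):
    """Distance from the shape centroid to every boundary point:
    the CCD signature used for lobe/apex/base detection."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return [math.hypot(x - cx, y - cy) for x, y in points]

def local_extrema(d):
    """Indices of strict local maxima (lobe tips) and minima (sinuses)
    on the circular CCD signature; d[-1] wraps around naturally."""
    n = len(d)
    maxima = [i for i in range(n) if d[i] > d[i - 1] and d[i] > d[(i + 1) % n]]
    minima = [i for i in range(n) if d[i] < d[i - 1] and d[i] < d[(i + 1) % n]]
    return maxima, minima

# Hypothetical 4-lobed outline: the radius oscillates around the centroid.
points = [((2 + math.cos(4 * a)) * math.cos(a),
           (2 + math.cos(4 * a)) * math.sin(a))
          for a in [2 * math.pi * k / 40 for k in range(40)]]
maxima, minima = local_extrema(centroid_contour_distance(points))
print(len(maxima), len(minima))  # 4 lobe tips, 4 sinuses
```

    On real leaves the signature is noisy, so a smoothing pass or a prominence threshold would precede the extrema search; restricting the search to the north and south regions of the contour then isolates the apex and base.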

  4. Progress report on a fast, particle-identifying trigger based on ring-imaging Cherenkov techniques

    International Nuclear Information System (INIS)

    Carroll, J.; Igo, G.; Jacobs, P.; Matis, H.; Naudet, C.; Schroeder, L.S.; Seidl, P.A.; Hallman, T.J.

    1990-01-01

    Experiments which require a large sample of relatively rare events need an efficient (low-dead-time) trigger that does more than select central collisions. The authors propose to develop a trigger that will permit sophisticated multi-particle identification on a time scale appropriate for the interaction rates expected at RHIC. The visible component of the ring image produced by an appropriate Cherenkov-radiator-mirror combination is focused onto an array of fast photo-detectors. The output of the photo-array is coupled to a fast pattern-recognition system that will identify events containing particles of specified types and angular configurations. As a parallel effort, they propose to develop a spectrum-splitting mirror that will permit the ring image from a single radiator to be used both in this trigger (the visible component of the image) and in a TMAE-containing gas detector (the UV component). The gas detector will provide higher-resolution information on particle ID and direction with a delay of a few microseconds. This technique will enable nearly optimal use of the information contained in the Cherenkov spectrum. The authors report progress on the three goals set forth in the proposal: (1) the development of a fast photo-array; (2) the development of a spectrum-splitting mirror; and (3) the development and simulation of fast parallel algorithms for ring finding.

  5. Assessing Uncertainty in Deep Learning Techniques that Identify Atmospheric Rivers in Climate Simulations

    Science.gov (United States)

    Mahesh, A.; Mudigonda, M.; Kim, S. K.; Kashinath, K.; Kahou, S.; Michalski, V.; Williams, D. N.; Liu, Y.; Prabhat, M.; Loring, B.; O'Brien, T. A.; Collins, W. D.

    2017-12-01

    Atmospheric rivers (ARs) can be the difference between California facing drought or hurricane-level storms. ARs are a form of extreme weather defined as long, narrow columns of moisture which transport water vapor outside the tropics. When they make landfall, they release the vapor as rain or snow. Convolutional neural networks (CNNs), a machine learning technique that uses filters to recognize features, are the leading computer vision mechanism for classifying multichannel images. CNNs have been proven effective in identifying extreme weather events in climate simulation output (Liu et al. 2016, ABDA'16, http://bit.ly/2hlrFNV). Here, we compare several CNN architectures, tuned with different hyperparameters and training schemes: we compare two-layer, three-layer, four-layer, and sixteen-layer CNNs' ability to recognize ARs in Community Atmospheric Model version 5 output, and we explore the ability of data augmentation and pre-trained models to increase the accuracy of the classifier. Because pre-training the model with everyday images (e.g. benches, stoves, and dogs) yielded the highest accuracy rate, this strategy, also known as transfer learning, may be vital in future scientific CNNs, which likely will not have access to large labelled training datasets. By choosing the most effective CNN architecture, climate scientists can build an accurate historical database of ARs, which can be used to develop a predictive understanding of these phenomena.
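The "filters" mentioned above are learned convolution kernels. A toy NumPy sketch of the core operation, not the paper's multi-layer architecture, shows how a single hand-made kernel responds most strongly to a long, narrow horizontal band, the rough geometry of an AR in a moisture field:

```python
import numpy as np

def conv2d(img, kern):
    """Valid-mode 2D cross-correlation, the core op of a CNN layer."""
    kh, kw = kern.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(img[r:r + kh, c:c + kw] * kern)
    return out

# A hand-crafted "horizontal band" detector; a CNN learns such kernels from data.
band_kernel = np.array([[-1, -1, -1],
                        [ 2,  2,  2],
                        [-1, -1, -1]], dtype=float)
```

In a trained network these kernels are learned rather than hand-crafted, and many of them are stacked across layers, which is what the two- to sixteen-layer comparison in the abstract varies.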

  8. Meta-analysis of surgical techniques for preventing parotidectomy sequelae.

    Science.gov (United States)

    Curry, Joseph M; King, Nancy; Reiter, David; Fisher, Kyle; Heffelfinger, Ryan N; Pribitkin, Edmund A

    2009-01-01

    To conduct a meta-analysis of the literature on surgical methods for the prevention of Frey syndrome and concave facial deformity after parotidectomy. A PubMed search through February 2008 identified more than 60 English-language studies involving surgical techniques for prevention of these parameters. Analyzed works included 15 retrospective or prospective controlled studies reporting quantitative data for all included participants for 1 or more of the measured parameters in patients who had undergone parotidectomy. Report quality was assessed by the Strength of Recommendation Taxonomy (SORT) score. Data were directly extracted from reports and dichotomized into positive and negative outcomes. The statistical significance was then calculated. The mean SORT score for all studies was 2.34, and the mean SORT score for the analyzed studies was 1.88. Meta-analysis for multiple techniques to prevent symptomatic Frey syndrome, positive starch-iodine test results, and contour deformity favored intervention with a cumulative odds ratio (OR) of 3.88 (95% confidence interval [CI], 2.81-5.34); OR, 3.66 (95% CI, 2.32-5.77); and OR, 5.25 (95% CI, 3.57-7.72), respectively. Meta-analysis of operative techniques to prevent symptomatic Frey syndrome, positive starch-iodine test results, and facial asymmetry suggests that such methods are likely to reduce the incidence of these complications after parotidectomy.
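The abstract does not name the pooling method behind its cumulative odds ratios; a common fixed-effect choice for dichotomized 2x2 outcomes is the Mantel-Haenszel estimator, sketched here with illustrative numbers rather than the study's data:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio over 2x2 tables (a, b, c, d):
    a/b = outcome present/absent in the intervention arm,
    c/d = outcome present/absent in the control arm."""
    num = den = 0.0
    for a, b, c, d in tables:
        n = a + b + c + d
        num += a * d / n
        den += b * c / n
    return num / den
```

With a single study the estimator reduces to the crude odds ratio (a*d)/(b*c); across studies it weights each table by its size.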

  9. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty.
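The ±35% figure above comes from a full MCNP transport analysis; the way composition uncertainty dominates the total can be illustrated with a much simpler Monte Carlo propagation. Everything below is an assumption for illustration (a linear activation model, a ±30% one-sigma composition uncertainty and ±10% flux uncertainty), not the report's method:

```python
import random

def mc_activation_uncertainty(impurity_ppm, flux, n_trials=10_000, seed=1):
    """Toy Monte Carlo propagation: activation ~ impurity ppm x flux,
    sampling assumed 30% (composition) and 10% (flux) 1-sigma errors."""
    random.seed(seed)
    samples = []
    for _ in range(n_trials):
        ppm = max(random.gauss(impurity_ppm, 0.30 * impurity_ppm), 0.0)
        f = random.gauss(flux, 0.10 * flux)
        samples.append(ppm * f)
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
    return mean, var ** 0.5 / mean  # (mean, relative 1-sigma uncertainty)
```

For independent errors the relative uncertainties add in quadrature, sqrt(0.30² + 0.10²) ≈ 0.32, so even this toy model shows the composition term dominating, in line with the report's conclusion.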

  10. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The "energy method of analysis" for nuclear reactions is used. Energy spectra are computer-simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.

  11. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  12. Efficient Isothermal Titration Calorimetry Technique Identifies Direct Interaction of Small Molecule Inhibitors with the Target Protein.

    Science.gov (United States)

    Gal, Maayan; Bloch, Itai; Shechter, Nelia; Romanenko, Olga; Shir, Ofer M

    2016-01-01

    Protein-protein interactions (PPI) play a critical role in regulating many cellular processes. Finding novel PPI inhibitors that interfere with specific binding of two proteins is considered a great challenge, mainly due to the complexity involved in characterizing multi-molecular systems and limited understanding of the physical principles governing PPIs. Here we show that the combination of virtual screening techniques, which are capable of filtering a large library of potential small molecule inhibitors, and a unique secondary screening by isothermal titration calorimetry, a label-free method capable of observing direct interactions, is an efficient tool for finding such an inhibitor. In this study we applied this strategy in a search for a small molecule capable of interfering with the interaction of the tumor-suppressor p53 and the E3-ligase MDM2. We virtually screened a library of 15 million small molecules that were filtered to a final set of 80 virtual hits. Our in vitro experimental assay, designed to validate the activity of mixtures of compounds by isothermal titration calorimetry, was used to identify an active molecule against MDM2. At the end of the process the small molecule (4S,7R)-4-(4-chlorophenyl)-5-hydroxy-2,7-dimethyl-N-(6-methylpyridin-2-yl)-4,6,7,8-tetrahydroquinoline-3-carboxamide was found to bind MDM2 with a dissociation constant of ~2 µM. Following the identification of this single bioactive compound, spectroscopic measurements were used to further characterize the interaction of the small molecule with the target protein. 2D NMR spectroscopy was used to map the binding region of the small molecule, and fluorescence polarization measurement confirmed that it indeed competes with p53.
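A dissociation constant like the ~2 µM reported above relates total concentrations to the amount of complex through the standard single-site binding quadratic. A minimal sketch of that relationship (the concentrations below are illustrative, not the study's titration conditions):

```python
import math

def fraction_bound(p_tot, l_tot, kd):
    """Single-site binding: solve the quadratic for the complex [PL]
    given total protein, total ligand and Kd (all in the same units),
    and return the fraction of protein in the complex."""
    b = p_tot + l_tot + kd
    pl = (b - math.sqrt(b * b - 4.0 * p_tot * l_tot)) / 2.0
    return pl / p_tot
```

Fitting this model to the heats measured across an ITC titration is how Kd (and the binding enthalpy) are extracted in practice.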

  13. Rice Transcriptome Analysis to Identify Possible Herbicide Quinclorac Detoxification Genes

    Directory of Open Access Journals (Sweden)

    Wenying eXu

    2015-09-01

    Quinclorac is a highly selective auxin-type herbicide widely used for the effective control of barnyard grass in paddy rice fields, improving world rice yields. A herbicide mode of action for quinclorac has been proposed, and hormone interactions affect quinclorac signaling. Because of its widespread use, quinclorac may be transported outside rice fields with drainage waters, leading to soil and water pollution and environmental health problems. In this study, we used the 57K Affymetrix rice whole-genome array to identify quinclorac signaling response genes and to study the molecular mechanisms of action and detoxification of quinclorac in rice plants. Overall, 637 probe sets were identified with differential expression levels under either 6 or 24 h of quinclorac treatment. Auxin-related genes such as GH3 and OsIAAs responded to quinclorac treatment. Gene Ontology analysis showed that detoxification-related gene families were significantly enriched, including cytochrome P450, GST, UGT, and ABC and drug transporter genes. Moreover, real-time RT-PCR analysis showed that top candidate P450 families such as the CYP81, CYP709C and CYP72A genes were universally induced by different herbicides, and some Arabidopsis genes of the same P450 families were up-regulated under quinclorac treatment. We conducted a rice whole-genome GeneChip analysis, the first global identification of quinclorac response genes. This work may provide potential markers for detoxification of quinclorac and biomonitors of environmental chemical pollution.
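Selecting differentially expressed probe sets, as in the 637-probe-set result above, amounts to comparing treated against control signals. A toy fold-change filter sketches the idea; a real microarray analysis adds a statistical test and multiple-testing correction, and the probe names and values below are illustrative only:

```python
def differentially_expressed(control, treated, fc_thresh=2.0):
    """Return probe IDs whose mean signal changes >= fc_thresh-fold
    in either direction (toy filter over {probe: signal} dicts)."""
    hits = []
    for probe, c in control.items():
        t = treated[probe]
        if c > 0 and t > 0 and (t / c >= fc_thresh or c / t >= fc_thresh):
            hits.append(probe)
    return hits
```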

  14. Proteogenomic Analysis Identifies a Novel Human SHANK3 Isoform

    Directory of Open Access Journals (Sweden)

    Fahad Benthani

    2015-05-01

    Mutations of the SHANK3 gene have been associated with autism spectrum disorder. Individuals harboring different SHANK3 mutations display considerable heterogeneity in their cognitive impairment, likely due to the high SHANK3 transcriptional diversity. In this study, we report a novel interaction between the Mutated in colorectal cancer (MCC) protein and a newly identified SHANK3 protein isoform in human colon cancer cells and mouse brain tissue. Hence, our proteogenomic analysis identifies a new human long isoform of the key synaptic protein SHANK3 that was not predicted by the human reference genome. Taken together, our findings describe a potential new role for MCC in neurons, a new human SHANK3 long isoform and, importantly, highlight the use of proteomic data towards the re-annotation of GC-rich genomic regions.

  15. Parameter trajectory analysis to identify treatment effects of pharmacological interventions.

    Directory of Open Access Journals (Sweden)

    Christian A Tiemann

    The field of medical systems biology aims to advance understanding of molecular mechanisms that drive disease progression and to translate this knowledge into therapies to effectively treat diseases. A challenging task is the investigation of the long-term effects of a (pharmacological) treatment, to establish its applicability and to identify potential side effects. We present a new modeling approach, called Analysis of Dynamic Adaptations in Parameter Trajectories (ADAPT), to analyze the long-term effects of a pharmacological intervention. A concept of time-dependent evolution of model parameters is introduced to study the dynamics of molecular adaptations. The progression of these adaptations is predicted by identifying the necessary dynamic changes in the model parameters to describe the transition between experimental data obtained during different stages of the treatment. The trajectories provide insight into the affected underlying biological systems and identify the molecular events that should be studied in more detail to unravel the mechanistic basis of treatment outcome. Modulating effects caused by interactions with the proteome and transcriptome levels, which are often less well understood, can be captured by the time-dependent descriptions of the parameters. ADAPT was employed to identify metabolic adaptations induced upon pharmacological activation of the liver X receptor (LXR), a potential drug target to treat or prevent atherosclerosis. The trajectories were investigated to study the cascade of adaptations. This provided a counter-intuitive insight concerning the function of scavenger receptor class B1 (SR-B1), a receptor that facilitates the hepatic uptake of cholesterol. Although activation of LXR promotes cholesterol efflux and excretion, our computational analysis showed that the hepatic capacity to clear cholesterol was reduced upon prolonged treatment. This prediction was confirmed experimentally by immunoblotting measurements of SR-B1.
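The core ADAPT idea, letting a model parameter follow a time-dependent trajectory while the state equations are integrated, can be sketched with a one-state toy model. The ODE dx/dt = p(t) - x and the trajectory used in the test are assumptions for illustration, not the metabolic model of the study:

```python
def simulate(param_traj, x0=1.0, dt=0.01, t_end=5.0):
    """Euler integration of dx/dt = p(t) - x, where the parameter p
    follows a caller-supplied time-dependent trajectory."""
    x, t, xs = x0, 0.0, [x0]
    while t < t_end:
        x += dt * (param_traj(t) - x)  # forward Euler step
        t += dt
        xs.append(x)
    return xs
```

In ADAPT proper, the trajectory p(t) is not given but estimated, chosen so the simulated state matches experimental data collected at different treatment stages.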

  16. Cochlear implant simulator for surgical technique analysis

    Science.gov (United States)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  17. Preliminary physiological characteristics of thermotolerant Saccharomyces cerevisiae clinical isolates identified by molecular biology techniques.

    Science.gov (United States)

    Siedlarz, P; Sroka, M; Dyląg, M; Nawrot, U; Gonchar, M; Kus-Liśkiewicz, M

    2016-03-01

    The aim of the study was the molecular identification and physiological characterization of five Saccharomyces cerevisiae strains isolated from patients. The tested isolates were compared with control strains of laboratory or commercial origin. The relation of the isolates to baker's yeast S. cerevisiae was studied using species-specific primers in PCR analysis of the ITS-26S region of DNA. The five isolates were genetically identified as yeasts belonging to the species S. cerevisiae. The effects of temperature and carbon sources on the growth of the yeast strains were analysed. A quantitative characterization of growth kinetics confirmed that some tested isolates are thermotolerant and are able to grow in the range 37-39°C. Among them, one representative is characterized by the highest specific growth rate (0.637 h⁻¹). In conclusion, some strains are potential candidates for use in biotechnology due to their higher growth rate at elevated temperatures. Screening for further evaluation of the biotechnological significance of the tested isolates will be done (e.g. ethanol and trehalose production at higher temperatures). The physiological characterization, and confirmation of species identification by molecular methods, of yeasts important in the context of the biotechnology industry was demonstrated. Thermotolerant microbial strains are required in various industrial applications, both for improving productivity and for decreasing the risk of undesirable contamination when higher temperatures are used, and it is important to search for such strains in extreme environments or exotic niches. In this paper, new thermotolerant strains were identified as belonging to Saccharomyces cerevisiae but differing from typical baker's yeast, essentially by their growth rate at higher temperature. The described yeast strains are promising for use in the biotechnological industry, especially for production of ethanol and other products at higher temperatures.

  18. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)

    Multidimensional scaling is a powerful technique for the analysis of data. The latitudinal dependence of the geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
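Multidimensional scaling embeds objects as points so that pairwise distances are preserved. The abstract does not say which MDS variant is used; the classical (Torgerson) form, sketched here, works directly from a distance matrix via double-centering and an eigendecomposition:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed n objects in k dimensions
    from an n x n matrix of pairwise distances."""
    D2 = np.asarray(D, dtype=float) ** 2
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ D2 @ J                 # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]      # keep the k largest eigenvalues
    L = np.sqrt(np.clip(vals[idx], 0, None))
    return vecs[:, idx] * L
```

For distances that are exactly Euclidean, the embedding reproduces the original pairwise distances up to rotation and reflection.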

  19. Analysis of Downs syndrome with molecular techniques for future diagnoses

    Directory of Open Access Journals (Sweden)

    May Salem Al-Nbaheen

    2018-03-01

    Down syndrome (DS) is a genetic disorder caused by the presence of trisomy of chromosome 21, in the G-group of the acrocentric region. DS is also described as a non-Mendelian inheritance, as it does not follow Mendel's laws. The disorder in children is identified through clinical symptoms and chromosomal analysis, and to date there are no biochemical and molecular analyses. Presently, whole exome sequencing (WES) has contributed largely to identifying new disease-causing genes and represents a significant breakthrough in the field of human genetics; this technique uses high-throughput sequencing technologies to determine the arrangement of DNA base pairs specifying the protein-coding regions of an individual's genome. Apart from this, next-generation sequencing and whole genome sequencing also contribute to identifying disease markers. From this review, the suggestion is to perform WES in DS children to identify the marker region. Keywords: Down syndrome, Exome sequencing, Chromosomal analysis, Genes, Genetics

  20. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.

  1. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this openness increases the security risk: a user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  2. A Roadmap of Risk Diagnostic Methods: Developing an Integrated View of Risk Identification and Analysis Techniques

    National Research Council Canada - National Science Library

    Williams, Ray; Ambrose, Kate; Bentrem, Laura

    2004-01-01

    ...), which is envisioned to be a comprehensive reference tool for risk identification and analysis (RI&A) techniques. Program Managers (PMs) responsible for developing or acquiring software-intensive systems typically identify risks in different ways...

  3. A Technique for Tracking the Reading Rate to Identify the E-Book Reading Behaviors and Comprehension Outcomes of Elementary School Students

    Science.gov (United States)

    Huang, Yueh-Min; Liang, Tsung-Ho

    2015-01-01

    Tracking individual reading behaviors is a difficult task, as is carrying out real-time recording and analysis throughout the reading process, but these aims are worth pursuing. In this study, the reading rate is adopted as an indicator to identify different reading behaviors and comprehension outcomes. A reading rate tracking technique is thus…

  4. Using lexical analysis to identify emotional distress in psychometric schizotypy.

    Science.gov (United States)

    Abplanalp, Samuel J; Buck, Benjamin; Gonzenbach, Virgilio; Janela, Carlos; Lysaker, Paul H; Minor, Kyle S

    2017-09-01

    Through the use of lexical analysis software, researchers have demonstrated a greater frequency of negative affect word use in those with schizophrenia and schizotypy compared to the general population. In addition, those with schizotypy endorse greater emotional distress than healthy controls. In this study, our aim was to expand on previous findings in schizotypy to determine whether negative affect word use could be linked to emotional distress. Schizotypy (n=33) and non-schizotypy groups (n=33) completed an open-ended, semi-structured interview, and negative affect word use was analyzed using a validated lexical analysis instrument. Emotional distress was assessed using subjective questionnaires of depression and psychological quality of life (QOL). When the groups were compared, those with schizotypy used significantly more negative affect words, endorsed greater depression, and reported lower QOL. Within schizotypy, a trend-level association between depression and negative affect word use was observed; QOL and negative affect word use showed a significant inverse association. Our findings offer preliminary evidence of the potential effectiveness of lexical analysis as an objective, behavior-based method for identifying emotional distress across the schizophrenia spectrum. Utilizing lexical analysis in schizotypy offers promise for providing researchers with an assessment capable of objectively detecting emotional distress. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
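At its core, the lexical analysis described above counts how often interview words fall in an affect lexicon. A minimal sketch, where the tiny word set is a toy stand-in for a validated instrument such as LIWC:

```python
# Toy stand-in for a validated negative-affect lexicon (e.g. LIWC's category).
NEGATIVE_AFFECT = {"sad", "afraid", "angry", "hurt", "worthless", "lonely"}

def negative_affect_rate(transcript):
    """Share of transcript words drawn from the negative-affect lexicon."""
    words = [w.strip(".,!?;:").lower() for w in transcript.split()]
    if not words:
        return 0.0
    hits = sum(w in NEGATIVE_AFFECT for w in words)
    return hits / len(words)
```

Per-participant rates computed this way are what get correlated with the depression and QOL questionnaire scores in the study design.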

  5. Integrated GRASS GIS based techniques to identify thermal anomalies on water surface. Taranto case study.

    Science.gov (United States)

    Massarelli, Carmine; Matarrese, Raffaella; Felice Uricchio, Vito

    2014-05-01

    In recent years, thermal images collected by airborne systems have made the detection of thermal anomalies possible. These images are an important tool for monitoring natural inflows and legal or illegal dumping in coastal waters. However, the potential of these data is not well exploited by the authorities that supervise the territory, mainly because processing remote sensing data requires highly specialized operators and software that is usually expensive and complex. In this study, we adopt a simple methodology using GRASS, a free open-source GIS package, which has allowed us to map surface-water thermal anomalies and, consequently, to identify and locate coastal inflows, as well as man-made or natural watershed drains and submarine springs (citri in Italian) in the Taranto Sea (southern Italy). The Taranto Sea is a coastal marine ecosystem that has been gradually modified by mankind. One of its inlets, the Mar Piccolo, is part of the National Priority List site identified by the National Program of Environmental Remediation and Restoration because of the size and density of industrial activities, past and present, that have seriously compromised, and continue to compromise, the health of the population and the environment. To detect thermal anomalies, two flights were performed on March 3rd and April 7th, 2013. A total of 13 TABI images were acquired to map the whole Mar Piccolo at 1 m spatial resolution. The TABI-320 is an airborne thermal camera by ITRES with a continuous spectral range between 8 and 12 microns. On July 15th, 2013, an in-situ survey was carried out along the banks to locate clearly visible points of natural or artificial inflow, detecting 72 discharges. GRASS GIS (Geographic Resources Analysis Support System) is a free and open-source Geographic Information System (GIS) software suite used for geospatial data management and analysis, image processing
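A simple way to flag thermal anomalies in a raster like the TABI scenes is to mark pixels whose temperature deviates strongly from the scene statistics. The abstract does not spell out the mapping rule (in GRASS such filters are typically written with raster algebra, e.g. r.mapcalc); the z-score threshold below is an assumed, illustrative criterion, sketched in NumPy:

```python
import numpy as np

def thermal_anomalies(temp_raster, z_thresh=3.0):
    """Return (row, col) indices of pixels whose temperature deviates
    from the scene mean by more than z_thresh standard deviations."""
    t = np.asarray(temp_raster, dtype=float)
    z = (t - t.mean()) / t.std()
    return np.argwhere(np.abs(z) > z_thresh)
```

Warm plumes from discharges or submarine springs show up as small clusters of such outlier pixels, which can then be cross-checked against the in-situ survey points.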

  6. Cluster analysis of clinical data identifies fibromyalgia subgroups.

    Directory of Open Access Journals (Sweden)

    Elisa Docampo

    Full Text Available INTRODUCTION: Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. MATERIAL AND METHODS: 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. RESULTS: Variables clustered into three independent dimensions: "symptomatology", "comorbidities" and "clinical scales". Only the first two dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. CONCLUSIONS: We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment.

  7. Derivatization Technique To Identify Specifically Carbonyl Groups by Infrared Spectroscopy: Characterization of Photooxidative Aging Products in Terpenes and Terpeneous Resins.

    Science.gov (United States)

    Zumbühl, Stefan; Brändle, Andreas; Hochuli, Andreas; Scherrer, Nadim C; Caseri, Walter

    2017-02-07

    Analysis of bioorganic materials by infrared spectroscopy (FT-IR) is frequently limited due to overlapping of diagnostic bands from the various components, which poses a fundamental problem to this analytical technique. The distinction of oxidized di- and triterpenes, for example, is hindered by the superposition of similar absorption bands of carbonyl functional groups summing up to a broad, nondistinctive signal. This study presents a technique for selective fluorination of various carboxylic acids by exposure to gaseous sulfur tetrafluoride. The derivatization treatment leads to characteristic band shifts, allowing the separation of otherwise overlapping bands. Accordingly, the IR bands of primary acids, α,β-unsaturated acids, tertiary acids, peroxy acids, esters, ketones, and α,β-unsaturated ketones are split into distinct absorption bands. The capability of this method is demonstrated using the example of natural resins and their ingredients, which are commonly known to be susceptible to oxidation under ambient conditions. The derivatization method enables one to identify various carbonyl-containing functional groups by infrared spectroscopy, even in complex mixtures of terpenes. It unveils previously hidden degradation reactions occurring in terpenes and natural resins exposed to artificial aging by irradiation with light. New insight is presented on the individual reaction pathways of the terpenes hydroxydammarenone and abietic acid as well as of natural resin varnishes made from dammar and colophony.

  8. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  9. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized

  10. Recent trends in particle size analysis techniques

    Science.gov (United States)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in the particle-sizing technologies are briefly reviewed in accordance with three operating principles including particle size and shape descriptions. Significant trends of the particle size analysing equipment recently developed show that compact electronic circuitry and rapid data processing systems were mainly adopted in the instrument design. Some newly developed techniques characterizing the particulate system were also introduced.

  11. Inverse Filtering Techniques in Speech Analysis | Nwachuku ...

    African Journals Online (AJOL)

    inverse filtering' has been applied. The unifying features of these techniques are presented, namely: 1. a basis in the source-filter theory of speech production, 2. the use of a network whose transfer function is the inverse of the transfer function of ...

  12. Integrating complementary medicine literacy education into Australian medical curricula: Student-identified techniques and strategies for implementation.

    Science.gov (United States)

    Templeman, Kate; Robinson, Anske; McKenna, Lisa

    2015-11-01

    Formal medical education about complementary medicine (CM) that comprises medicinal products/treatments is required due to possible CM interactions with conventional medicines; however, few guidelines exist on design and implementation of such education. This paper reports findings of a constructivist grounded theory method study that identified key strategies for integrating CM literacy education into medical curricula. Analysis of data from interviews with 30 medical students showed that students supported a longitudinal integrative and pluralistic approach to medicine. Awareness of common patient use, evidence, and information relevant to future clinical practice were identified as focus points needed for CM literacy education. Students advocated for interactive case-based, experiential and dialogical didactic techniques that are multiprofessional and student-centred. Suggested strategies provide key elements of CM literacy within research, field-based practice, and didactic teaching over the entirety of the curriculum. CM educational strategies should address CM knowledge deficits and ultimately respond to patients' needs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. On the Use of MCDM Technique for Identifying Key Technology: A Case of Auto Company

    OpenAIRE

    Aliakbar Mazlomi; Rosnah bt. Mohd. Yusuff

    2011-01-01

    In today’s world, technology strategy development for industries is one of the most important tasks in proposing a technology roadmap. Moreover, identifying strategic technologies is a main part of strategy development. This article applies MCDM methods to find the key strategic technologies among those identified, in order to provide an appropriate technology strategy; the TOPSIS method is used to rank the identified technologies and select the key ones.
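
    As a rough illustration of how TOPSIS ranks candidate technologies, here is a minimal sketch with invented scores and weights (not the case-study data). Alternatives are scored on each criterion, normalized, weighted, and ranked by closeness to the ideal solution:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS: normalize, weight, and score by
    relative closeness to the ideal and anti-ideal solutions."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))        # vector normalization
    v = norm * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                  # closeness in [0, 1]

# Hypothetical scores for three candidate technologies on three criteria.
scores  = [[7, 9, 9], [8, 7, 8], [9, 6, 8]]
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, True])              # all criteria are benefits
closeness = topsis(scores, weights, benefit)
print(closeness.argmax())                           # index of the key technology
```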

  14. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  15. Cluster Analysis of Clinical Data Identifies Fibromyalgia Subgroups

    Science.gov (United States)

    Docampo, Elisa; Collado, Antonio; Escaramís, Geòrgia; Carbonell, Jordi; Rivera, Javier; Vidal, Javier; Alegre, José

    2013-01-01

    Introduction Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. Material and Methods 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. Results Variables clustered into three independent dimensions: “symptomatology”, “comorbidities” and “clinical scales”. Only the first two dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. Conclusions We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment. PMID:24098674
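
    The pipeline described (composite indexes per dimension, then clustering into three subgroups) can be sketched with a tiny deterministic k-means on two hypothetical composite indexes; the patient values, the initial centroids, and the choice of plain k-means are all illustrative, not taken from the paper:

```python
import numpy as np

def kmeans(points, centroids, iters=20):
    """Minimal k-means with fixed initial centroids (deterministic)."""
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centroids = np.array([points[labels == k].mean(axis=0)
                              for k in range(len(centroids))])
    return labels, centroids

# Hypothetical composite indexes per patient: (symptomatology, comorbidities).
patients = np.array([
    [0.1, 0.2], [0.2, 0.1],   # low symptomatology, low comorbidities
    [0.9, 0.8], [0.8, 0.9],   # high symptomatology, high comorbidities
    [0.9, 0.1], [0.8, 0.2],   # high symptomatology, low comorbidities
])
init = np.array([[0.0, 0.0], [1.0, 1.0], [1.0, 0.0]])
labels, _ = kmeans(patients, init)
print(labels)                  # three FM-like subgroups
```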

  16. Automated network analysis identifies core pathways in glioblastoma.

    Directory of Open Access Journals (Sweden)

    Ethan Cerami

    2010-02-01

    Full Text Available Glioblastoma multiforme (GBM) is the most common and aggressive type of brain tumor in humans and the first cancer with comprehensive genomic profiles mapped by The Cancer Genome Atlas (TCGA) project. A central challenge in large-scale genome projects, such as the TCGA GBM project, is the ability to distinguish cancer-causing "driver" mutations from passively selected "passenger" mutations. In contrast to a purely frequency-based approach to identifying driver mutations in cancer, we propose an automated network-based approach for identifying candidate oncogenic processes and driver genes. The approach is based on the hypothesis that cellular networks contain functional modules, and that tumors target specific modules critical to their growth. Key elements in the approach include combined analysis of sequence mutations and DNA copy number alterations; use of a unified molecular interaction network consisting of both protein-protein interactions and signaling pathways; and identification and statistical assessment of network modules, i.e. cohesive groups of genes of interest with a higher density of interactions within groups than between groups. We confirm and extend the observation that GBM alterations tend to occur within specific functional modules, in spite of considerable patient-to-patient variation, and that two of the largest modules involve signaling via p53, Rb, PI3K and receptor protein kinases. We also identify new candidate drivers in GBM, including AGAP2/CENTG1, a putative oncogene and an activator of the PI3K pathway, and three additional significantly altered modules, including one involved in microtubule organization. To facilitate the application of our network-based approach to additional cancer types, we make the method freely available as part of a software tool called NetBox.

  17. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
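
    Of the techniques listed, the CUSUM statistic is the simplest to sketch: it accumulates the drift of standardized inventory differences above a slack value and raises an alarm when a threshold is crossed. The data and parameter values below are illustrative, not drawn from the document:

```python
def cusum(diffs, k=0.5, h=4.0):
    """One-sided CUSUM on standardized inventory differences (IDs):
    alarm when accumulated drift above the slack k exceeds h."""
    s, alarms = 0.0, []
    for d in diffs:
        s = max(0.0, s + d - k)   # accumulate drift above the slack
        alarms.append(s > h)
    return alarms

# Illustrative IDs (in sigma units): 30 in-control balance periods,
# then a sustained loss of 1.5 sigma per period.
diffs = [0.0] * 30 + [1.5] * 30
alarms = cusum(diffs)
print(alarms.index(True))         # first alarmed period -> 34
```

A sustained small loss that a single-period test would miss is flagged a few periods after it begins, which is exactly the loss-detection enhancement the document discusses.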

  18. Social Network Analysis Identifies Key Participants in Conservation Development.

    Science.gov (United States)

    Farr, Cooper M; Reed, Sarah E; Pejchar, Liba

    2018-03-03

    Understanding patterns of participation in private lands conservation, which is often implemented voluntarily by individual citizens and private organizations, could improve its effectiveness at combating biodiversity loss. We used social network analysis (SNA) to examine participation in conservation development (CD), a private land conservation strategy that clusters houses in a small portion of a property while preserving the remaining land as protected open space. Using data from public records for six counties in Colorado, USA, we compared CD participation patterns among counties and identified actors that most often work with others to implement CDs. We found that social network characteristics differed among counties. The network density, or proportion of connections in the network, varied from fewer than 2% to nearly 15%, and was higher in counties with smaller populations and fewer CDs. Centralization, or the degree to which connections are held disproportionately by a few key actors, was not correlated strongly with any county characteristics. Network characteristics were not correlated with the prevalence of wildlife-friendly design features in CDs. The most highly connected actors were biological and geological consultants, surveyors, and engineers. Our work demonstrates a new application of SNA to land-use planning, in which CD network patterns are examined and key actors are identified. For better conservation outcomes of CD, we recommend using network patterns to guide strategies for outreach and information dissemination, and engaging with highly connected actor types to encourage widespread adoption of best practices for CD design and stewardship.
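
    The two network measures discussed, density and (degree) centralization, can be sketched for a toy undirected collaboration network; the edge list is invented:

```python
def density(n_nodes, edges):
    """Proportion of possible undirected ties that are present."""
    possible = n_nodes * (n_nodes - 1) / 2
    return len(edges) / possible

def degree_centralization(n_nodes, edges):
    """Freeman degree centralization: how much connections concentrate
    on a few key actors (1 = star network, 0 = fully even)."""
    deg = {i: 0 for i in range(n_nodes)}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    dmax = max(deg.values())
    return sum(dmax - d for d in deg.values()) / ((n_nodes - 1) * (n_nodes - 2))

# Hypothetical CD collaboration network: actor 0 is a consultant who
# appears on every project record (a perfect star).
edges = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(density(5, edges), degree_centralization(5, edges))   # -> 0.4 1.0
```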

  19. Network Analysis Identifies Disease-Specific Pathways for Parkinson's Disease.

    Science.gov (United States)

    Monti, Chiara; Colugnat, Ilaria; Lopiano, Leonardo; Chiò, Adriano; Alberio, Tiziana

    2018-01-01

    Neurodegenerative diseases are characterized by the progressive loss of specific neurons in selected regions of the central nervous system. The main clinical manifestation (movement disorders, cognitive impairment, and/or psychiatric disturbances) depends on the neuron population being primarily affected. Parkinson's disease is a common movement disorder, whose etiology remains mostly unknown. Progressive loss of dopaminergic neurons in the substantia nigra causes an impairment of the motor control. Some of the pathogenetic mechanisms causing the progressive deterioration of these neurons are not specific for Parkinson's disease but are shared by other neurodegenerative diseases, like Alzheimer's disease and amyotrophic lateral sclerosis. Here, we performed a meta-analysis of the literature of all the quantitative proteomic investigations of neuronal alterations in different models of Parkinson's disease, Alzheimer's disease, and amyotrophic lateral sclerosis to distinguish between general and Parkinson's disease-specific pattern of neurodegeneration. Then, we merged proteomics data with genetics information from the DisGeNET database. The comparison of gene and protein information allowed us to identify 25 proteins involved uniquely in Parkinson's disease and we verified the alteration of one of them, i.e., transaldolase 1 (TALDO1), in the substantia nigra of 5 patients. By using open-source bioinformatics tools, we identified the biological processes specifically affected in Parkinson's disease, i.e., proteolysis, mitochondrion organization, and mitophagy. Eventually, we highlighted four cellular component complexes mostly involved in the pathogenesis: the proteasome complex, the protein phosphatase 2A, the chaperonins CCT complex, and the complex III of the respiratory chain.

  20. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transients and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...

  1. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587

  2. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis.

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-03-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches.
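
    The encoding step of the proposed representation can be sketched as follows: cut the series into fixed-length segments and replace each segment by the index of its nearest codeword. The two-entry codebook here is hypothetical (the paper's codebooks of key-sequences are learned from data):

```python
import numpy as np

def pvqa_encode(series, codebook, seg_len):
    """Encode a time series as the sequence of nearest-codeword indices,
    one per fixed-length segment (Piecewise Vector Quantized Approximation)."""
    segments = series.reshape(-1, seg_len)
    dists = ((segments[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

# Hypothetical codebook of two key-sequences of length 4.
codebook = np.array([[0.0, 0.0, 0.0, 0.0],       # flat segment
                     [0.0, 1.0, 2.0, 3.0]])      # rising ramp
series = np.array([0.1, -0.1, 0.0, 0.1,  0.2, 1.1, 1.9, 3.0])
print(pvqa_encode(series, codebook, 4))          # symbolic form -> [0 1]
```

The resulting index sequence is symbolic, which is what allows text-retrieval techniques to be applied to the similarity search.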

  3. Identifying a preservation zone using multicriteria decision analysis

    Energy Technology Data Exchange (ETDEWEB)

    Farashi, A.; Naderi, M.; Parvian, N.

    2016-07-01

    Zoning of a protected area is an approach to partition the landscape into various land use units. The management of these landscape units can reduce conflicts caused by human activities. Tandoreh National Park is one of the most biologically diverse protected areas in Iran. Although the area is generally designed to protect biodiversity, there are many conflicts between biodiversity conservation and human activities. For instance, the area is highly controversial and has been considered an impediment to local economic development, such as tourism, grazing, road construction, and cultivation. In order to reduce human conflicts with biodiversity conservation in Tandoreh National Park, safe zones need to be established and human activities need to be moved out of these zones. In this study we used a systematic methodology to integrate a participatory process with Geographic Information Systems (GIS) using a multi-criteria decision analysis (MCDA) technique to guide a zoning scheme for the Tandoreh National Park, Iran. Our results show that the northern and eastern parts of the Tandoreh National Park, which were close to rural areas and farmlands, returned less desirability for selection as a preservation area. Rocky mountains were the most important and most damaged areas, and abandoned plains were the least important criteria for preservation in the area. Furthermore, the results reveal that the land properties were considered to be important for protection based on the obtained results. (Author)

  4. Identifying a preservation zone using multi–criteria decision analysis

    Directory of Open Access Journals (Sweden)

    Farashi, A.

    2016-03-01

    Full Text Available Zoning of a protected area is an approach to partition the landscape into various land use units. The management of these landscape units can reduce conflicts caused by human activities. Tandoreh National Park is one of the most biologically diverse protected areas in Iran. Although the area is generally designed to protect biodiversity, there are many conflicts between biodiversity conservation and human activities. For instance, the area is highly controversial and has been considered an impediment to local economic development, such as tourism, grazing, road construction, and cultivation. In order to reduce human conflicts with biodiversity conservation in Tandoreh National Park, safe zones need to be established and human activities need to be moved out of these zones. In this study we used a systematic methodology to integrate a participatory process with Geographic Information Systems (GIS) using a multi-criteria decision analysis (MCDA) technique to guide a zoning scheme for the Tandoreh National Park, Iran. Our results show that the northern and eastern parts of the Tandoreh National Park, which were close to rural areas and farmlands, returned less desirability for selection as a preservation area. Rocky mountains were the most important and most damaged areas, and abandoned plains were the least important criteria for preservation in the area. Furthermore, the results reveal that the land properties were considered to be important for protection based on the obtained results.
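
    A common GIS-MCDA step, combining normalized criterion rasters into a single suitability surface by a weighted linear sum, can be sketched as follows; the toy rasters, criterion names, and weights are invented, not the study's actual data:

```python
import numpy as np

def weighted_overlay(criteria, weights):
    """Combine normalized criterion rasters into a single suitability
    surface by weighted linear combination (a common GIS-MCDA step)."""
    stack = np.stack([c * w for c, w in zip(criteria, weights)])
    return stack.sum(axis=0)

# Hypothetical normalized criterion rasters (0 = unsuitable, 1 = suitable)
# for a 2x2 toy landscape: distance from villages and habitat quality.
dist_from_villages = np.array([[1.0, 0.2], [0.8, 0.1]])
habitat_quality    = np.array([[0.9, 0.3], [0.7, 0.2]])
suitability = weighted_overlay([dist_from_villages, habitat_quality],
                               [0.6, 0.4])
print(suitability.round(2))   # cells with high scores become the safe zone
```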

  5. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Full Text Available Understanding the signals generated by the brain is one of the main tasks in brain signal processing. Among all neurological disorders, epilepsy is considered one of the most prevalent, and an automated artificial-intelligence detection technique is essential due to the irregular and unpredictable occurrence of epileptic seizures. We propose an improved fuzzy firefly algorithm, which enhances the classification of the brain signal efficiently with minimum iterations. An important clustering technique based on fuzzy logic is Fuzzy C-means. Features obtained from multichannel EEG signals were combined in both the feature domain and the spatial domain by means of fuzzy algorithms. For a more precise segmentation, the firefly algorithm is applied to optimize the Fuzzy C-means membership function, and convergence criteria are set for efficient clustering. Overall, the proposed technique yields more accurate results, giving it an edge over other techniques. The results of the proposed algorithm are compared with other algorithms, such as the Fuzzy C-means algorithm and the PSO algorithm.
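
    The Fuzzy C-means membership update at the core of the method can be sketched in isolation (the firefly optimization of the membership function and the alternating center updates are not shown); the 1-D feature values and centers are invented:

```python
import numpy as np

def fcm_memberships(x, centers, m=2.0):
    """Fuzzy C-means membership update: each sample belongs to every
    cluster with a degree inversely related to its distance."""
    d = np.abs(x[:, None] - centers[None, :]) + 1e-12   # avoid divide-by-zero
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)         # rows sum to 1

# Hypothetical 1-D EEG feature values and two cluster centers.
x = np.array([0.0, 0.5, 1.0])
centers = np.array([0.0, 1.0])
u = fcm_memberships(x, centers)
print(u.round(3))   # the midpoint sample belongs 50/50 to both clusters
```

In the full algorithm these memberships and the centers are updated alternately until the convergence criterion is met.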

  6. Nanotools and molecular techniques to rapidly identify and fight bacterial infections.

    Science.gov (United States)

    Dinarelli, S; Girasole, M; Kasas, S; Longo, G

    2017-07-01

    Reducing the emergence and spread of antibiotic-resistant bacteria is one of the major healthcare issues of our century. In addition to the increased mortality, infections caused by multi-resistant bacteria drastically increase healthcare costs, mainly because of the longer duration of illness and treatment. While in the last 20 years bacterial identification has been revolutionized by the introduction of new molecular techniques, the current phenotypic techniques to determine the susceptibilities of common Gram-positive and Gram-negative bacteria require at least two days from the collection of clinical samples. Therefore, there is an urgent need for the development of new technologies to determine drug susceptibility in bacteria rapidly and to achieve faster diagnoses. These techniques would also lead to a better understanding of the mechanisms that lead to the emergence of resistance, greatly helping the quest for new antibacterial systems and drugs. In this review, we describe some of the tools currently used in clinical and microbiological research to study bacteria and to address the challenge of infections. We discuss the most interesting advancements in molecular susceptibility testing systems, with a particular focus on the many applications of the MALDI-TOF MS system. In the field of phenotypic characterization protocols, we detail some of the most promising semi-automated commercial systems and focus on emerging developments in the field of nanomechanical sensors, which constitute a step towards the development of rapid and affordable point-of-care testing devices and techniques. While there is still no innovative technique capable of completely substituting for the conventional protocols and clinical practices, many exciting new experimental setups and tools could form the basis of the standard testing package of future microbiological tests. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Trend Filtering Techniques for Time Series Analysis

    OpenAIRE

    López Arias, Daniel

    2016-01-01

    Time series can be found almost everywhere in our lives, and being able to analyse them is therefore an important task. Most time series we encounter are quite noisy, and this noise is one of the main obstacles to extracting information from them. In this work we use trend filtering techniques to remove this noise from a series and to understand its underlying trend, which gives us information about the behaviour of the series aside from the particular...
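
    A minimal illustration of the idea, using the simplest possible trend filter (a centered moving average) on an invented noisy linear trend; the thesis itself presumably studies more sophisticated filters, so this is only a sketch of the noise-removal goal:

```python
import numpy as np

def moving_average_trend(y, window=5):
    """Simplest trend filter: a centered moving average smooths noise
    and exposes the underlying trend of the series."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode='valid')

# Linear trend plus deterministic alternating "noise" (for reproducibility).
t = np.arange(20, dtype=float)
noise = np.array([0.3 * (-1) ** i for i in range(20)])
trend = moving_average_trend(t + noise, window=5)
print(trend[:3])   # close to the underlying trend values 2, 3, 4
```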

  8. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  9. Advanced Imaging Techniques for Multiphase Flows Analysis

    Science.gov (United States)

    Amoresano, A.; Langella, G.; Di Santo, M.; Iodice, P.

    2017-08-01

    Advanced numerical techniques, such as fuzzy logic and neural networks, have been applied in this work to digital images acquired in two applications, a centrifugal pump and a stationary spray, in order to define, in a stochastic way, the evolution of the gas-liquid interface. Starting from the numeric matrix representing the image, it is possible to characterize geometrical parameters and the time evolution of the jet. The algorithm uses fuzzy logic concepts to binarize the chromaticity of the pixels, exploiting the difference in light scattering between the gas and the liquid phase. Starting from a primary fixed threshold, the technique can separate the 'gas' pixels from the 'liquid' pixels, so that the most probable boundary lines of the spray can be defined. By acquiring images continuously at a fixed frame rate, a finer threshold can be selected and, in the limit, the most probable geometrical parameters of the jet can be detected.

  10. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
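
    The Monte Carlo parameter variation technique described can be sketched on a toy "unfold" (here just a sum of channel voltages) with an assumed 5% per-channel one-sigma calibration error; the numbers are illustrative, not the Dante calibration values:

```python
import random

def monte_carlo_error(process, nominal, rel_sigma, n_trials=1000, seed=1):
    """Propagate per-channel calibration errors through an unfold-like
    step by Monte Carlo parameter variation; return mean and std of result."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        # Perturb each channel with a one-sigma Gaussian error function.
        perturbed = [v * rng.gauss(1.0, rel_sigma) for v in nominal]
        results.append(process(perturbed))
    mean = sum(results) / n_trials
    var = sum((r - mean) ** 2 for r in results) / (n_trials - 1)
    return mean, var ** 0.5

# Toy "unfold": total flux is just the sum of channel voltages, with a
# 5% one-sigma calibration error per channel (illustrative numbers).
voltages = [1.0, 2.0, 3.0]
mean, sigma = monte_carlo_error(sum, voltages, 0.05)
print(round(mean, 2), round(sigma, 3))   # sigma becomes the error bar
```

The real analysis would replace `sum` with the full unfold algorithm and fold in the unfold's own uncertainties as well.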

  11. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  12. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  13. 48 CFR 215.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for...

  14. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... ensure a fair and reasonable price. Examples of such techniques include, but are not limited to, the... to the cost or price analysis of the service or product being proposed should also be included in the... techniques. (a) General. The objective of proposal analysis is to ensure that the final agreed-to price is...

  15. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis

    NARCIS (Netherlands)

    Genugten, L. van; Dusseldorp, E.; Massey, E.K.; Empelen, P. van

    2017-01-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known on the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary

  16. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray fluorescence (XRF) techniques. These cigarettes were found to contain the elements Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements, with concentrations of more than 1% by weight, were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  17. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  18. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  19. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    features in the speech process: (i) the resonant structure of the vocal-tract transfer function, i.e., formant analysis; (ii) the glottal wave; (iii) the fundamental frequency or pitch of the sound. During the production of speech, the configuration of the articulators (the vocal tract, tongue, teeth, lips, etc.) changes from one sound to.

  20. Microstructure analysis using SAXS/USAXS techniques

    International Nuclear Information System (INIS)

    Okuda, Hiroshi; Ochiai, Shojiro

    2010-01-01

    Introduction to small-angle X-ray scattering (SAXS) and ultra-small-angle X-ray scattering (USAXS) is presented. SAXS is useful for microstructure analysis of age-hardenable alloys containing precipitates several to several tens of nanometers in size. On the other hand, USAXS is appropriate for examining much larger microstructural heterogeneities, such as inclusions, voids, and large precipitates whose size is typically around one micrometer. Combining these two scattering methods, and sometimes also diffraction, it is possible to assess the hierarchical structure of the samples in situ and nondestructively, ranging from phase identification and quantitative analysis of precipitation structures up to their mesoscopic aggregates, large voids and inclusions. From a technical viewpoint, USAXS requires some specific instrumentation for its optics. However, once a reasonable measurement is made, the analysis of the intensity is the same as that for conventional SAXS. In the present article, a short introduction to conventional SAXS is presented, and the analysis is then applied to USAXS data obtained for well-defined oxide particles whose average diameters are expected to be about 0.3 micrometers. (author)

  1. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  2. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  3. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  4. Behavioral metabolomics analysis identifies novel neurochemical signatures in methamphetamine sensitization

    Science.gov (United States)

    Adkins, Daniel E.; McClay, Joseph L.; Vunck, Sarah A.; Batman, Angela M.; Vann, Robert E.; Clark, Shaunna L.; Souza, Renan P.; Crowley, James J.; Sullivan, Patrick F.; van den Oord, Edwin J.C.G.; Beardsley, Patrick M.

    2014-01-01

    Behavioral sensitization has been widely studied in animal models and is theorized to reflect neural modifications associated with human psychostimulant addiction. While the mesolimbic dopaminergic pathway is known to play a role, the neurochemical mechanisms underlying behavioral sensitization remain incompletely understood. In the present study, we conducted the first metabolomics analysis to globally characterize neurochemical differences associated with behavioral sensitization. Methamphetamine-induced sensitization measures were generated by statistically modeling longitudinal activity data for eight inbred strains of mice. Subsequent to behavioral testing, nontargeted liquid and gas chromatography-mass spectrometry profiling was performed on 48 brain samples, yielding 301 metabolite levels per sample after quality control. Association testing between metabolite levels and three primary dimensions of behavioral sensitization (total distance, stereotypy and margin time) showed four robust, significant associations at a stringent metabolome-wide significance threshold (false discovery rate < 0.05). Results implicated homocarnosine, a dipeptide of GABA and histidine, in total distance sensitization, GABA metabolite 4-guanidinobutanoate and pantothenate in stereotypy sensitization, and myo-inositol in margin time sensitization. Secondary analyses indicated that these associations were independent of concurrent methamphetamine levels and, with the exception of the myo-inositol association, suggest a mechanism whereby strain-based genetic variation produces specific baseline neurochemical differences that substantially influence the magnitude of MA-induced sensitization. These findings demonstrate the utility of mouse metabolomics for identifying novel biomarkers, and developing more comprehensive neurochemical models, of psychostimulant sensitization. PMID:24034544
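
The stringent metabolome-wide threshold of false discovery rate < 0.05 corresponds to a step-up procedure such as Benjamini-Hochberg. A minimal sketch of that rule (the association tests and the 301-metabolite data themselves are not reproduced; the p-values in the test are invented):

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of p-values significant at FDR level alpha."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    ranked = p[order]
    m = p.size
    # Find the largest k with p_(k) <= (k/m) * alpha (step-up rule).
    thresholds = alpha * np.arange(1, m + 1) / m
    below = ranked <= thresholds
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = int(np.max(np.nonzero(below)[0]))
        mask[order[: k + 1]] = True  # reject all hypotheses up to rank k
    return mask
```

Each metabolite-phenotype association p-value would be fed through this mask; only the survivors are reported as metabolome-wide significant.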

  5. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Life cycle assessment (LCA) has been widely used in the design phase during the last two decades to reduce a product's environmental impacts through the whole product life cycle (PLC). The traditional LCA is restricted to assessing the environmental impacts of a product, and the results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach which can reflect the relationship between the design parameters and a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have a significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
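
The core of such a sensitivity analysis is a one-at-a-time perturbation of each design parameter in an impact model. A minimal sketch, in which the impact model, its coefficients and the parameter names are invented for illustration and are not the paper's PCB data:

```python
def impact_model(params):
    # Hypothetical LCA impact model: CO2 emission (kg) as a linear
    # function of design parameters; coefficients are illustrative only.
    return (12.0 * params["panel_area_m2"]
            + 0.8 * params["copper_mass_kg"]
            + 0.1 * params["num_layers"])

def sensitivity(params, key, delta=0.01):
    """Relative change in impact per relative change in one parameter."""
    base = impact_model(params)
    perturbed = dict(params)
    perturbed[key] = params[key] * (1 + delta)
    return ((impact_model(perturbed) - base) / base) / delta

design = {"panel_area_m2": 0.5, "copper_mass_kg": 2.0, "num_layers": 4}
# Rank parameters by the magnitude of their sensitivity coefficient.
ranking = sorted(design, key=lambda k: abs(sensitivity(design, k)), reverse=True)
```

The parameter at the head of `ranking` is the one a designer should reconsider first, which is how the paper's conclusion about panel area would be reached.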

  6. Methods and Techniques of Sampling, Culturing and Identifying of Subsurface Bacteria

    International Nuclear Information System (INIS)

    Lee, Seung Yeop; Baik, Min Hoon

    2010-11-01

    This report described the sampling, culturing and identification of KURT underground bacteria, which exist as iron-, manganese-, and sulfate-reducing bacteria. The methods of culturing and media preparation differed by bacterial species, which affects bacterial growth rates. It will be possible to use the cultured bacteria for various applied experiments and research in the future.

  7. The development of gamma energy identify algorithm for compact radiation sensors using stepwise refinement technique

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyun Jun [Div. of Radiation Regulation, Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kim, Ye Won; Kim, Hyun Duk; Cho, Gyu Seong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Yi, Yun [Dept. of Electronics and Information Engineering, Korea University, Seoul (Korea, Republic of)

    2017-06-15

    A gamma energy identifying algorithm using spectral decomposition combined with a smoothing method was suggested to confirm the existence of artificial radioisotopes. The algorithm combines the original pattern recognition method with a smoothing method to enhance the ability to identify gamma energy with radiation sensors that have low energy resolution. The gamma energy identifying algorithm for the compact radiation sensor is a three-step refinement process. First, the magnitude set is calculated by the original spectral decomposition. Second, the magnitude of the modeling error in the magnitude set is reduced by the smoothing method. Third, the expected gamma energy is decided based on the enhanced magnitude set resulting from the spectral decomposition with the smoothing method. The algorithm was optimized for the designed radiation sensor composed of a CsI(Tl) scintillator and a silicon PIN diode. The two performance parameters used to evaluate the algorithm are the accuracy of the expected gamma energy and the number of repeated calculations. The original gamma energy was accurately identified for a single gamma energy by adapting this modeling-error reduction method. The average error also decreased by half for multiple gamma energies in comparison to the original spectral decomposition. In addition, the number of repeated calculations decreased by half even in low-fluence conditions under 10⁴ (per 0.09 cm² of the scintillator surface). Through the development of this algorithm, we have confirmed the possibility of developing a product that can identify nearby artificial radionuclides using inexpensive radiation sensors that are easy for the public to use. It can therefore help reduce public anxiety about exposure by determining the presence of artificial radionuclides in the vicinity.
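
As a much-simplified illustration of the smoothing half of such an approach, the sketch below smooths a noisy low-resolution spectrum with a moving average and reads off the peak energy. It is not the paper's spectral-decomposition algorithm, and the synthetic CsI(Tl)-like spectrum is invented:

```python
import numpy as np

def identify_peak_energy(counts, energies, window=5):
    """Smooth a noisy, low-resolution spectrum with a moving average,
    then report the energy at the maximum as the expected gamma line."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(counts, kernel, mode="same")
    return float(energies[int(np.argmax(smoothed))])

# Synthetic spectrum: a broad 662 keV photopeak over a flat noise floor.
energies = np.arange(0.0, 1000.0)               # keV bins
rng = np.random.default_rng(7)
counts = 100.0 * np.exp(-((energies - 662.0) ** 2) / (2 * 25.0 ** 2))
counts += rng.uniform(0.0, 5.0, energies.size)  # background noise
```

With this data, `identify_peak_energy(counts, energies)` recovers an energy close to the 662 keV line despite the poor resolution.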

  8. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  9. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  10. Synthetic Minority Oversampling Technique and Fractal Dimension for Identifying Multiple Sclerosis

    Science.gov (United States)

    Zhang, Yu-Dong; Zhang, Yin; Phillips, Preetha; Dong, Zhengchao; Wang, Shuihua

    Multiple sclerosis (MS) is a severe brain disease. Early detection can provide timely treatment. Fractal dimension provides a statistical index of how patterns change with scale in a given brain image. In this study, our team used the susceptibility-weighted imaging technique to obtain 676 MS slices and 880 healthy slices. We used the synthetic minority oversampling technique to process the unbalanced dataset. Then, we used the Canny edge detector to extract distinguishing edges. The Minkowski-Bouligand dimension, a fractal dimension estimation method, was used to extract features from the edges. A single-hidden-layer neural network was used as the classifier. Finally, we proposed a three-segment representation biogeography-based optimization to train the classifier. Our method achieved a sensitivity of 97.78±1.29%, a specificity of 97.82±1.60% and an accuracy of 97.80±1.40%. The proposed method is superior to seven state-of-the-art methods in terms of sensitivity and accuracy.
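
The Minkowski-Bouligand (box-counting) dimension used as the feature here can be estimated by counting occupied boxes at several scales and regressing in log-log space. A minimal sketch on a binary edge map (the SWI data, Canny step and classifier are not reproduced; a filled square is used as a sanity check):

```python
import numpy as np

def box_counting_dimension(edges):
    """Estimate the Minkowski-Bouligand (box-counting) dimension of a
    square binary edge map (side = power of 2) by regressing
    log N(s) against log(1/s) over box sizes s."""
    n = edges.shape[0]
    sizes = [2 ** k for k in range(int(np.log2(n)))]
    counts = []
    for s in sizes:
        # Count boxes of side s containing at least one edge pixel.
        boxes = edges.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a completely filled square should have dimension 2.
img = np.ones((64, 64), dtype=bool)
dim = box_counting_dimension(img)
```

In the paper's pipeline the input would be the Canny edge map of each slice, and `dim` (typically a fractional value) becomes a feature for the classifier.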

  11. New analytical techniques for cuticle chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schulten, H.R. [Fachhochschule Fresenius, Dept. of Trace Analysis, Wiesbaden (Germany)

    1994-12-31

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) is briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  12. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) is briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  13. Contribution of radioisotopic techniques to identify sentinel lymph-nodes (SLN) in breast cancer

    International Nuclear Information System (INIS)

    Zarlenga, Ana C.; Katz, Lidia; Armesto, Amparo; Noblia, Cristina; Gorostidi, Susana; Perez, Juan; Parma, Patricia

    2009-01-01

    The SLN (one or several) is the first to receive lymph from a tumor. When a cancer cell comes off the tumor and circulates along the outgoing lymph, it meets a barrier, the SLN, which intercepts and destroys it. If not, the cancer cell can stay and reproduce in the SLN, creating a metastasis which can affect other nodes in the same way. It has been shown that if the original tumor is small there is little chance that the SLN is invaded, and therefore little chance of dissemination to other lymph nodes. Nowadays, due to early detection, breast tumors are smaller than one cm, so there is little chance of axillary lymph nodes being affected. If histological study confirms that the SLN is free of metastasis, it is not necessary to perform an axillary dissection. This identification of SLNs has been achieved thanks to advances in radioisotopic techniques, which have been applied in our hospital since 1997. We have been adapting this technique to the national supply of equipment and radiocompounds, always in a reliable and secure way. The aim of this presentation is to highlight the radioisotopic identification of SLNs in clinical investigation at the 'Angel H. Roffo Institute', and to compare its daily practice with positron emission tomography (PET). By combining radioisotopic lymphography, lymphochromography and intrasurgical detection of the SLN with a gamma probe, we have obtained a true negative value of 95% for the SLN, with 5% false negatives. Owing to this method, we have included SLN study in the daily practice for breast tumor patients with tumors up to 5 cm in diameter. Comparing this method's results (5% false negatives) with the PET results using 18F-FDG, which has 33% false negatives, we conclude that a negative PET result cannot replace this method of SLN detection. (author)

  14. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments

  15. A simple and successful sonographic technique to identify the sciatic nerve in the parasacral area.

    Science.gov (United States)

    Taha, Ahmad Muhammad

    2012-03-01

    The purpose of this study was to describe detailed sonographic anatomy of the parasacral area for rapid and successful identification of the sciatic nerve. Fifty patients scheduled for knee surgery were included in this observational study. An ultrasound-guided parasacral sciatic nerve block was performed in all patients. The ultrasound probe was placed on an axial plane 8 cm lateral to the uppermost point of the gluteal cleft. Usually, at this level the posterior border of the ischium (PBI), a characteristically curved hyperechoic line, could be identified. The sciatic nerve appeared as a hyperechoic structure just medial to the PBI. The nerve lies deep to the piriformis muscle lateral to the inferior gluteal vessels, and if followed caudally, it rests directly on the back of the ischium. After confirmation with electrical stimulation, a 20-mL mixture of 1% ropivacaine and 1% lidocaine with epinephrine was injected. The sciatic nerve was identified successfully in 48 patients (96%). In those patients, the median time required for its ultrasonographic identification was ten seconds [interquartile range, 8-13.7 sec], and the block success rate was 100%. The described sonographic details of the parasacral area allowed for rapid and successful identification of the sciatic nerve.

  16. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication

  17. Techniques in micromagnetic simulation and analysis

    Science.gov (United States)

    Kumar, D.; Adeyeye, A. O.

    2017-08-01

    Advances in nanofabrication now allow us to manipulate magnetic material at micro- and nanoscales. As the steps of design, modelling and simulation typically precede that of fabrication, these improvements have also granted a significant boost to the methods of micromagnetic simulations (MSs) and analyses. The increased availability of massive computational resources has been another major contributing factor. Magnetization dynamics at micro- and nanoscale is described by the Landau-Lifshitz-Gilbert (LLG) equation, which is an ordinary differential equation (ODE) in time. Several finite difference method (FDM) and finite element method (FEM) based LLG solvers are now widely used to solve different kinds of micromagnetic problems. In this review, we present a few patterns in the ways MSs are being used in the pursuit of new physics. An important objective of this review is to allow one to make a well informed decision on the details of simulation and analysis procedures needed to accomplish a given task using computational micromagnetics. We also examine the effect of different simulation parameters to underscore and extend some best practices. Lastly, we examine different methods of micromagnetic analyses which are used to process simulation results in order to extract physically meaningful and valuable information.
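
For a single macrospin, the LLG equation reduces to an ODE that can be integrated directly. The sketch below uses a plain explicit-Euler step with renormalization, purely for illustration: production micromagnetic solvers use FDM/FEM spatial discretization and higher-order integrators, and the field and damping values here are arbitrary.

```python
import numpy as np

GAMMA = 1.76e11  # electron gyromagnetic ratio, rad/(s*T)

def llg_step(m, B, dt, alpha=0.5):
    """One explicit-Euler step of the Landau-Lifshitz-Gilbert ODE for a
    single macrospin; m is a unit vector, B an effective field in tesla."""
    pre = -GAMMA / (1.0 + alpha ** 2)
    mxB = np.cross(m, B)
    dmdt = pre * (mxB + alpha * np.cross(m, mxB))  # precession + damping
    m_new = m + dt * dmdt
    return m_new / np.linalg.norm(m_new)  # keep |m| = 1

# Relax a tilted moment toward a 1 T field along +z.
m = np.array([1.0, 0.0, 0.1])
m /= np.linalg.norm(m)
B = np.array([0.0, 0.0, 1.0])
for _ in range(50000):
    m = llg_step(m, B, dt=1e-15)
```

After enough steps the damping term aligns `m` with the field while the renormalization keeps the magnetization magnitude fixed, the two qualitative behaviors any LLG solver must reproduce.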

  18. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not strong enough to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase the fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients (% S here) were obtained on Dowex 50 W against HCl concentration by a batch method. These % S data were utilized to obtain elution curves. The % S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50 W was examined as a function of HCl concentration and found to decrease while the % S of rare earths increased. It is interpreted that Cl- and rare earth ions move into the resin phase separately and that the charges and charge densities of these ions are responsible for the different % S curves. Dehydration appears to play an important role in the upturn of the % S curves at higher HCl concentrations.

  19. Method of identifying hairpin DNA probes by partial fold analysis

    Science.gov (United States)

    Miller, Benjamin L [Penfield, NY; Strohsahl, Christopher M [Saugerties, NY

    2009-10-06

    Method of identifying molecular beacons in which a secondary structure prediction algorithm is employed to identify oligonucleotide sequences within a target gene having the requisite hairpin structure. Isolated oligonucleotides, molecular beacons prepared from those oligonucleotides, and their use are also disclosed.

  20. Image processing techniques for identifying Mycobacterium tuberculosis in Ziehl-Neelsen stains

    Science.gov (United States)

    Sadaphal, P.; Rao, J.; Comstock, G. W.; Beg, M. F.

    2009-01-01

    Worldwide, laboratory technicians tediously read sputum smears for tuberculosis (TB) diagnosis. We demonstrate proof of principle of an innovative computational algorithm that successfully recognizes Ziehl-Neelsen (ZN) stained acid-fast bacilli (AFB) in digital images. Automated, multi-stage, color-based Bayesian segmentation identified possible ‘TB objects’, removed artifacts by shape comparison and color-labeled objects as ‘definite’, ‘possible’ or ‘non-TB’, bypassing photomicrographic calibration. Superimposed AFB clusters, extreme stain variation and low depth of field were challenges. Our novel method facilitates electronic diagnosis of TB, permitting wider application in developing countries where fluorescent microscopy is currently inaccessible and unaffordable. We plan refinement and validation in the future. PMID:18419897

  1. 48 CFR 815.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Proposal analysis techniques. 815.404-1 Section 815.404-1 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... techniques. (a) Contracting officers are responsible for the technical and administrative sufficiency of the...

  2. Canalplasty: the technique and the analysis of its results

    NARCIS (Netherlands)

    van Spronsen, Erik; Ebbens, Fenna A.; Mirck, Peter G. B.; van Wettum, Cathelijne H. M.; van der Baan, Sieberen

    2013-01-01

    To describe the technique for canalplasty as performed in the Academic Medical Center, Amsterdam, the Netherlands, and to present the results of this technique. Retrospective chart analysis. Charts of patients who underwent a canalplasty procedure between 2001 and 2010 were reviewed for indication

  3. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  4. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  5. Identifying tropical dry forests extent and succession via the use of machine learning techniques

    Science.gov (United States)

    Li, Wei; Cao, Sen; Campos-Vargas, Carlos; Sanchez-Azofeifa, Arturo

    2017-12-01

    Information on ecosystem services as a function of the successional stage for secondary tropical dry forests (TDFs) is scarce and limited. Secondary TDF succession is defined as regrowth following a complete forest clearance for cattle ranching or agricultural activities. In the context of large conservation initiatives, the identification of the extent, structure and composition of secondary TDFs can serve as key elements to estimate the effectiveness of such activities. As such, in this study we evaluate the use of a Hyperspectral MAPper (HyMap) dataset and a waveform LIDAR dataset for characterization of different levels of intra-secondary forest stages at the Santa Rosa National Park (SRNP) Environmental Monitoring Super Site located in Costa Rica. Specifically, a multi-task learning based machine learning classifier (MLC-MTL) is employed on the first shortwave infrared band (SWIR1) of HyMap in order to identify the variability of aboveground biomass of secondary TDFs along a successional gradient. Our paper recognizes that the process of ecological succession is not deterministic but a combination of transitional forest types along a stochastic path that depends on ecological, edaphic, land use, and micro-meteorological conditions, and our results provide a new way to obtain the spatial distribution of three main types of TDF successional stages.

  6. New analysis technique for K-edge densitometry spectra

    International Nuclear Information System (INIS)

    Hsue, Sin-Tao; Collins, M.L.

    1995-01-01

    A method for simulating absorption edge densitometry has been developed. This program enables one to simulate spectra containing any combination of special nuclear materials (SNM) in solution. The method has been validated with an analysis method using a single SNM in solution or a combination of two types of SNM separated by a Z of 2. A new analysis technique for mixed solutions has been developed. This new technique has broader applications and eliminates the need for bias correction

  7. Identifying the "Right Stuff": An Exploration-Focused Astronaut Job Analysis

    Science.gov (United States)

    Barrett, J. D.; Holland, A. W.; Vessey, W. B.

    2015-01-01

    Industrial and organizational (I/O) psychologists play a key role in NASA astronaut candidate selection through the identification of the competencies necessary to successfully engage in the astronaut job. A set of psychosocial competencies, developed by I/O psychologists during a prior job analysis conducted in 1996 and updated in 2003, was identified as necessary for individuals working and living in the space shuttle and on the International Space Station (ISS). This set of competencies applied to the space shuttle and applies to current ISS missions, but may not apply to longer-duration or long-distance exploration missions. With the 2015 launch of the first 12-month ISS mission and the shift in the 2020s to missions beyond low Earth orbit, the type of missions that astronauts will conduct and the environment in which they do their work will change dramatically, leading to new challenges for these crews. To support future astronaut selection, training, and research, I/O psychologists in NASA's Behavioral Health and Performance (BHP) Operations and Research groups engaged in a joint effort to conduct an updated analysis of the astronaut job for current and future operations. This project will result in the identification of behavioral competencies critical to performing the astronaut job, along with relative weights for each of the identified competencies, through the application of job analysis techniques. While this job analysis is being conducted according to job analysis best practices, the project poses a number of novel challenges. These challenges include the need to identify competencies for multiple mission types simultaneously, to evaluate jobs that have no incumbents because they have never before been conducted, and to work with a very limited population of subject matter experts.
Given these challenges, under the guidance of job analysis experts, we used the following methods to conduct the job analysis and identify the key competencies for current and

  8. A Novel Technique for Identifying Patients with ICU Needs Using Hemodynamic Features

    Directory of Open Access Journals (Sweden)

    A. Jalali

    2012-01-01

    Full Text Available Identification of patients requiring intensive care is a critical issue in clinical treatment. The objective of this study is to develop a novel methodology using hemodynamic features for distinguishing such patients requiring intensive care from a group of healthy subjects. In this study, based on the hemodynamic features, subjects are divided into three groups: healthy, risky and patient. For each of the healthy and patient subjects, the evaluated features are based on the analysis of existing differences between hemodynamic variables: Blood Pressure and Heart Rate. Further, four criteria from the hemodynamic variables are introduced: circle criterion, estimation error criterion, Poincare plot deviation, and autonomic response delay criterion. For each of these criteria, three fuzzy membership functions are defined to distinguish patients from healthy subjects. Furthermore, based on the evaluated criteria, a scoring method is developed. In this scoring method, the membership degree of each subject is evaluated for each of the three classifying groups. Then, for each subject, the cumulative sum of the membership degrees over all four criteria is calculated. Finally, a given subject is assigned to the group with the largest cumulative sum. In summary, the scoring method results in 86% sensitivity, 94.8% positive predictive accuracy and 82.2% total accuracy.
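    The scoring step of this record can be sketched as follows. This is a toy illustration only: the triangular membership functions, the normalized [0, 1] criterion scale, and the shared thresholds are invented for the example and are not taken from the study.

```python
GROUPS = ("healthy", "risky", "patient")

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical triangles over a normalized [0, 1] criterion score,
# shared here by all four criteria for simplicity.
MFS = {
    "healthy": (-1.0, 0.0, 0.5),
    "risky":   (0.0, 0.5, 1.0),
    "patient": (0.5, 1.0, 2.0),
}

def classify(criterion_scores):
    """Sum membership degrees over all criteria and pick the largest total."""
    totals = {g: 0.0 for g in GROUPS}
    for x in criterion_scores:
        for g in GROUPS:
            totals[g] += tri(x, *MFS[g])
    return max(GROUPS, key=lambda g: totals[g]), totals
```

    For instance, four uniformly high criterion scores place a subject in the patient group, while uniformly low scores place the subject in the healthy group.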

  9. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, the main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, and pole/zero identification. The preliminary general scheme of a digital MCA is discussed, as well as some other important techniques for its engineering design. All these lay the foundation for developing homemade digital nuclear spectrometers. (authors)
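    The trapezoidal shaping named in this record can be sketched with the recursive moving-window-deconvolution form of the filter. This is a simplified single-pulse model in Python rather than the authors' MATLAB simulations, and the pulse parameters are illustrative.

```python
import math

def trapezoidal_shaper(v, k, m, tau):
    """Shape a sampled exponential pulse into a trapezoid.
    k: rise time in samples, m: flat-top width in samples, tau: decay
    constant of the input pulse in samples.  M deconvolves the
    exponential tail (pole-zero compensation); the output is normalized
    so the flat top equals the pulse amplitude."""
    M = 1.0 / (math.exp(1.0 / tau) - 1.0)
    l = k + m

    def x(i):
        return v[i] if i >= 0 else 0.0

    p, s, out = 0.0, 0.0, []
    for n in range(len(v)):
        d = x(n) - x(n - k) - x(n - l) + x(n - l - k)
        p += d              # first accumulator: running sum of differences
        s += p + M * d      # second accumulator with tail deconvolution
        out.append(s / (k * (M + 1.0)))
    return out

# A decaying pulse of amplitude 3.0 shaped with a 10-sample rise
# and a 5-sample flat top:
pulse = [3.0 * math.exp(-n / 50.0) for n in range(100)]
shaped = trapezoidal_shaper(pulse, k=10, m=5, tau=50.0)
```

    The flat top (samples k-1 through k+m-1) sits at the pulse amplitude, and the output returns to baseline once the trapezoid has passed, which is what makes the peak height a clean amplitude estimate.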

  10. Integrating subpathway analysis to identify candidate agents for hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Wang J

    2016-03-01

    Full Text Available Jiye Wang,1 Mi Li,2 Yun Wang,3 Xiaoping Liu4 1The Criminal Science and Technology Department, Zhejiang Police College, Hangzhou, Zhejiang Province, 2Department of Nursing, Shandong College of Traditional Chinese Medicine, Yantai, Shandong Province, 3Office Department of Gastroenterology, The First Affiliated Hospital of Xi’an Jiao Tong University, Xi’an, Shanxi Province, 4Key Laboratory of Systems Biology, Shanghai Institutes for Biological Sciences, Shanghai, People’s Republic of China Abstract: Hepatocellular carcinoma (HCC) is the second most common cause of cancer-associated death worldwide, characterized by a high invasiveness and resistance to normal anticancer treatments. The need to develop new therapeutic agents for HCC is urgent. Here, we developed a bioinformatics method to identify potential novel drugs for HCC by integrating HCC-related and drug-affected subpathways. By using the RNA-seq data from the TCGA (The Cancer Genome Atlas) database, we first identified 1,763 differentially expressed genes between HCC and normal samples. Next, we identified 104 significant HCC-related subpathways. We also identified the subpathways associated with small molecular drugs in the CMap database. Finally, by integrating HCC-related and drug-affected subpathways, we identified 40 novel small molecular drugs capable of targeting these HCC-involved subpathways. In addition to previously reported agents (i.e., calmidazolium), our method also identified potentially novel agents for targeting HCC. We experimentally verified that one of these novel agents, prenylamine, induced HCC cell apoptosis using 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide, an acridine orange/ethidium bromide stain, and electron microscopy. In addition, we found that prenylamine not only affected several classic apoptosis-related proteins, including Bax, Bcl-2, and cytochrome c, but also increased caspase-3 activity. These candidate small molecular drugs

  11. An Integrated Approach to Change the Outcome Part II: Targeted Neuromuscular Training Techniques to Reduce Identified ACL Injury Risk Factors

    Science.gov (United States)

    Myer, Gregory D.; Ford, Kevin R.; Brent, Jensen L.; Hewett, Timothy E.

    2014-01-01

    Prior reports indicate that female athletes who demonstrate high knee abduction moments (KAMs) during landing are more responsive to neuromuscular training designed to reduce KAM. Identification of female athletes who demonstrate high KAM, which accurately identifies those at risk for noncontact anterior cruciate ligament (ACL) injury, may be ideal for targeted neuromuscular training. Specific neuromuscular training targeted to the underlying biomechanical components that increase KAM may provide the most efficient and effective training strategy to reduce noncontact ACL injury risk. The purpose of the current commentary is to provide an integrative approach to identify and target mechanistic underpinnings to increased ACL injury in female athletes. Specific neuromuscular training techniques will be presented that address individual algorithm components related to high knee load landing patterns. If these integrated techniques are employed on a widespread basis, prevention strategies for noncontact ACL injury among young female athletes may prove both more effective and efficient. PMID:22580980

  12. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  13. Structural identifiability analysis of a cardiovascular system model.

    Science.gov (United States)

    Pironet, Antoine; Dauby, Pierre C; Chase, J Geoffrey; Docherty, Paul D; Revie, James A; Desaive, Thomas

    2016-05-01

    The six-chamber cardiovascular system model of Burkhoff and Tyberg has been used in several theoretical and experimental studies. However, this cardiovascular system model (and others derived from it) is not identifiable from every output set. In this work, two such cases of structural non-identifiability are first presented. These cases occur when the model output set only contains a single type of information (pressure or volume). A specific output set is thus chosen, mixing pressure and volume information and containing only a limited number of clinically available measurements. Then, by manipulating the model equations involving these outputs, it is demonstrated that the six-chamber cardiovascular system model is structurally globally identifiable. A further simplification is made, assuming known cardiac valve resistances. Because of the poor practical identifiability of these four parameters, this assumption is common. Under this hypothesis, the six-chamber cardiovascular system model is structurally identifiable from an even smaller dataset. As a consequence, parameter values computed from limited but well-chosen datasets are theoretically unique. This means that the parameter identification procedure can safely be performed on the model from such a well-chosen dataset. Thus, the model may be considered suitable for use in diagnosis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  14. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  15. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
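    The double random phase encoding scheme analysed in this record can be sketched in one dimension. This is illustrative only: real systems operate on 2-D images and continuous optical fields, and a genuine key-space analysis would scan many keys and plot the error distribution, whereas this sketch checks a single wrong key against the correct one.

```python
import cmath
import random

def dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform (stand-in for an optical FT)."""
    N = len(x)
    s = 1 if inverse else -1
    out = [sum(x[n] * cmath.exp(s * 2j * cmath.pi * k * n / N) for n in range(N))
           for k in range(N)]
    return [v / N for v in out] if inverse else out

def drpe_encrypt(f, phi1, phi2):
    """Encrypt a real signal f with an input-plane phase mask phi1 and a
    Fourier-plane phase mask phi2 (phases in radians)."""
    g = [f[n] * cmath.exp(1j * phi1[n]) for n in range(len(f))]
    G = dft(g)
    return dft([G[k] * cmath.exp(1j * phi2[k]) for k in range(len(G))], inverse=True)

def drpe_decrypt(e, phi2):
    """Undo the Fourier-plane mask; the input-plane mask only affects phase,
    so the amplitude recovers f when phi2 is correct."""
    E = dft(e)
    g = dft([E[k] * cmath.exp(-1j * phi2[k]) for k in range(len(E))], inverse=True)
    return [abs(v) for v in g]

rng = random.Random(0)
N = 32
f = [rng.random() for _ in range(N)]                     # plaintext amplitudes
phi1 = [rng.uniform(0, 2 * cmath.pi) for _ in range(N)]  # correct key, plane 1
phi2 = [rng.uniform(0, 2 * cmath.pi) for _ in range(N)]  # correct key, plane 2
wrong = [rng.uniform(0, 2 * cmath.pi) for _ in range(N)] # an incorrect key

e = drpe_encrypt(f, phi1, phi2)
err_right = max(abs(a - b) for a, b in zip(drpe_decrypt(e, phi2), f))
err_wrong = max(abs(a - b) for a, b in zip(drpe_decrypt(e, wrong), f))
```

    Decryption with the correct Fourier-plane key reproduces the plaintext to numerical precision, while an arbitrary wrong key leaves a large residual error; the paper's key-space analysis studies how that error is distributed as the key is varied.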

  16. A brain impact stress analysis using advanced discretization meshless techniques.

    Science.gov (United States)

    Marques, Marco; Belinha, Jorge; Dinis, Lúcia Maria Js; Natal Jorge, Renato

    2018-03-01

    The objective of this work is to compare the mechanical behaviour of a brain under impact using an alternative numerical meshless technique. A discrete geometrical model of a brain was constructed from medical images. This technique achieves a discretization with realistic geometry and allows the mechanical properties to be defined locally according to the colour scale of the medical images. After defining the discrete geometrical model of the brain, the essential and natural boundary conditions were imposed to reproduce a sudden impact force. The analysis was performed using both the finite element method and the radial point interpolation method, an advanced discretization technique, and the results of the two techniques are compared. When compared with finite element analysis, it was verified that meshless methods possess a higher convergence rate and are capable of producing smoother variable fields.

  17. Use of Photogrammetry and Biomechanical Gait analysis to Identify Individuals

    DEFF Research Database (Denmark)

    Larsen, Peter Kastmand; Simonsen, Erik Bruun; Lynnerup, Niels

    Photogrammetry and recognition of gait patterns are valuable tools to help identify perpetrators based on surveillance recordings. We have found that stature, but only a few other measures, has satisfactory reproducibility for use in forensics. Several gait variables with high recognition rates were

  18. Gene expression analysis identifies global gene dosage sensitivity in cancer

    DEFF Research Database (Denmark)

    Fehrmann, Rudolf S. N.; Karjalainen, Juha M.; Krajewska, Malgorzata

    2015-01-01

    Many cancer-associated somatic copy number alterations (SCNAs) are known. Currently, one of the challenges is to identify the molecular downstream effects of these variants. Although several SCNAs are known to change gene expression levels, it is not clear whether each individual SCNA affects gen...

  19. Identifying clinical course patterns in SMS data using cluster analysis

    DEFF Research Database (Denmark)

    Kent, Peter; Kongsted, Alice

    2012-01-01

    ABSTRACT: BACKGROUND: Recently, there has been interest in using the short message service (SMS or text messaging), to gather frequent information on the clinical course of individual patients. One possible role for identifying clinical course patterns is to assist in exploring clinically importa...

  20. Performance analysis of clustering techniques over microarray data: A case study

    Science.gov (United States)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis, and cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. Because the grading approach depends on the characteristics of the dataset as well as on the validity indices, a two-stage grading approach is implemented. In this study the grading approach is applied to five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also confirmed by the Nemenyi post-hoc hypothesis test.
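    The grading idea of this record, scoring clustering outputs with a validity index and ranking the candidates, can be sketched in miniature. This illustration uses plain k-means on synthetic 2-D blobs graded by the silhouette index, not the five techniques, seven indices or microarray data of the study.

```python
import math
import random

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def kmeans(points, k, iters=30, seed=1):
    """Plain Lloyd's algorithm; returns one cluster label per point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: dist(p, centers[c]))
        for c in range(k):
            members = [p for i, p in enumerate(points) if labels[i] == c]
            if members:  # abandoned centers keep their old position
                centers[c] = (sum(p[0] for p in members) / len(members),
                              sum(p[1] for p in members) / len(members))
    return labels

def silhouette(points, labels):
    """Mean silhouette width; singleton clusters score 0 by convention."""
    total = 0.0
    for i, p in enumerate(points):
        same = [q for j, q in enumerate(points) if labels[j] == labels[i] and j != i]
        if not same:
            continue
        a = sum(dist(p, q) for q in same) / len(same)
        b = min(sum(dist(p, q) for j, q in enumerate(points) if labels[j] == c) /
                sum(1 for j in range(len(points)) if labels[j] == c)
                for c in set(labels) if c != labels[i])
        total += (b - a) / max(a, b)
    return total / len(points)

rng = random.Random(0)
blob = lambda cx, cy: [(cx + rng.gauss(0, 0.5), cy + rng.gauss(0, 0.5))
                       for _ in range(20)]
data = blob(0.0, 0.0) + blob(10.0, 10.0)

# Grade candidate settings by the validity index: on two blobs, k=2 should win.
scores = {k: silhouette(data, kmeans(data, k)) for k in (2, 3)}
```

    A fuller grading scheme would repeat this over several datasets and indices and then rank the techniques, as the two-stage approach in the record does.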

  1. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and carry both provenance information and age characteristics. Analyzing ancient ceramics with modern analytical methods is the scientific foundation of the study of Chinese porcelain. The functions and applications of nuclear analysis techniques in this field are discussed according to their properties. (authors)

  2. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to a reader who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing

  3. Regional environmental analysis and management: New techniques for current problems

    Science.gov (United States)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  4. [TXRF technique and quantitative analysis of mollusc teeth].

    Science.gov (United States)

    Tian, Y; Liu, K; Wu, X; Zheng, S

    1999-06-01

    Total reflection X-ray fluorescence (TXRF) analysis and an instrument with a short path, high efficiency, low power and small volume are briefly presented. The detection limits of the system are at the pg level for Cu and Mo target excitation. Teeth of a marine mollusc were measured quantitatively, and the spectrum and analysis results are given.

  5. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  6. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  7. Integrated Analysis Identifies Interaction Patterns between Small Molecules and Pathways

    Science.gov (United States)

    Li, Yan; Li, Weiguo; Chen, Xin; Sun, Jiatong; Chen, Huan; Lv, Sali

    2014-01-01

    Previous studies have indicated that the downstream proteins in a key pathway can be potential drug targets and that the pathway can play an important role in the action of drugs, so pathways can be considered as targets of small molecules. A link map between small molecules and pathways was constructed using gene expression profiles, pathway data, and the gene expression of cancer cell lines treated with small molecules, and we then analysed the topological characteristics of the link map. Three link patterns were identified based on different drug discovery implications for breast, liver, and lung cancer. Furthermore, molecules that significantly targeted the same pathways tended to treat the same diseases. These results can provide a valuable reference for identifying drug candidates and targets in molecularly targeted therapy. PMID:25114931

  8. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more demanding the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté; its execution depends on a high level of technique during the performer's rotation. Performing this element requires not only good physical condition but also correct technique from the dancer. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by leading Chinese dancers. The analysis used stereoscopic imaging and theoretical analysis.

  9. Association analysis identifies ZNF750 regulatory variants in psoriasis

    Directory of Open Access Journals (Sweden)

    Birnbaum Ramon Y

    2011-12-01

    Full Text Available Abstract Background Mutations in the ZNF750 promoter and coding regions have been previously associated with Mendelian forms of psoriasis and psoriasiform dermatitis. ZNF750 encodes a putative zinc finger transcription factor that is highly expressed in keratinocytes and represents a candidate psoriasis gene. Methods We examined whether ZNF750 variants were associated with psoriasis in a large case-control population. We sequenced the promoter and exon regions of ZNF750 in 716 Caucasian psoriasis cases and 397 Caucasian controls. Results We identified a total of 47 variants, including 38 rare variants of which 35 were novel. Association testing identified two ZNF750 haplotypes associated with psoriasis (p ZNF750 promoter and 5' UTR variants displayed a 35-55% reduction of ZNF750 promoter activity, consistent with the promoter activity reduction seen in a Mendelian psoriasis family with a ZNF750 promoter variant. However, the rare promoter and 5' UTR variants identified in this study did not strictly segregate with the psoriasis phenotype within families. Conclusions Two haplotypes of ZNF750 and rare 5' regulatory variants of ZNF750 were found to be associated with psoriasis. These rare 5' regulatory variants, though not causal, might serve as a genetic modifier of psoriasis.

  10. Using Factor Analysis to Identify Topic Preferences Within MBA Courses

    Directory of Open Access Journals (Sweden)

    Earl Chrysler

    2003-02-01

    Full Text Available This study demonstrates the role of a principal components factor analysis in conducting a gap analysis of the desired characteristics of business alumni. Typically, gap analyses merely compare the emphases that should be given to areas of inquiry with perceptions of actual emphases. As a result, the focus is upon depth of coverage. A neglected area in need of investigation is the breadth of topic dimensions and their differences between the normative (should offer) and the descriptive (actually offer). The implications of factor structures, as well as traditional gap analyses, are developed and discussed in the context of outcomes assessment.

  11. Data analysis techniques: a tool for cumulative exposure assessment.

    Science.gov (United States)

    Lalloué, Benoît; Monnez, Jean-Marie; Padilla, Cindy; Kihal, Wahida; Zmirou-Navier, Denis; Deguen, Séverine

    2015-01-01

    Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g. green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, in order to assess the environmental burden experienced by populations. We illustrate this approach in a large French metropolitan area. The study was carried out in the Great Lyon area (France, 1.2 M inhabitants) at the census Block Group (BG) scale. We used as environmental indicators ambient air NO2 annual concentrations, noise levels and proximity to green spaces, to industrial plants, to polluted sites and to road traffic. They were synthesized using Multiple Factor Analysis (MFA), a data-driven technique without a priori modeling, followed by a Hierarchical Clustering to create BG classes. The first components of the MFA explained, respectively, 30, 14, 11 and 9% of the total variance. Clustering into five classes grouped the BGs as: (1) a particular type of large BGs without population; (2) BGs of green residential areas, with less negative exposures than average; (3) BGs of residential areas near midtown; (4) BGs close to industries; and (5) midtown urban BGs, with higher negative exposures than average and less green space. Other numbers of classes were tested in order to assess a variety of clusterings. We present an approach using statistical factor and cluster analysis techniques, which seem to be overlooked for assessing cumulative exposure in complex environmental settings. Although it cannot be applied directly for risk or health effect assessment, the resulting index can help to identify hot spots of cumulative exposure, to prioritize urban policies or to compare the environmental burden across study areas in an epidemiological framework.
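    The dimension-reduction step behind such a composite index can be sketched with a first principal component extracted by power iteration. This is a simplified stand-in for the Multiple Factor Analysis used in the study, and the synthetic "block group" indicators driven by a single latent burden are invented for the illustration.

```python
import math
import random

def standardize(rows):
    """Center and scale each indicator (column) to zero mean, unit variance."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    sds = [math.sqrt(sum((r[j] - means[j]) ** 2 for r in rows) / (n - 1))
           for j in range(p)]
    return [[(r[j] - means[j]) / sds[j] for j in range(p)] for r in rows]

def first_pc(Z, iters=500):
    """Leading eigenvector of the sample covariance matrix via power iteration."""
    n, p = len(Z), len(Z[0])
    C = [[sum(Z[i][a] * Z[i][b] for i in range(n)) / (n - 1) for b in range(p)]
         for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def composite_index(rows):
    """Score each observation by its projection onto the first component."""
    Z = standardize(rows)
    v = first_pc(Z)
    return [sum(z[j] * v[j] for j in range(len(v))) for z in Z]

# Synthetic block groups: four noisy indicators driven by one latent burden.
rng = random.Random(0)
latent = [rng.random() for _ in range(200)]
rows = [[x + rng.gauss(0, 0.2) for _ in range(4)] for x in latent]
index = composite_index(rows)
```

    Because the indicators share one underlying factor, the first-component score tracks the latent burden closely (up to sign); the study's MFA plays the same role for its heterogeneous indicator groups before the clustering step.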

  12. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  13. Using Linguistic Analysis to Identify High Performing Teams

    Science.gov (United States)

    2006-06-01

    used in two studies. Specifically, participants were told: Many people have made an extensive analysis into the effects of overpopulation, chemical... pollution, and air and water pollution. A frequent conclusion is that the next 5 to 10 years are critical because if significant changes in our society

  14. Identifying Innovative Interventions to Promote Healthy Eating Using Consumption-Oriented Food Supply Chain Analysis.

    Science.gov (United States)

    Hawkes, Corinna

    2009-07-01

    The mapping and analysis of supply chains is a technique increasingly used to address problems in the food system. Yet such supply chain management has not yet been applied as a means of encouraging healthier diets. Moreover, most policies recommended to promote healthy eating focus on the consumer end of the chain. This article proposes a consumption-oriented food supply chain analysis to identify the changes needed in the food supply chain to create a healthier food environment, measured in terms of food availability, prices, and marketing. Along with established forms of supply chain analysis, the method is informed by a historical overview of how food supply chains have changed over time. The method posits that the actors and actions in the chain are affected by organizational, financial, technological, and policy incentives and disincentives, which can in turn be levered for change. It presents a preliminary example of the supply of Coca-Cola beverages into school vending machines and identifies further potential applications. These include fruit and vegetable supply chains, local food chains, supply chains for health-promoting versions of food products, and identifying financial incentives in supply chains for healthier eating.

  15. A simple technique to identify key recruitment issues in randomised controlled trials: Q-QAT - Quanti-Qualitative Appointment Timing.

    Science.gov (United States)

    Paramasivan, Sangeetha; Strong, Sean; Wilson, Caroline; Campbell, Bruce; Blazeby, Jane M; Donovan, Jenny L

    2015-03-11

    Recruitment to pragmatic randomised controlled trials (RCTs) is acknowledged to be difficult, and few interventions have proved to be effective. Previous qualitative research has consistently revealed that recruiters provide imbalanced information about RCT treatments. However, qualitative research can be time-consuming to apply. Within a programme of research to optimise recruitment and informed consent in challenging RCTs, we developed a simple technique, Q-QAT (Quanti-Qualitative Appointment Timing), to systematically investigate and quantify the imbalance to help identify and address recruitment difficulties. The Q-QAT technique comprised: 1) quantification of time spent discussing the RCT and its treatments using transcripts of audio-recorded recruitment appointments, 2) targeted qualitative research to understand the obstacles to recruitment and 3) feedback to recruiters on opportunities for improvement. This was applied to two RCTs with different clinical contexts and recruitment processes. Comparisons were made across clinical centres, recruiters and specialties. In both RCTs, the Q-QAT technique first identified considerable variations in the time spent by recruiters discussing the RCT and its treatments. The patterns emerging from this initial quantification of recruitment appointments then enabled targeted qualitative research to understand the issues and make suggestions to improve recruitment. In RCT1, presentation of the treatments was balanced, but little time was devoted to describing the RCT. Qualitative research revealed patients would have considered participation, but lacked awareness of the RCT. In RCT2, the balance of treatment presentation varied by specialists and centres. Qualitative research revealed difficulties with equipoise and confidence among recruiters presenting the RCT. The quantitative and qualitative findings were well-received by recruiters and opportunities to improve information provision were discussed. A blind coding

  16. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The efficiency of the Random Decrement technique for the estimation of correlation functions is compared to other equivalent methods (FFT, direct method). It is shown that the Random Decrement technique can be as much as a hundred times faster than other methods. The theory behind the Random Decrement technique is expanded to include both a vector formulation that increases speed considerably, and a new method for the prediction of the variance of the estimated Random Decrement functions. The thesis closes with a number of examples of modal analysis of bridges exposed to natural (ambient) load.
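The core Random Decrement estimate is simple to state: collect every segment of the measured response that starts at an up-crossing of a trigger level, and average them; the average (the RD signature) is proportional to the free decay of the structure. A minimal sketch on a synthetic lightly damped oscillator (all parameters are made up, standing in for an ambient-excited bridge):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulate a lightly damped SDOF oscillator under random loading
# (hypothetical parameters: fs in Hz, natural frequency f0, damping ratio zeta).
fs, f0, zeta = 100.0, 2.0, 0.02
n = 20000
w0 = 2 * np.pi * f0
dt = 1.0 / fs
x = np.zeros(n)
v = 0.0
for i in range(1, n):
    a = -2 * zeta * w0 * v - w0**2 * x[i - 1] + rng.normal(0.0, 50.0)
    v += a * dt                    # semi-implicit Euler step
    x[i] = x[i - 1] + v * dt

def random_decrement(x, level, seg_len):
    """Average all segments of length seg_len starting at up-crossings of level."""
    triggers = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
    triggers = triggers[triggers + seg_len <= len(x)]
    return np.mean([x[t:t + seg_len] for t in triggers], axis=0), len(triggers)

rd, n_trig = random_decrement(x, level=x.std(), seg_len=200)
```

The decaying oscillation in `rd` carries the modal frequency and damping; this averaging is why the technique can be far cheaper than FFT-based correlation estimation.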

  17. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisition. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab
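The quantitative step in NAA rests on the activation equation A = N σ φ (1 − e^(−λ t_irr)), where N is the number of target atoms, σ the capture cross-section and φ the neutron flux. A rough sketch of the forward calculation and the inverse step (mass from measured activity) follows; the flux and irradiation time are taken from the record, but the nuclide constants are hypothetical placeholders, not the paper's data:

```python
import math

N_A   = 6.022e23      # Avogadro's number, 1/mol
phi   = 1.6e11        # thermal flux, n/cm^2/s (value quoted in the record)
sigma = 13.3e-24      # capture cross-section, cm^2 (hypothetical, cobalt-like)
M     = 59.0          # molar mass, g/mol (hypothetical)
theta = 1.0           # isotopic abundance (hypothetical)
half_life = 5.27 * 365 * 86400          # s (hypothetical, Co-60-like)
lam = math.log(2) / half_life           # decay constant, 1/s
t_irr = 48 * 3600                       # 48 h long irradiation, as in the record

def activity(mass_g):
    """Induced activity (Bq) at end of irradiation for a given element mass."""
    n_atoms = mass_g * N_A * theta / M
    return n_atoms * sigma * phi * (1 - math.exp(-lam * t_irr))

def mass_from_activity(A_bq):
    """Invert the activation equation: estimate element mass from activity."""
    n_atoms = A_bq / (sigma * phi * (1 - math.exp(-lam * t_irr)))
    return n_atoms * M / (N_A * theta)

A = activity(1e-3)    # induced activity of a 1 mg trace element
```

In practice the measured gamma count rate is corrected for decay, detector efficiency and gamma yield before this inversion, and epithermal activation is handled separately as the abstract notes.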

  18. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  19. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  20. Compartmental analysis of dynamic nuclear medicine data: models and identifiability

    Science.gov (United States)

    Delbary, Fabrice; Garbarino, Sara; Vivaldi, Valentina

    2016-12-01

    Compartmental models based on tracer mass balance are extensively used in clinical and pre-clinical nuclear medicine in order to obtain quantitative information on tracer metabolism in the biological tissue. This paper is the first of a series of two that deal with the problem of tracer coefficient estimation via compartmental modelling in an inverse problem framework. Specifically, here we discuss the identifiability problem for a general n-dimensional compartmental system and provide uniqueness results in the case of two-compartment and three-compartment models. The second paper will utilize this framework in order to show how nonlinear regularization schemes can be applied to obtain numerical estimates of the tracer coefficients in the case of nuclear medicine data corresponding to brain, liver and kidney physiology.
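The class of models the record discusses can be illustrated with the simplest case: a closed two-compartment exchange, dC1/dt = −k12 C1 + k21 C2 and dC2/dt = k12 C1 − k21 C2. The rate constants below are arbitrary illustrative values, not results from the paper; the sketch just shows the mass-balance structure whose coefficients the inverse problem tries to recover:

```python
import numpy as np

# Hypothetical exchange rate constants (1/min).
k12, k21 = 0.3, 0.1
dt, steps = 0.01, 5000                 # integrate over 50 "minutes"
C = np.array([1.0, 0.0])               # all tracer initially in compartment 1
A = np.array([[-k12,  k21],
              [ k12, -k21]])           # mass-balance matrix (columns sum to 0)
for _ in range(steps):
    C = C + dt * (A @ C)               # forward-Euler integration

total = C.sum()                        # conserved: the system is closed
```

At equilibrium the concentration ratio C2/C1 tends to k12/k21, which is exactly the kind of relation that makes only certain combinations of rate constants identifiable from a single measured output.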

  1. Chemical analysis of a new kinematically identified stellar group .

    Science.gov (United States)

    Ženovienė, R.; Tautvaišienė, G.; Nordström, B.; Stonkutė, E.

    We have started a study of the chemical composition of a new kinematically identified group of stars in the Galactic disc. Based on their dynamical properties, those stars were suspected to belong to a disrupted satellite. The main atmospheric parameters and chemical composition were determined for thirty-two stars from high-resolution spectra obtained at the Nordic Optical Telescope with the spectrograph FIES. In this contribution the preliminary results of the chemical composition study are presented. The metallicities of the investigated stars lie in the interval -0.6 < [Fe/H] < -0.2; their oxygen and alpha-elements are overabundant in comparison to Galactic thin-disc dwarfs in this metallicity range. This provides further evidence of their common and possibly extragalactic origin.

  2. Association analysis identifies 65 new breast cancer risk loci

    DEFF Research Database (Denmark)

    Michailidou, Kyriaki; Lindström, Sara; Dennis, Joe

    2017-01-01

    Breast cancer risk is influenced by rare coding variants in susceptibility genes, such as BRCA1, and many common, mostly non-coding variants. However, much of the genetic contribution to breast cancer risk remains unknown. Here we report the results of a genome-wide association study of breast cancer in 122,977 cases and 105,974 controls of European ancestry and 14,068 cases and 13,104 controls of East Asian ancestry. We identified 65 new loci that are associated with overall breast cancer risk at P … risk single-nucleotide polymorphisms in these loci fall … -nucleotide polymorphisms in regulatory features was 2-5-fold enriched relative to the genome-wide average, with strong enrichment for particular transcription factor binding sites. These results provide further insight into genetic susceptibility to breast cancer and will improve the use of genetic risk scores …

  3. Image analysis techniques for the study of turbulent flows

    Science.gov (United States)

    Ferrari, Simone

    In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. In particular, the focus is on techniques developed by the research teams the author worked in, which can be considered relatively "low cost" techniques. Digital Image Analysis techniques have the advantage, when compared to traditional techniques employing physical point probes, of being non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow two-dimensional or three-dimensional fields of the measured quantity to be obtained in less time. Traditionally, the disadvantages are related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 kHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can consequently be considered a flexible and powerful tool for measurements on turbulent flows.
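The displacement (and hence velocity) measurements mentioned here typically reduce, PIV-style, to locating the peak of the cross-correlation between two successive frames. A minimal FFT-based sketch on synthetic images with a known shift (the frames and the shift are fabricated for illustration, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
# Two synthetic "frames": the second is the first circularly shifted by a
# known displacement, mimicking a particle-image pair.
n = 64
frame1 = rng.random((n, n))
true_shift = (3, 5)                              # (rows, cols) ground truth
frame2 = np.roll(frame1, true_shift, axis=(0, 1))

# FFT-based circular cross-correlation: the peak location gives the shift.
F = np.fft.fft2(frame1) * np.conj(np.fft.fft2(frame2))
corr = np.fft.ifft2(F).real
peak = np.unravel_index(np.argmax(corr), corr.shape)

def to_signed(idx, size):
    """Map an FFT index to a signed lag in [-size/2, size/2)."""
    return idx - size if idx > size // 2 else idx

dy = -to_signed(peak[0], n)
dx = -to_signed(peak[1], n)
```

Dividing the recovered displacement by the inter-frame time gives the local velocity; real PIV repeats this on small interrogation windows to build the full two-dimensional velocity field.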

  4. Image analysis techniques for the study of turbulent flows

    Directory of Open Access Journals (Sweden)

    Ferrari Simone

    2017-01-01

    Full Text Available In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. In particular, the focus is on techniques developed by the research teams the author worked in, which can be considered relatively “low cost” techniques. Digital Image Analysis techniques have the advantage, when compared to traditional techniques employing physical point probes, of being non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow two-dimensional or three-dimensional fields of the measured quantity to be obtained in less time. Traditionally, the disadvantages are related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 kHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can consequently be considered a flexible and powerful tool for measurements on turbulent flows.

  5. Book Review: Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects

    Directory of Open Access Journals (Sweden)

    Thomas Nash

    2013-06-01

    Full Text Available Shavers, B. (2013). Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Waltham, MA: Elsevier, 290 pages, ISBN 978-1-59749-985-9, US$51.56. Includes bibliographical references and index. Reviewed by Detective Corporal Thomas Nash (tnash@bpdvt.org), Burlington Vermont Police Department, Internet Crimes Against Children Task Force; Adjunct Instructor, Champlain College, Burlington, VT. In this must-read for any aspiring novice cybercrime investigator as well as the seasoned professional computer guru alike, Brett Shavers takes the reader into the ever-changing and dynamic world of cybercrime investigation. Shavers, an experienced criminal investigator, lays out the details and intricacies of a computer-related crime investigation in a clear and concise manner in his new easy-to-read publication, Placing the Suspect behind the Keyboard: Using Digital Forensics and Investigative Techniques to Identify Cybercrime Suspects. Shavers takes the reader from start to finish through each step of the investigative process in well-organized and easy-to-follow sections, with real case file examples, to reach the ultimate goal of any investigation: identifying the suspect and proving their guilt in the crime. Do not be fooled by the title. This excellent, easily accessible reference is beneficial to criminal as well as civil investigations and should be in every investigator's library regardless of their respective criminal or civil investigative responsibilities. (See PDF for full review.)

  6. Structural brain change in Attention Deficit Hyperactivity Disorder identified by meta-analysis.

    Science.gov (United States)

    Ellison-Wright, Ian; Ellison-Wright, Zoë; Bullmore, Ed

    2008-06-30

    The authors sought to map gray matter changes in Attention Deficit Hyperactivity Disorder (ADHD) using a novel technique incorporating neuro-imaging and genetic meta-analysis methods. A systematic search was conducted for voxel-based structural magnetic resonance imaging studies of patients with ADHD (or with related disorders) in relation to comparison groups. The authors carried out meta-analyses of the co-ordinates of gray matter differences. For the meta-analyses they hybridised the standard method of Activation Likelihood Estimation (ALE) with the rank approach used in Genome Scan Meta-Analysis (GSMA). This system detects three-dimensional conjunctions of co-ordinates from multiple studies and permits the weighting of studies in relation to sample size. For gray matter decreases, there were 7 studies including a total of 114 patients with ADHD (or related disorders) and 143 comparison subjects. Meta-analysis of these studies identified a significant regional gray matter reduction in ADHD in the right putamen/globus pallidus region. Four studies reported gray matter increases in ADHD but no regional increase was identified by meta-analysis. In ADHD there is gray matter reduction in the right putamen/globus pallidus region. This may be an anatomical marker for dysfunction in frontostriatal circuits mediating cognitive control. Right putamen lesions have been specifically associated with ADHD symptoms after closed head injuries in children.

  7. Structural brain change in Attention Deficit Hyperactivity Disorder identified by meta-analysis

    Directory of Open Access Journals (Sweden)

    Ellison-Wright Zoë

    2008-06-01

    Full Text Available Abstract Background The authors sought to map gray matter changes in Attention Deficit Hyperactivity Disorder (ADHD) using a novel technique incorporating neuro-imaging and genetic meta-analysis methods. Methods A systematic search was conducted for voxel-based structural magnetic resonance imaging studies of patients with ADHD (or with related disorders) in relation to comparison groups. The authors carried out meta-analyses of the co-ordinates of gray matter differences. For the meta-analyses they hybridised the standard method of Activation Likelihood Estimation (ALE) with the rank approach used in Genome Scan Meta-Analysis (GSMA). This system detects three-dimensional conjunctions of co-ordinates from multiple studies and permits the weighting of studies in relation to sample size. Results For gray matter decreases, there were 7 studies including a total of 114 patients with ADHD (or related disorders) and 143 comparison subjects. Meta-analysis of these studies identified a significant regional gray matter reduction in ADHD in the right putamen/globus pallidus region. Four studies reported gray matter increases in ADHD but no regional increase was identified by meta-analysis. Conclusion In ADHD there is gray matter reduction in the right putamen/globus pallidus region. This may be an anatomical marker for dysfunction in frontostriatal circuits mediating cognitive control. Right putamen lesions have been specifically associated with ADHD symptoms after closed head injuries in children.

  8. Monoallelic mutation analysis (MAMA) for identifying germline mutations.

    Science.gov (United States)

    Papadopoulos, N; Leach, F S; Kinzler, K W; Vogelstein, B

    1995-09-01

    Dissection of germline mutations in a sensitive and specific manner presents a continuing challenge. In dominantly inherited diseases, mutations occur in only one allele and are often masked by the normal allele. Here we report the development of a sensitive and specific diagnostic strategy based on somatic cell hybridization termed MAMA (monoallelic mutation analysis). We have demonstrated the utility of this strategy in two different hereditary colorectal cancer syndromes, one caused by a defective tumour suppressor gene on chromosome 5 (familial adenomatous polyposis, FAP) and the other caused by a defective mismatch repair gene on chromosome 2 (hereditary non-polyposis colorectal cancer, HNPCC).

  9. Independent component analysis of high-resolution imaging data identifies distinct functional domains

    DEFF Research Database (Denmark)

    Reidl, Juergen; Starke, Jens; Omer, David

    2007-01-01

    Here we demonstrate that principal component analysis (PCA) followed by spatial independent component analysis (sICA) can be exploited to reduce the dimensionality of data sets recorded in the olfactory bulb and the somatosensory cortex of mice, as well as the visual cortex of monkeys, without losing … latencies can be identified. This is shown for recordings of olfactory receptor neuron input measured with a calcium-sensitive axon tracer and for network dynamics measured with the voltage-sensitive dye RH 1838. In the somatosensory cortex, barrels responding to the stimulation of single whiskers can be automatically detected. In the visual cortex, orientation columns can be extracted. In all cases, artifacts due to movement, heartbeat or respiration were separated from the functional signal by sICA and could be removed from the data set. sICA is therefore a powerful technique for data compression, unbiased …
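A minimal sketch of the PCA-then-ICA pipeline on toy data: two synthetic non-Gaussian sources (standing in for functional signal and a rhythmic artifact) are mixed into two channels, whitened by PCA, then unmixed. FastICA with a tanh nonlinearity is used here as one common ICA workhorse, not necessarily the authors' implementation, and all signals and the mixing matrix are fabricated:

```python
import numpy as np

rng = np.random.default_rng(3)
# Two independent non-Gaussian sources mixed into two recorded channels.
t = np.linspace(0, 1, 2000)
S = np.vstack([np.sign(np.sin(2 * np.pi * 7 * t)),   # square-ish "signal"
               rng.uniform(-1, 1, t.size)])           # uniform "artifact"
A = np.array([[1.0, 0.6], [0.4, 1.0]])                # hypothetical mixing
X = A @ S

# 1) PCA whitening: decorrelate and scale to unit variance.
Xc = X - X.mean(1, keepdims=True)
cov = Xc @ Xc.T / Xc.shape[1]
d, E = np.linalg.eigh(cov)
Xw = (E / np.sqrt(d)).T @ Xc                          # identity covariance

# 2) Symmetric FastICA with tanh nonlinearity.
W = rng.normal(size=(2, 2))
for _ in range(200):
    G = np.tanh(W @ Xw)
    W_new = (G @ Xw.T) / Xw.shape[1] - np.diag((1 - G**2).mean(1)) @ W
    U, _, Vt = np.linalg.svd(W_new)                   # symmetric decorrelation
    W = U @ Vt
S_est = W @ Xw                                        # recovered sources
```

Each recovered component matches one true source up to sign and scale; dropping the "artifact" component before reconstruction is the cleanup step the abstract describes.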

  10. An integrated technique for the analysis of skin bite marks.

    Science.gov (United States)

    Bernitz, Herman; Owen, Johanna H; van Heerden, Willie F P; Solheim, Tore

    2008-01-01

    The high number of murder, rape, and child abuse cases in South Africa has led to increased numbers of bite mark cases being heard in high courts. Objective analysis to match perpetrators to bite marks at crime scenes must be able to withstand vigorous cross-examination to be of value in conviction of perpetrators. An analysis technique is described in four stages, namely determination of the mark to be a human bite mark, pattern association analysis, metric analysis and comparison with the population data, and illustrated by a real case study. New and accepted techniques are combined to determine the likelihood ratio of guilt expressed as one of a range of conclusions described in the paper. Each stage of the analysis adds to the confirmation (or rejection) of concordance between the dental features present on the victim and the dentition of the suspect. The results illustrate identification to a high degree of certainty.

  11. Genetic analysis of CHARGE syndrome identifies overlapping molecular biology.

    Science.gov (United States)

    Moccia, Amanda; Srivastava, Anshika; Skidmore, Jennifer M; Bernat, John A; Wheeler, Marsha; Chong, Jessica X; Nickerson, Deborah; Bamshad, Michael; Hefner, Margaret A; Martin, Donna M; Bielas, Stephanie L

    2018-01-04

    Purpose: CHARGE syndrome is an autosomal-dominant, multiple congenital anomaly condition characterized by vision and hearing loss, congenital heart disease, and malformations of craniofacial and other structures. Pathogenic variants in CHD7, encoding adenosine triphosphate-dependent chromodomain helicase DNA binding protein 7, are present in the majority of affected individuals. However, no causal variant can be found in 5-30% (depending on the cohort) of individuals with a clinical diagnosis of CHARGE syndrome. Methods: We performed whole-exome sequencing (WES) on 28 families from which at least one individual presented with features highly suggestive of CHARGE syndrome. Results: Pathogenic variants in CHD7 were present in 15 of 28 individuals (53.6%), whereas 4 (14.3%) individuals had pathogenic variants in other genes (RERE, KMT2D, EP300, or PUF60). A variant of uncertain clinical significance in KDM6A was identified in one (3.5%) individual. The remaining eight (28.6%) individuals were not found to have pathogenic variants by WES. Conclusion: These results demonstrate that the phenotypic features of CHARGE syndrome overlap with multiple other rare single-gene syndromes. Additionally, they implicate a shared molecular pathology that disrupts epigenetic regulation of multiple-organ development. GENETICS in MEDICINE advance online publication, 4 January 2018; doi:10.1038/gim.2017.233.

  12. Immunogenicity of novel Dengue virus epitopes identified by bioinformatic analysis.

    Science.gov (United States)

    Sánchez-Burgos, Gilma; Ramos-Castañeda, José; Cedillo-Rivera, Roberto; Dumonteil, Eric

    2010-10-01

    We used T-cell epitope prediction tools to identify epitopes from Dengue virus polyprotein sequences, and evaluated in vivo and in vitro the immunogenicity and antigenicity of the corresponding synthetic vaccine candidates. Twenty-two epitopes were predicted to have a high affinity for MHC class I (H-2Kd, H-2Dd, H-2Ld alleles) or class II (IAd alleles). These epitopes were conserved between the four virus serotypes, but with no similarity to human and mouse sequences. Thirteen synthetic peptides induced specific antibody production, with or without T-cell activation, in mice. Three synthetic peptides induced mostly IgG antibodies, and one of these, from the E gene, induced a neutralizing response. Ten peptides induced a combination of humoral and cellular responses by CD4+ and CD8+ T cells. Twelve peptides were novel B- and T-cell epitopes. These results indicate that our bioinformatics strategy is a powerful tool for the identification of novel antigens, and its application to human HLA may lead to a potent epitope-based vaccine against Dengue virus and many other pathogens. (c) 2010 Elsevier B.V. All rights reserved.

  13. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
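One of the unsupervised techniques named in this record, fuzzy c-means clustering, can be sketched compactly: unlike k-means, each sample receives a soft membership in every cluster. The toy two-regime data below is a fabricated stand-in for ADCP current-profile features, not real oceanographic data:

```python
import numpy as np

rng = np.random.default_rng(4)
# Two synthetic "current regimes" in a 2-D feature space (illustrative only).
X = np.vstack([rng.normal(0.0, 0.5, (100, 2)),
               rng.normal(3.0, 0.5, (100, 2))])

def fuzzy_cmeans(X, c=2, m=2.0, iters=100):
    """Standard fuzzy c-means: soft memberships U (rows sum to 1), centers V."""
    U = rng.dirichlet(np.ones(c), size=len(X))        # random initial memberships
    for _ in range(iters):
        Um = U ** m
        V = (Um.T @ X) / Um.T.sum(1, keepdims=True)   # membership-weighted centers
        D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        inv = D ** (-2.0 / (m - 1.0))                 # u_ik ∝ d_ik^(-2/(m-1))
        U = inv / inv.sum(1, keepdims=True)
    return U, V

U, V = fuzzy_cmeans(X)
labels = U.argmax(1)        # hard assignment for inspection
```

The fuzzifier m controls how soft the boundaries are; the cluster-validity indices mentioned in the abstract (e.g. a clustering index) are what one would use to choose c on real ADCP data.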

  14. Demonstration of innovative techniques for work zone safety data analysis

    Science.gov (United States)

    2009-07-15

    Based upon the results of the simulator data analysis, additional future research can be : identified to validate the driving simulator in terms of similarities with Ohio work zones. For : instance, the speeds observed in the simulator were greater f...

  15. The use of nominal group technique in identifying community health priorities in Moshi rural district, northern Tanzania

    DEFF Research Database (Denmark)

    Makundi, E A; Manongi, R; Mushi, A K

    2005-01-01

    This article highlights issues pertaining to the identification of community health priorities in a resource-poor setting. Community involvement is discussed by drawing on the experience of involving lay people in identifying priorities in health care through the use of the Nominal Group Technique. The identified … larger samples. We found a high level of agreement across groups that malaria remains the leading health problem in Moshi rural district in Tanzania, both in the highland and lowland areas. Our findings also indicate that 'non-medical' issues including lack of water, hunger and poverty heralded priority … in the list, implying that priorities should not only be focused on diseases, but should also include health services and socio-cultural issues. Indeed, methods which are easily understood and applied, and thus able to give results close to those provided by burden-of-disease approaches, should be adopted …

  16. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  17. Messina: a novel analysis tool to identify biologically relevant molecules in disease.

    Directory of Open Access Journals (Sweden)

    Mark Pinese

    Full Text Available BACKGROUND: Morphologically similar cancers display heterogeneous patterns of molecular aberrations and follow substantially different clinical courses. This diversity has become the basis for the definition of molecular phenotypes, with significant implications for therapy. Microarray or proteomic expression profiling is conventionally employed to identify disease-associated genes, however, traditional approaches for the analysis of profiling experiments may miss molecular aberrations which define biologically relevant subtypes. METHODOLOGY/PRINCIPAL FINDINGS: Here we present Messina, a method that can identify those genes that only sometimes show aberrant expression in cancer. We demonstrate with simulated data that Messina is highly sensitive and specific when used to identify genes which are aberrantly expressed in only a proportion of cancers, and compare Messina to contemporary analysis techniques. We illustrate Messina by using it to detect the aberrant expression of a gene that may play an important role in pancreatic cancer. CONCLUSIONS/SIGNIFICANCE: Messina allows the detection of genes with profiles typical of markers of molecular subtype, and complements existing methods to assist the identification of such markers. Messina is applicable to any global expression profiling data, and to allow its easy application has been packaged into a freely-available stand-alone software package.

  18. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conferences on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) are registered in the national strategy of opening the universities and national research centres at the local, national and international levels. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  19. Meconium microbiome analysis identifies bacteria correlated with premature birth.

    Directory of Open Access Journals (Sweden)

    Alexandria N Ardissone

    Full Text Available Preterm birth is the second leading cause of death in children under the age of five years worldwide, but the etiology of many cases remains enigmatic. The dogma that the fetus resides in a sterile environment is being challenged by recent findings and the question has arisen whether microbes that colonize the fetus may be related to preterm birth. It has been posited that meconium reflects the in-utero microbial environment. In this study, correlations between fetal intestinal bacteria from meconium and gestational age were examined in order to suggest underlying mechanisms that may contribute to preterm birth. Meconium from 52 infants ranging in gestational age from 23 to 41 weeks was collected, the DNA extracted, and 16S rRNA analysis performed. Resulting taxa of microbes were correlated to clinical variables and also compared to previous studies of amniotic fluid and other human microbiome niches. Increased detection of bacterial 16S rRNA in meconium of infants of <33 weeks gestational age was observed. Approximately 61.1% of reads sequenced were classified to genera that have been reported in amniotic fluid. Gestational age had the largest influence on microbial community structure (R = 0.161; p = 0.029), while mode of delivery (C-section versus vaginal delivery had an effect as well (R = 0.100; p = 0.044). Enterobacter, Enterococcus, Lactobacillus, Photorhabdus, and Tannerella were negatively correlated with gestational age and have been reported to incite inflammatory responses, suggesting a causative role in premature birth. This provides the first evidence to support the hypothesis that the fetal intestinal microbiome derived from swallowed amniotic fluid may be involved in the inflammatory response that leads to premature birth.

  20. Techniques and methodologies to identify potential generated industries of NORM in Angola Republic and evaluate its impacts

    International Nuclear Information System (INIS)

    Diogo, José Manuel Sucumula

    2017-01-01

    Numerous steps have been taken worldwide to identify and quantify the radiological risks associated with the mining of ores containing Naturally Occurring Radioactive Material (NORM). Such mining often results in unnecessary exposures to individuals and high environmental damage, with devastating consequences for the health of workers and damage to the economy of many countries, due to absent or inadequate regulations. For these and other reasons, the objective of this work was to identify the industries with the potential to generate NORM in the Republic of Angola and to estimate their radiological environmental impacts. To achieve this objective, we studied the theoretical aspects and identified the main industrial activities internationally recognized as NORM generators. The Brazilian experience in the regulatory field was taken as a reference for the criteria used to classify industries that generate NORM, the mining methods and their radiological environmental impacts, as well as the main techniques applied to evaluate radionuclide concentrations in a specific environmental matrix and/or a NORM sample. This approach allowed the elaboration of a NORM map for the main provinces of Angola, the establishment of evaluation criteria for implementing a Radiation Protection Plan in the extractive industry, the establishment of measures to control ionizing radiation in mining, and the identification and quantification of radionuclides present in oil sludge samples. However, in order to assess adequately the radiological environmental impact of the NORM industry, it is not enough to identify it; it is important to know the origin, quantify the radioactive material released as liquid and gaseous effluents, identify the main routes of exposure, and examine how this material spreads into the environment until it reaches man. (author)

  1. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  2. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)


    Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of geomagnetic field ..... at best an approximation of the real situation but still it may contain a surprising amount of useful .... (oscillations) is a function of latitude and local time. Close to the dip equator just south of Trivan-.
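
    As a hedged illustration of the technique named in the title, classical (Torgerson) multidimensional scaling can be written in a few lines of NumPy: double-center the squared distances and take the leading eigenvectors of the resulting Gram matrix. The collinear test points are invented for the sketch, not taken from the paper.

    ```python
    import numpy as np

    def classical_mds(D, k=2):
        """Embed n points in k dimensions from an n x n matrix of pairwise distances D."""
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
        B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
        vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
        idx = np.argsort(vals)[::-1][:k]         # take the k largest
        L = np.sqrt(np.maximum(vals[idx], 0.0))
        return vecs[:, idx] * L                  # n x k coordinates

    # Pairwise distances between four points on a line at 0, 1, 2, 4
    x = np.array([0.0, 1.0, 2.0, 4.0])
    D = np.abs(x[:, None] - x[None, :])
    X = classical_mds(D, k=1)

    # The 1-D embedding reproduces the original pairwise distances exactly
    D_rec = np.abs(X[:, 0][:, None] - X[:, 0][None, :])
    print(np.allclose(D, D_rec, atol=1e-8))
    ```

    For geomagnetic data, D would instead hold dissimilarities between station records, and the low-dimensional coordinates reveal latitudinal structure.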

  3. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
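
    For two predictors, commonality analysis partitions the full-model R² into each predictor's unique contribution and their shared (common) contribution by comparing R² across all sub-models. The sketch below uses simulated data with hypothetical coefficients to show the arithmetic.

    ```python
    import numpy as np

    # Simulated data: two correlated predictors, both contributing to y
    rng = np.random.default_rng(0)
    n = 500
    x1 = rng.normal(size=n)
    x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)   # correlated with x1
    y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=n)

    def r2(X, y):
        """R-squared of an OLS fit with intercept."""
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    r2_1 = r2(x1[:, None], y)                   # model with x1 only
    r2_2 = r2(x2[:, None], y)                   # model with x2 only
    r2_12 = r2(np.column_stack([x1, x2]), y)    # full model

    unique1 = r2_12 - r2_2        # variance explained only by x1
    unique2 = r2_12 - r2_1        # variance explained only by x2
    common = r2_1 + r2_2 - r2_12  # variance shared by x1 and x2

    print(round(unique1, 3), round(unique2, 3), round(common, 3))
    ```

    Note that the three commonality coefficients sum to the full-model R², which is the decomposition beta weights alone cannot provide when predictors are correlated.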

  4. Study and analysis of wavelet based image compression techniques ...

    African Journals Online (AJOL)

    This paper presented comprehensive study with performance analysis of very recent Wavelet transform based image compression techniques. Image compression is one of the necessities for such communication. The goals of image compression are to minimize the storage requirement and communication bandwidth.
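
    The idea behind wavelet-based compression can be sketched with a one-level 2-D Haar transform: transform, discard small detail coefficients, reconstruct from the rest. A real codec would use deeper decompositions (e.g. CDF 9/7 wavelets) plus quantisation and entropy coding; the image and threshold below are invented.

    ```python
    import numpy as np

    def haar2d(img):
        """One-level 2-D Haar transform (rows then columns)."""
        a = (img[:, 0::2] + img[:, 1::2]) / 2.0    # row averages
        d = (img[:, 0::2] - img[:, 1::2]) / 2.0    # row details
        ll = (a[0::2] + a[1::2]) / 2.0
        lh = (a[0::2] - a[1::2]) / 2.0
        hl = (d[0::2] + d[1::2]) / 2.0
        hh = (d[0::2] - d[1::2]) / 2.0
        return ll, lh, hl, hh

    def ihaar2d(ll, lh, hl, hh):
        """Exact inverse of haar2d."""
        h, w = ll.shape
        a = np.empty((2 * h, w)); d = np.empty((2 * h, w))
        a[0::2], a[1::2] = ll + lh, ll - lh
        d[0::2], d[1::2] = hl + hh, hl - hh
        img = np.empty((2 * h, 2 * w))
        img[:, 0::2], img[:, 1::2] = a + d, a - d
        return img

    rng = np.random.default_rng(9)
    img = np.tile(np.linspace(0, 255, 16), (16, 1)) + rng.normal(0, 1, (16, 16))
    ll, lh, hl, hh = haar2d(img)

    # "Compress": zero out detail coefficients below a threshold
    kept = [np.where(np.abs(c) >= 2.0, c, 0.0) for c in (lh, hl, hh)]
    rec = ihaar2d(ll, *kept)
    rmse = np.sqrt(np.mean((img - rec) ** 2))
    print(rmse)
    ```

    Most of the discarded coefficients carry only noise, so the reconstruction error stays small while many coefficients need not be stored.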

  5. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis by automation and easy data handling, in addition to providing the accuracy required by quality control and research applications

  6. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Counsil. The planned...
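
    The random decrement technique averages segments of a measured response that start whenever the signal crosses a trigger level; the random excitation averages out, leaving an estimate of the free decay. The sketch below applies a level-crossing trigger to a simulated single-degree-of-freedom response; the sampling rate, damping, and trigger choice are all invented for the illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    fs = 100.0                       # sampling rate [Hz], assumed
    t = np.arange(0, 200, 1 / fs)
    fn, zeta = 2.0, 0.02             # natural frequency [Hz] and damping ratio
    wn, dt = 2 * np.pi * fn, 1 / fs

    # Crude Euler simulation of a damped oscillator driven by white noise
    x = np.zeros(t.size)
    v = 0.0
    for i in range(1, t.size):
        a = -2 * zeta * wn * v - wn**2 * x[i - 1] + 50.0 * rng.normal()
        v += a * dt
        x[i] = x[i - 1] + v * dt

    trigger = x.std()                # level-crossing trigger at one sigma
    seg_len = int(2 * fs)            # 2-second decrement signature
    cross = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
    starts = cross[cross + seg_len < x.size]

    # Averaging the triggered segments yields the random decrement signature,
    # which resembles the free decay from the trigger level
    segments = np.stack([x[s:s + seg_len] for s in starts])
    signature = segments.mean(axis=0)
    print(len(starts), signature[0], trigger)
    ```

    Modal frequency and damping can then be fitted to the signature as if it were a measured free-decay record.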

  7. Metric Distance Ranking Technique for Fuzzy Critical Path Analysis ...

    African Journals Online (AJOL)

    In this paper, fuzzy critical path analysis of a project network is carried out. Metric distance ranking technique is used to order fuzzy numbers during the forward and backward pass computations to obtain the earliest start, earliest finish, latest start and latest finish times of the project's activities. A numerical example is ...
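
    The forward pass with triangular fuzzy durations adds fuzzy numbers componentwise and needs a ranking rule to take the maximum over predecessors. The sketch below substitutes a simple centroid-based ranking for the paper's metric distance ranking (a stated simplification); the network and durations are invented.

    ```python
    # Triangular fuzzy number represented as a tuple (a, m, b)

    def f_add(x, y):
        """Fuzzy addition: componentwise for triangular numbers."""
        return tuple(xi + yi for xi, yi in zip(x, y))

    def centroid(x):
        """Defuzzified value used for ranking (simplification of metric distance ranking)."""
        return sum(x) / 3.0

    def f_max(x, y):
        """Keep whichever fuzzy number ranks higher."""
        return x if centroid(x) >= centroid(y) else y

    # Activities: name -> (predecessors, triangular duration); order respects precedence
    net = {
        "A": ([], (2, 3, 4)),
        "B": ([], (1, 2, 6)),
        "C": (["A", "B"], (3, 4, 5)),
    }

    # Forward pass: earliest finish = fuzzy max of predecessor finishes + duration
    earliest_finish = {}
    for act, (preds, dur) in net.items():
        start = (0, 0, 0)
        for p in preds:
            start = f_max(start, earliest_finish[p])
        earliest_finish[act] = f_add(start, dur)

    print(earliest_finish["C"])
    ```

    The backward pass mirrors this with fuzzy subtraction and minimum to obtain latest times and the fuzzy critical path.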

  8. Technologies and microstructures for separation techniques in chemical analysis

    NARCIS (Netherlands)

    Spiering, Vincent L.; Spiering, V.L.; Lammerink, Theodorus S.J.; Jansen, Henricus V.; van den Berg, Albert; Fluitman, J.H.J.

    1996-01-01

    The possibilities for microtechnology in chemical analysis and separation techniques are discussed. The combination of the materials and the dimensions of structures can limit the sample and waste volumes on the one hand, but also increases the performance of the chemical systems. Especially in high

  9. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  10. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, André de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Hélène H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  11. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis.

    Science.gov (United States)

    Gaudio, P.; Malizia, A.; Gelfusa, M.; Martinelli, E.; Di Natale, C.; Poggi, L. A.; Bellecci, C.

    2017-01-01

    Nowadays Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and diffuse vehicles of contamination in urban and industrial areas. The academic world, together with the industrial and military ones, is working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present the most common commercial sensors are based on “point detection” technology, but it is clear that such instruments cannot satisfy the needs of smart cities. The new challenge is developing stand-off systems to continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser system development and has built two demonstrators, based on DIAL (Differential Absorption of Light) technology, that are able to identify chemical agents in the atmosphere. In this work the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the resulting absorption database for further atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. Preliminary absorption coefficients of some chemical compounds are shown together with the PCA analysis.

  12. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis

    International Nuclear Information System (INIS)

    Gaudio, P; Malizia, A; Gelfusa, M; Poggi, L.A.; Martinelli, E.; Di Natale, C.; Bellecci, C.

    2017-01-01

    Nowadays Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and diffuse vehicles of contamination in urban and industrial areas. The academic world, together with the industrial and military ones, is working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present the most common commercial sensors are based on “point detection” technology, but it is clear that such instruments cannot satisfy the needs of smart cities. The new challenge is developing stand-off systems to continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser system development and has built two demonstrators, based on DIAL (Differential Absorption of Light) technology, that are able to identify chemical agents in the atmosphere. In this work the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the resulting absorption database for further atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. Preliminary absorption coefficients of some chemical compounds are shown together with the PCA analysis. (paper)
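
    The classification step can be illustrated with a PCA computed from the covariance eigendecomposition, applied to simulated absorption spectra of two hypothetical compound classes. The wavelength band, peak positions, and noise level below are invented and are not the authors' data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    wavelengths = np.linspace(9.2, 10.8, 40)   # hypothetical band [um]
    peak_a, peak_b = 9.6, 10.3                 # invented absorption peaks

    def spectrum(peak, n):
        """Gaussian absorption line plus measurement noise."""
        base = np.exp(-((wavelengths - peak) ** 2) / 0.02)
        return base + 0.05 * rng.normal(size=(n, wavelengths.size))

    X = np.vstack([spectrum(peak_a, 20), spectrum(peak_b, 20)])
    labels = np.array([0] * 20 + [1] * 20)

    # PCA via eigendecomposition of the covariance matrix
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]
    scores = Xc @ vecs[:, order[:2]]           # project onto first two PCs

    # The classes separate along PC1: a simple threshold on the score classifies them
    threshold = scores[:, 0].mean()
    pred = (scores[:, 0] > threshold).astype(int)
    acc = max((pred == labels).mean(), (pred != labels).mean())  # sign-invariant
    print(acc)
    ```

    In the real application the inputs would be measured absorption coefficients rather than simulated lines, but the projection and score-space separation work the same way.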

  13. Clinical education and training: Using the nominal group technique in research with radiographers to identify factors affecting quality and capacity

    International Nuclear Information System (INIS)

    Williams, P.L.; White, N.; Klem, R.; Wilson, S.E.; Bartholomew, P.

    2006-01-01

    There are a number of group-based research techniques available to determine the views or perceptions of individuals in relation to specific topics. This paper reports on one method, the nominal group technique (NGT) which was used to collect the views of important stakeholders on the factors affecting the quality of, and capacity to provide clinical education and training in diagnostic imaging and radiotherapy and oncology departments in the UK. Inclusion criteria were devised to recruit learners, educators, practitioners and service managers to the nominal groups. Eight regional groups comprising a total of 92 individuals were enrolled; the numbers in each group varied between 9 and 13. A total of 131 items (factors) were generated across the groups (mean = 16.4). Each group was then asked to select the top three factors from their original list. Consensus on the important factors amongst groups found that all eight groups agreed on one item: staff attitude, motivation and commitment to learners. The 131 items were organised into themes using content analysis. Five main categories and a number of subcategories emerged. The study concluded that the NGT provided data which were congruent with the issues faced by practitioners and learners in their daily work; this was of vital importance if the findings are to be regarded with credibility. Further advantages and limitations of the method are discussed, however it is argued that the NGT is a useful technique to gather relevant opinion; to select priorities and to reach consensus on a wide range of issues

  14. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
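
    The first technique listed, locally weighted regression (LOESS), can be sketched compactly: at each point, fit a weighted linear regression over a fraction of nearest neighbours using a tricube kernel. The test function and noise level are invented for the illustration.

    ```python
    import numpy as np

    def loess(x, y, frac=0.2):
        """Simple LOESS-style smoother: local linear fits with tricube weights."""
        n = len(x)
        k = max(2, int(frac * n))
        yhat = np.empty(n)
        for i in range(n):
            d = np.abs(x - x[i])
            idx = np.argsort(d)[:k]               # k nearest neighbours
            h = d[idx].max()
            w = (1 - (d[idx] / h) ** 3) ** 3      # tricube weights
            W = np.diag(w)
            A = np.column_stack([np.ones(k), x[idx]])
            beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y[idx])
            yhat[i] = beta[0] + beta[1] * x[i]
        return yhat

    rng = np.random.default_rng(3)
    x = np.sort(rng.uniform(0, 2 * np.pi, 200))
    y = np.sin(x) + 0.2 * rng.normal(size=200)    # noisy nonlinear relationship
    smooth = loess(x, y)

    rmse_raw = np.sqrt(np.mean((y - np.sin(x)) ** 2))
    rmse_smooth = np.sqrt(np.mean((smooth - np.sin(x)) ** 2))
    print(rmse_smooth < rmse_raw)
    ```

    In a sensitivity-analysis setting, the stepwise procedure described above would fit such smoothers of the model output against each sampled input in turn and rank inputs by the variance each smooth explains.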

  15. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
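
    The general idea of working directly with code results (rather than a fitted response surface) can be sketched generically: rerun the original model under a perturbed input distribution and compare the output distributions. This is a hedged illustration of the concept, not the SPARC-specific method; the `model` function and distributions are invented stand-ins.

    ```python
    import numpy as np

    def model(x1, x2):
        """Stand-in for an expensive code, e.g. an aerosol removal calculation."""
        return np.exp(0.5 * x1) * (1.0 + x2 ** 2)

    rng = np.random.default_rng(7)
    n = 20000

    # Base case: x1 ~ N(0, 1), x2 ~ N(0, 0.5)
    y_base = model(rng.normal(0, 1, n), rng.normal(0, 0.5, n))
    # Perturbed case: widen the distribution of x1 only
    y_pert = model(rng.normal(0, 1.5, n), rng.normal(0, 0.5, n))

    # Sensitivity read off directly as the shift in output quantiles
    q_base = np.percentile(y_base, [50, 95])
    q_pert = np.percentile(y_pert, [50, 95])
    print(q_base, q_pert)
    ```

    Because the comparison uses the actual code outputs, no regression-model approximation error enters the sensitivity estimate.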

  16. Using a behaviour change techniques taxonomy to identify active ingredients within trials of implementation interventions for diabetes care.

    Science.gov (United States)

    Presseau, Justin; Ivers, Noah M; Newham, James J; Knittle, Keegan; Danko, Kristin J; Grimshaw, Jeremy M

    2015-04-23

    Methodological guidelines for intervention reporting emphasise describing intervention content in detail. Despite this, systematic reviews of quality improvement (QI) implementation interventions continue to be limited by a lack of clarity and detail regarding the intervention content being evaluated. We aimed to apply the recently developed Behaviour Change Techniques Taxonomy version 1 (BCTTv1) to trials of implementation interventions for managing diabetes to assess the capacity and utility of this taxonomy for characterising active ingredients. Three psychologists independently coded a random sample of 23 trials of healthcare system, provider- and/or patient-focused implementation interventions from a systematic review that included 142 such studies. Intervention content was coded using the BCTTv1, which describes 93 behaviour change techniques (BCTs) grouped within 16 categories. We supplemented the generic coding instructions within the BCTTv1 with decision rules and examples from this literature. Less than a quarter of possible BCTs within the BCTTv1 were identified. For implementation interventions targeting providers, the most commonly identified BCTs included the following: adding objects to the environment, prompts/cues, instruction on how to perform the behaviour, credible source, goal setting (outcome), feedback on outcome of behaviour, and social support (practical). For implementation interventions also targeting patients, the most commonly identified BCTs included the following: prompts/cues, instruction on how to perform the behaviour, information about health consequences, restructuring the social environment, adding objects to the environment, social support (practical), and goal setting (behaviour). The BCTTv1 mapped well onto implementation interventions directly targeting clinicians and patients and could also be used to examine the impact of system-level interventions on clinician and patient behaviour. The BCTTv1 can be used to characterise

  17. FINE-GRAINED CELLULAR CONCRETE CREEP ANALYSIS TECHNIQUE WITH CONSIDERATION FOR CARBONATION

    Directory of Open Access Journals (Sweden)

    M. A. Gaziev

    2015-01-01

    Full Text Available The article considers the creep and creep deformation analysis technique in fine-grained cellular concrete with consideration for carbonation, together with the assurance requirements for repair properties and seismic stability. A procedure for determining the creep of fine-grained cellular concrete is proposed that accounts for its carbonation by atmospheric carbon dioxide. It has been found theoretically and experimentally that the proposed technique yields reproducible results and can be recommended for creep determination of fine-grained cellular concretes, including repairing ones, taking into account their carbonation.

  18. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for resolving the system of nonlinear equations encountered in the MENT has been developed and tested. The possibilities of the MENT are demonstrated on the example of doublet structure analysis of noisy experimental data. A comparison of the MENT results with results of the Fourier algorithm technique without regularization is presented. The tolerant noise level is equal to 30% for the MENT and only 0.1% for the Fourier algorithm

  19. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of elements present. The method is described, advantages and disadvantages listed, and a number of examples of its use given. Two other nuclear methods, particle induced x-ray emission and synchrotron produced x-ray fluorescence, are also briefly discussed
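
    The quantitative step can be made concrete with the comparator (relative) method commonly used in NAA, though the abstract does not say which standardisation this work uses: the element mass in the sample follows from the ratio of induced activities against a co-irradiated standard of known mass, measured under the same geometry and decay conditions. All numbers below are invented.

    ```python
    # Comparator (relative) method sketch with hypothetical numbers
    A_sample = 15400.0    # net peak counts from the sample
    A_standard = 20100.0  # net peak counts from the standard (same conditions)
    m_standard = 2.5e-6   # grams of the element in the standard

    # m_sample = m_standard * A_sample / A_standard
    m_sample = m_standard * A_sample / A_standard
    print(f"{m_sample:.3e} g")
    ```

    In practice the counts are corrected for decay between irradiation and measurement and for counting efficiency before taking the ratio.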

  20. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

    Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast though, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors ... separated and closely spaced modes. Finally, the results of the numerical study are presented, in which the error of the structural damping estimates obtained by each OMA technique is shown for a range of damping levels. From this, it is clear that there are notable differences in accuracy between ...

  1. Analysis of Intra-Urban Traffic Accidents Using Spatiotemporal Visualization Techniques

    OpenAIRE

    Soltani Ali; Askari Sajad

    2014-01-01

    Road traffic accidents (RTAs) rank in the top ten causes of the global burden of disease and injury, and Iran has one of the highest road traffic mortality rates in the world. This paper presents a spatiotemporal analysis of intra-urban traffic accident data in metropolitan Shiraz, Iran during the period 2011-2012, attempting to identify accident-prone zones and sensitive hours using Geographic Information Systems (GIS)-based spatio-temporal visualization techniques. The analysis aimed ...

  2. What's down below? Current and potential future applications of geophysical techniques to identify subsurface permafrost conditions (Invited)

    Science.gov (United States)

    Douglas, T. A.; Bjella, K.; Campbell, S. W.

    2013-12-01

    For infrastructure design, operations, and maintenance requirements in the North the ability to accurately and efficiently detect the presence (or absence) of ground ice in permafrost terrains is a serious challenge. Ground ice features including ice wedges, thermokarst cave-ice, and segregation ice are present in a variety of spatial scales and patterns. Currently, most engineering applications use borehole logging and sampling to extrapolate conditions at the point scale. However, there is high risk of over or under estimating the presence of frozen or unfrozen features when relying on borehole information alone. In addition, boreholes are costly, especially for planning linear structures like roads or runways. Predicted climate warming will provide further challenges for infrastructure development and transportation operations where permafrost degradation occurs. Accurately identifying the subsurface character in permafrost terrains will allow engineers and planners to cost effectively create novel infrastructure designs to withstand the changing environment. There is thus a great need for a low cost rapidly deployable, spatially extensive means of 'measuring' subsurface conditions. Geophysical measurements, both terrestrial and airborne, have strong potential to revolutionize our way of mapping subsurface conditions. Many studies in continuous and discontinuous permafrost have used geophysical measurements to identify discrete features and repeatable patterns in the subsurface. The most common measurements include galvanic and capacitive coupled resistivity, ground penetrating radar, and multi frequency electromagnetic induction techniques. Each of these measurements has strengths, weaknesses, and limitations. By combining horizontal geophysical measurements, downhole geophysics, multispectral remote sensing images, LiDAR measurements, and soil and vegetation mapping we can start to assemble a holistic view of how surface conditions and standoff measurements

  3. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  4. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  5. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainty in the analysis of environmental samples, and it is usually laborious, costly, time-consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review presents an overview of the fundamentals, procedures, and applications of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  6. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of the models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
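    As a concrete illustration of the condensation methods this book surveys, the sketch below applies static (Guyan) condensation to a toy 4-DOF spring-mass chain; the matrices and master/slave partition are invented for the example, not taken from the book.

```python
import numpy as np

# Hypothetical 4-DOF chain: stiffness K, unit masses; condense out DOFs 1 and 2
K = np.array([[ 2.0, -1.0,  0.0,  0.0],
              [-1.0,  2.0, -1.0,  0.0],
              [ 0.0, -1.0,  2.0, -1.0],
              [ 0.0,  0.0, -1.0,  1.0]])
M = np.eye(4)
masters, slaves = [0, 3], [1, 2]

# Guyan transformation T maps master DOFs to the full DOF set
T = np.zeros((4, 2))
T[masters] = np.eye(2)
T[slaves] = -np.linalg.solve(K[np.ix_(slaves, slaves)], K[np.ix_(slaves, masters)])

K_red = T.T @ K @ T   # exact for static problems (Schur complement of K)
M_red = T.T @ M @ T   # approximate for dynamics
print(K_red)
```

The reduced stiffness is exact for statics; the reduced mass is the approximation that the iterative and dynamic variants compared in the book improve upon.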

  7. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important, and difficult, to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel diffraction are micro-diffraction techniques used for crystallographic analysis, including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, thereby allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which destroys electrical continuity. Being able to determine the residual stress helps industry predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily accessible for analysis. Kossel diffraction produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  8. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
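    The first three of the pattern-detection procedures listed in this abstract can be sketched with standard statistics routines; the input/output sample below is hypothetical, not data from the cited two-phase flow model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical Monte Carlo sample: input x, output y with a monotonic trend
x = rng.uniform(0.0, 1.0, 200)
y = np.sqrt(x) + rng.normal(0.0, 0.1, 200)

# (i) linear relationship: correlation coefficient
r, _ = stats.pearsonr(x, y)

# (ii) monotonic relationship: rank correlation coefficient
rho, _ = stats.spearmanr(x, y)

# (iii) trend in central tendency: Kruskal-Wallis statistic across bins of x
bins = np.array_split(np.argsort(x), 5)
h, p = stats.kruskal(*[y[b] for b in bins])

print(f"Pearson r={r:.2f}, Spearman rho={rho:.2f}, Kruskal-Wallis p={p:.2g}")
```

A square-root trend is monotonic but not linear, so the rank correlation and the Kruskal-Wallis test flag it at least as strongly as the Pearson coefficient, which is the point of applying the procedures in sequence.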

  9. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  10. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques generally focus on two things: obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group the methods in two: those that assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those that assume dependence between the variables and thus use an estimate of the within-class covariance matrix which also estimates the correlations between variables.
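    A minimal sketch of the "independence" group of methods described above, i.e. a discriminant score built from a diagonal within-class covariance estimate; the two-class data and separation are hypothetical, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical two-class data: p = 20 variables, 40 samples per class
p = 20
X0 = rng.normal(0.0, 1.0, (40, p))   # class 0
X1 = rng.normal(0.7, 1.0, (40, p))   # class 1, shifted mean

# Class means and a diagonal within-class covariance estimate
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
var = np.concatenate([X0 - mu0, X1 - mu1]).var(axis=0)

def diag_lda_score(x):
    # Positive score -> closer to class 1 under the independence assumption
    return (((x - mu0) ** 2 - (x - mu1) ** 2) / var).sum(axis=1)

X = np.vstack([X0, X1])
truth = np.r_[np.zeros(40), np.ones(40)].astype(bool)
acc = np.mean((diag_lda_score(X) > 0) == truth)
print(f"training accuracy = {acc:.2f}")
```

The "dependence" group would replace `var` with a regularized full covariance estimate and solve a linear system instead of dividing elementwise.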

  11. Application of cluster analysis to geochemical compositional data for identifying ore-related geochemical anomalies

    Science.gov (United States)

    Zhou, Shuguang; Zhou, Kefa; Wang, Jinlin; Yang, Genfang; Wang, Shanshan

    2017-12-01

    Cluster analysis is a well-known technique that is used to analyze various types of data. In this study, cluster analysis is applied to geochemical data that describe 1444 stream sediment samples collected in northwestern Xinjiang with a sample spacing of approximately 2 km. Three algorithms (the hierarchical, k-means, and fuzzy c-means algorithms) and six data transformation methods (the z-score standardization, ZST; the logarithmic transformation, LT; the additive log-ratio transformation, ALT; the centered log-ratio transformation, CLT; the isometric log-ratio transformation, ILT; and no transformation, NT) are compared in terms of their effects on the cluster analysis of the geochemical compositional data. The study shows that, on the one hand, the ZST does not affect the results of column- or variable-based (R-type) cluster analysis, whereas the other methods, including the LT, the ALT, and the CLT, have substantial effects on the results. On the other hand, the results of the row- or observation-based (Q-type) cluster analysis obtained from the geochemical data after applying NT and the ZST are relatively poor. However, we derive some improved results from the geochemical data after applying the CLT, the ILT, the LT, and the ALT. Moreover, the k-means and fuzzy c-means clustering algorithms are more reliable than the hierarchical algorithm when they are used to cluster the geochemical data. We apply cluster analysis to the geochemical data to explore for Au deposits within the study area, and we obtain a good correlation between the results retrieved by combining the CLT or the ILT with the k-means or fuzzy c-means algorithms and the potential zones of Au mineralization. Therefore, we suggest that the combination of the CLT or the ILT with the k-means or fuzzy c-means algorithms is an effective tool to identify potential zones of mineralization from geochemical data.
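    The CLT-plus-k-means combination recommended above can be sketched as follows; the compositional data here are simulated, not the Xinjiang stream-sediment samples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-part compositional data for 100 samples (rows sum to 1)
raw = rng.dirichlet([2.0, 5.0, 3.0], size=100)

def clr(comp):
    """Centered log-ratio transform: log of each part over its geometric mean."""
    g = np.exp(np.log(comp).mean(axis=1, keepdims=True))
    return np.log(comp / g)

z = clr(raw)

def kmeans(x, k, iters=50, seed=0):
    """Minimal Lloyd's-algorithm k-means."""
    r = np.random.default_rng(seed)
    centers = x[r.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = ((x[:, None, :] - centers) ** 2).sum(axis=2).argmin(axis=1)
        centers = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

labels = kmeans(z, k=2)
print(np.bincount(labels))
```

The CLR step matters because raw compositions live on a simplex, where Euclidean distances (and hence k-means) are not meaningful.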

  12. Contributions to flow techniques and mass spectrometry in water analysis

    OpenAIRE

    Santos, Inês Carvalho dos

    2015-01-01

    In this thesis, the use of different flow systems was exploited along with the use of different detection techniques for the development of simple, robust, and automated analytical procedures. With the purpose to perform in-line sample handling and pretreatment operations, different separation units were used. The main target for these methods was waters samples. The first procedure was based on a sequential injection analysis (SIA) system for carbon speciation (alkalinity, dis...

  13. Analysis of Indian silver coins by EDXRF technique

    International Nuclear Information System (INIS)

    Tripathy, B.B.; Rautray, T.R.; Das, Satya R.; Das, Manas R.; Vijayan, V.

    2009-01-01

    Some Indian silver coins from the period of British rule were analysed by the energy-dispersive X-ray fluorescence (EDXRF) technique. Eight elements, namely Cr, Fe, Ni, Cu, Zn, As, Ag and Pb, were estimated in this study, which also seems to indicate the fragmentation as well as the impoverishment of power of the regimes that produced the studied coins. While Cu and Ag were present as major elements, the other elements were found in minor concentrations. (author)

  14. Analysis of Self-Excited Combustion Instabilities Using Decomposition Techniques

    Science.gov (United States)

    2016-07-05

    ...and simulation (see Fig. 2). Fig. 1: LDI computational domain used for decomposition analysis. ...combustors. Since each proper orthogonal decomposition mode comprises multiple frequencies, specific modes of the pressure and heat release are not related; ...qualitative and less efficient for identifying physical mechanisms. On the other hand, dynamic mode decomposition analysis generates a global frequency

  15. Identifying and Prioritizing Effective Factors on Classifying A Private Bank Customers by Delphi Technique and Analytical Hierarchy Process (AHP)

    Directory of Open Access Journals (Sweden)

    S. Khayatmoghadam

    2013-05-01

    Full Text Available Banking industry development and the presence of different financial institutions have intensified competition for customers and their capital: there are about 28 banks and many credit and financial institutions, of which 6 banks are public and 22 are private. Among them, public banks are better positioned than private banks owing to governmental relations and support, geographical expansion, and a longer history. Lacking these advantages, private banks try to attract customers by applying scientific methods. Therefore, in this study we review banking customers from a different viewpoint. We initially obtained the desired indicators, from the banking viewpoint, for the two customer categories of resources and uses, using experts and the Delphi technique. On this basis, indicators such as account turnover, average account balance, absence of returned cheques, etc., were determined for the resources section, and the amount of facilities received, the amount of warranties provided, etc., for the uses section. Then, using the Analytic Hierarchy Process (AHP) method and expert opinions through the Expert Choice 11 software, the priority of these criteria was determined and the weight of each index computed. It should be noted that the bank experts associated with this study included both line (branch) and staff (headquarters) personnel. The results obtained can be used as input for customer grouping in line with the implementation of CRM techniques.

  16. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described. The vision instruments for food analysis, as well as the datasets of the food items used in this thesis, are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm ... and simplification of the design of practical vision systems.

  17. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
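    As a toy example of the propagation step in such an uncertainty analysis (not taken from the presentation), the sketch below combines assumed instrument uncertainties for a PV power measurement P = V·I in quadrature and forms a coverage interval:

```python
import math

# Hypothetical PV power measurement P = V * I with assumed instrument uncertainties
V, I = 35.0, 8.0        # measured voltage (V) and current (A)
u_V, u_I = 0.2, 0.05    # standard uncertainties (assumed values)

P = V * I

# Combine relative uncertainties in quadrature (uncorrelated inputs)
u_P = P * math.sqrt((u_V / V) ** 2 + (u_I / I) ** 2)

# Interval with coverage factor k = 2 (~95% for a normal distribution)
lo, hi = P - 2.0 * u_P, P + 2.0 * u_P
print(f"P = {P:.0f} W, u(P) = {u_P:.2f} W, interval [{lo:.1f}, {hi:.1f}] W")
```

This is the "interval about a measured value within which we believe the true value will lie" that the abstract refers to.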

  18. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    Science.gov (United States)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
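    The Modal Assurance Criterion comparison described above can be sketched as follows; the mode-shape matrices are randomly generated stand-ins for real Finite Element Model output.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical mode shapes (12 DOFs x 3 modes) from two models: model B has
# the same modes as model A, but reordered and slightly perturbed
phi_a = rng.normal(size=(12, 3))
phi_b = phi_a[:, [1, 0, 2]] + 0.05 * rng.normal(size=(12, 3))

def mac(pa, pb):
    """Modal Assurance Criterion matrix; entries lie in [0, 1]."""
    num = (pa.T @ pb) ** 2
    den = np.outer((pa * pa).sum(axis=0), (pb * pb).sum(axis=0))
    return num / den

m = mac(phi_a, phi_b)

# Track each mode of model A to its best-matching mode of model B
pairing = m.argmax(axis=1)
print(pairing)
```

The argmax row-by-row pairing is the simplest form of mode tracking; the technique in this record augments it with cross-orthogonality and strain/kinetic energy indicators for modes that MAC alone cannot separate.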

  19. Feature selection and classification for microarray data analysis: Evolutionary methods for identifying predictive genes

    Directory of Open Access Journals (Sweden)

    Aitken Stuart

    2005-06-01

    Full Text Available Abstract Background In the clinical context, samples assayed by microarray are often classified by cell line or tumour type and it is of interest to discover a set of genes that can be used as class predictors. The leukemia dataset of Golub et al. 1 and the NCI60 dataset of Ross et al. 2 present multiclass classification problems where three tumour types and nine cell lines respectively must be identified. We apply an evolutionary algorithm to identify the near-optimal set of predictive genes that classify the data. We also examine the initial gene selection step whereby the most informative genes are selected from the genes assayed. Results In the absence of feature selection, classification accuracy on the training data is typically good, but not replicated on the testing data. Gene selection using the RankGene software 3 is shown to significantly improve performance on the testing data. Further, we show that the choice of feature selection criteria can have a significant effect on accuracy. The evolutionary algorithm is shown to perform stably across the space of possible parameter settings – indicating the robustness of the approach. We assess performance using a low variance estimation technique, and present an analysis of the genes most often selected as predictors. Conclusion The computational methods we have developed perform robustly and accurately, and yield results in accord with clinical knowledge: A Z-score analysis of the genes most frequently selected identifies genes known to discriminate AML and Pre-T ALL leukemia. This study also confirms that significantly different sets of genes are found to be most discriminatory as the sample classes are refined.

  20. Identify the Effective Wells in Determination of Groundwater Depth in Urmia Plain Using Principle Component Analysis

    Directory of Open Access Journals (Sweden)

    Sahar Babaei Hessar

    2017-06-01

    Full Text Available Introduction: Groundwater is the most important source of sanitary water for potable and household consumption, so continuous monitoring of the groundwater level plays an important role in water resource management. Because of the large amount of information involved, evaluation of the water table is a costly and time-consuming process, and in many studies much of the data is neither suitable nor useful and must be set aside. The PCA technique is a mathematical method that retains the data with the highest share of the explained variance while recognizing less important data, and reduces the original variables to a few components. In this technique, variation factors called principal components are identified by considering the data structure; the components that create the greatest variance are identified, and the variables with the highest correlation coefficients with the principal components are extracted. Materials and Methods: The study region has an area of approximately 962 km2 and is located between 37º 21´ N to 37º 49´ N and 44º 57´ E to 45º 16´ E in the West Azerbaijan province of Iran. The area lies along the mountainous north-west of the country, ends at the plain of Urmia Lake, and has vast groundwater resources. Recently, however, the water table has dropped considerably because of over-exploitation driven by urbanization and increased agricultural and horticultural land use. In the present study, annual water-table datasets for 51 wells, monitored by the Ministry of Energy during the statistical period 2002-2011, were used for the analysis. The PCA technique was used to identify the wells that are effective in determining the groundwater level. To compute the relative importance of each well, the 10 nearest neighbours of each well were identified. The number of wells (p as a general rule must be less or equal to the maximum number of
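    The PCA step described in this abstract can be sketched as follows; the well records are simulated stand-ins for the Urmia plain data, with the first three wells sharing a common declining trend.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical annual water-table records: 10 years x 6 wells; wells 0-2 share
# a declining trend, wells 3-5 are uncorrelated noise
trend = np.linspace(0.0, -6.0, 10)
cols = [trend + 0.1 * rng.normal(size=10) for _ in range(3)]
cols += [rng.normal(size=10) for _ in range(3)]
data = np.column_stack(cols)

# PCA via SVD of the centered data matrix
x = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(x, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()

# Wells loading most heavily on PC1 are the "effective" ones
ranked = np.argsort(-np.abs(vt[0]))
print(f"PC1 explains {explained[0]:.0%}; top wells: {ranked[:3]}")
```

The wells with the largest absolute loadings on the dominant component are the ones whose records carry most of the shared water-table signal, which is the selection criterion the study applies.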

  1. Application of Microfluidic Techniques to Pyrochemical Salt Sampling and Analysis

    International Nuclear Information System (INIS)

    Pereira, C.; Launiere, C.; Smith, N.

    2015-01-01

    Microfluidic techniques enable production of micro-samples of molten salt for analysis by at-line and off-line sensors and detectors. These sampling systems are intended for implementation in an electrochemical used-fuel treatment facility as part of the material balance and control system. Microfluidics may reduce the random statistical error associated with sampling inhomogeneity because a large number of uniform sub-microlitre droplets may be generated and successively analyzed. The approach combines two immiscible fluids in a microchannel under laminar flow conditions to generate slug flows. Because the slug flow regime is characterized by regularly sized and spaced droplets, it is commonly used in low-volume/high-throughput assays of aqueous and organic phases. This scheme is now being applied to high-temperature molten salts in combination with a second fluid that is stable at elevated temperatures. The microchip systems are being tested to determine the channel geometries and absolute and relative phase flow rates required to achieve stable slug flow. Because imaging is difficult at the 500 °C process temperatures, the fluorescence of salt ions under ultraviolet illumination is used to discern flow regimes. As molten chloride melts are optically transparent, UV-visible light spectroscopy is also being explored as a spectroscopic technique for integration with at-line microchannel systems to overcome some of the current challenges to in situ analysis. A second technique that is amenable to droplet analysis is laser-induced breakdown spectroscopy (LIBS). A pneumatic droplet generator is being interfaced with a LIBS system for analysis of molten salts at near-process temperatures. Tests of the pneumatic generator are being run using water and molten salts, in tandem with off-line analysis of the salt droplets with a LIBS spectrometer. (author)

  2. Application of unsupervised analysis techniques to lung cancer patient data.

    Science.gov (United States)

    Lynch, Chip M; van Berkel, Victor H; Frieboes, Hermann B

    2017-01-01

    This study applies unsupervised machine learning techniques for classification and clustering to a collection of descriptive variables from 10,442 lung cancer patient records in the Surveillance, Epidemiology, and End Results (SEER) program database. The goal is to automatically classify lung cancer patients into groups based on clinically measurable disease-specific variables in order to estimate survival. Variables selected as inputs for machine learning include Number of Primaries, Age, Grade, Tumor Size, Stage, and TNM, which are numeric or can readily be converted to numeric type. Minimal up-front processing of the data enables exploring the out-of-the-box capabilities of established unsupervised learning techniques, with little human intervention through the entire process. The output of the techniques is used to predict survival time, with the efficacy of the prediction representing a proxy for the usefulness of the classification. A basic single variable linear regression against each unsupervised output is applied, and the associated Root Mean Squared Error (RMSE) value is calculated as a metric to compare between the outputs. The results show that self-ordering maps exhibit the best performance, while k-Means performs the best of the simpler classification techniques. Predicting against the full data set, it is found that their respective RMSE values (15.591 for self-ordering maps and 16.193 for k-Means) are comparable to supervised regression techniques, such as Gradient Boosting Machine (RMSE of 15.048). We conclude that unsupervised data analysis techniques may be of use to classify patients by defining the classes as effective proxies for survival prediction.
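    The evaluation step described above, a single-variable linear regression of survival on an unsupervised output scored by RMSE, can be sketched as follows; the cluster labels and survival times are simulated, not SEER data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical unsupervised output (e.g. k-means labels, k=3) and survival times
labels = rng.integers(0, 3, 500)
survival = np.array([18.0, 34.0, 55.0])[labels] + rng.normal(0.0, 5.0, 500)

# Single-variable linear regression of survival on the cluster label
A = np.column_stack([labels.astype(float), np.ones(len(labels))])
coef, *_ = np.linalg.lstsq(A, survival, rcond=None)
pred = A @ coef

# RMSE of the fit is the metric used to compare unsupervised outputs
rmse = np.sqrt(np.mean((survival - pred) ** 2))
print(f"RMSE = {rmse:.2f} months")
```

A lower RMSE means the clustering separates patients into groups that are better proxies for survival, which is how the study ranks the unsupervised techniques against each other.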

  3. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples

  4. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified through searches of various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  5. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...

  6. Dependency Coefficient in Computerized GALS Examination Utilizing Motion Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Hamed Shahidian

    2013-04-01

    Full Text Available Objectives: The GALS (Gait, Arms, Legs and Spine) examination is a compact version of standard procedures used by rheumatologists to determine musculoskeletal disorders in patients. Computerization of such a clinical procedure is necessary to ensure an objective evaluation. This article presents the first steps in such an approach by outlining a procedure that uses motion analysis techniques as a new method for the GALS examination. Methods: A 3D motion pattern was obtained from two subject groups using a six-camera motion analysis system. The range of motion associated with GALS was subsequently determined using a MATLAB program. Results: The range of motion (ROM) of the two subject groups was determined, the validity of the approach was outlined, and the symmetry of movement on both sides of the body was quantified through the introduction of a dependency coefficient. Discussion: Analysis of the GALS examination and diagnosis of musculoskeletal problems can be addressed more accurately and reliably by adopting motion analysis techniques. Furthermore, the introduction of a dependency coefficient offers a wide spectrum of prospective applications in neuromuscular studies.

  7. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures that can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  8. Analysis techniques for two-dimensional infrared data

    Science.gov (United States)

    Winter, E. M.; Smith, M. C.

    1978-01-01

    In order to evaluate infrared detection and remote sensing systems, it is necessary to know the characteristics of the observational environment. For both scanning and staring sensors, the spatial characteristics of the background may limit the performance of a remote sensor more than system noise does. This limitation is the so-called spatial clutter limit and may be important for the design of many earth-application and surveillance sensors. The data used in this study are two-dimensional radiometric data obtained as part of the continuing NASA remote sensing programs. Typical data sources are the Landsat multi-spectral scanner (1.1 micrometers), the airborne heat capacity mapping radiometer (10.5 - 12.5 micrometers), and various infrared data sets acquired by low-altitude aircraft. Techniques used for the statistical analysis of one-dimensional infrared data, such as power spectral density (PSD) and exceedance statistics, are investigated for two-dimensional applicability. Also treated are two-dimensional extensions of these techniques (2D PSD, etc.) and special techniques developed for the analysis of 2D data.
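
    A common estimate of the two-dimensional power spectral density mentioned above can be sketched with a 2-D FFT; the synthetic single-frequency scene below is illustrative, not actual radiometric data:

    ```python
    import numpy as np

    def psd_2d(image):
        """2-D power spectral density estimate: squared magnitude of the
        2-D FFT, normalized by the sample count, with the zero-frequency
        bin shifted to the array centre."""
        f = np.fft.fftshift(np.fft.fft2(image))
        return (np.abs(f) ** 2) / image.size

    # Synthetic clutter scene: one horizontal spatial frequency (4 cycles).
    n = 64
    scene = np.tile(np.cos(2 * np.pi * 4 * np.arange(n) / n), (n, 1))
    p = psd_2d(scene)
    # The PSD is concentrated at the +/-4 cycle bins on the kx axis.
    ```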

  9. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, sufficient quantities of the target individual compounds (>50 μgC) must be separated and recovered. Yields of the target compounds were high for n-alkanes from C14 to C30 and approximately 80% for higher-molecular-weight compounds beyond C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, like compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in marine systems. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, demonstrating a useful chronology tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where sufficient amounts of planktonic foraminifera for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  10. Development of a computerized method for identifying the posteroanterior and lateral views of chest radiographs by use of a template matching technique

    International Nuclear Information System (INIS)

    Arimura, Hidetaka; Katsuragawa, Shigehiko; Li Qiang; Ishida, Takayuki; Doi, Kunio

    2002-01-01

    In picture archiving and communication systems (PACS) or digital archiving systems, the information on the posteroanterior (PA) and lateral views of chest radiographs is often not recorded or is recorded incorrectly. However, correct and automatic identification of the PA or lateral view is necessary for quantitative analysis of chest images in computer-aided diagnosis. Our purpose in this study was to develop a computerized method for correctly identifying either PA or lateral views of chest radiographs. Our approach is to examine the similarity of a chest image with templates that represent the average chest images of the PA or lateral view for various types of patients. By use of a template matching technique with nine template images for patients of different sizes, applied in two steps, correlation values were obtained for determining whether a chest image is a PA or a lateral view. The templates for PA and lateral views were prepared from 447 PA and 200 lateral chest images. For a validation test, this scheme was applied to 1,000 test images consisting of 500 PA and 500 lateral chest radiographs, which were different from the training cases. In the first step, 924 (92.4%) of the cases were correctly identified by comparison of the correlation values obtained with the three templates for medium-size patients. In the second step, the correlation values with the six templates for small and large patients were compared, and all of the remaining unidentifiable cases were identified correctly.
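
    A stripped-down sketch of the correlation-based template matching described above (one pass rather than two, with toy intensity ramps standing in for the average-chest templates, which are assumptions of this example):

    ```python
    import numpy as np

    def correlation(image, template):
        """Correlation coefficient between an image and a same-sized
        template, computed after standardizing both to zero mean and
        unit standard deviation."""
        a = (image - image.mean()) / image.std()
        b = (template - template.mean()) / template.std()
        return float((a * b).mean())

    def classify_view(image, pa_templates, lateral_templates):
        """Assign the view whose best-matching template correlates
        highest with the image (a single-step simplification of the
        paper's two-step scheme)."""
        pa = max(correlation(image, t) for t in pa_templates)
        lat = max(correlation(image, t) for t in lateral_templates)
        return "PA" if pa >= lat else "lateral"

    # Toy stand-ins: a vertical ramp for the PA template, a horizontal
    # ramp for the lateral template, and a noisy "PA" test image.
    y, x = np.mgrid[0:32, 0:32].astype(float)
    rng = np.random.default_rng(0)
    image = y + rng.normal(0.0, 1.0, y.shape)
    view = classify_view(image, [y], [x])
    ```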

  11. Transformation from student to occupational therapist: Using the Delphi technique to identify the threshold concepts of occupational therapy.

    Science.gov (United States)

    Nicola-Richmond, Kelli M; Pépin, Geneviève; Larkin, Helen

    2016-04-01

    Understanding and facilitating the transformation from occupational therapy student to practitioner is central to the development of competent and work-ready graduates. However, the pivotal concepts and capabilities that need to be taught and learnt in occupational therapy are not necessarily explicit. The threshold concepts theory of teaching and learning proposes that every discipline has a set of transformational concepts that students must acquire in order to progress. As students acquire the threshold concepts, they develop a transformed way of understanding content related to their course of study, which contributes to their developing expertise. The aim of this study was to identify the threshold concepts of occupational therapy. The Delphi technique, a data collection method that aims to demonstrate consensus in relation to important questions, was used with three groups comprising final year occupational therapy students (n = 11), occupational therapy clinicians (n = 21) and academics teaching occupational therapy (n = 10) in Victoria, Australia. Participants reached consensus regarding 10 threshold concepts for the occupational therapy discipline. These are: understanding and applying the models and theories of occupational therapy; occupation; evidence-based practice; clinical reasoning; discipline-specific skills and knowledge; practising in context; a client-centred approach; the occupational therapist role; reflective practice; and a holistic approach. The threshold concepts identified provide valuable information for the discipline. They can potentially inform the development of competencies for occupational therapy and provide guidance for teaching and learning activities to facilitate the transformation to competent practitioner. © 2015 Occupational Therapy Australia.

  12. Impact during equine locomotion: techniques for measurement and analysis.

    Science.gov (United States)

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impacts experienced during strenuous exercise is an important first step towards understanding the mechanism of injury. Measurement and analysis of large, short-duration impacts is difficult. The measurement system must be able to record transient peaks and high frequencies accurately. The analysis technique must be able to characterise the impact signal in time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezo-electric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred these data to a PC-based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact. Over 99% of this energy was contained in frequencies below 1250 Hz. The sampling rate and the frequency response of a measurement system for recording impact should be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique for characterising impact data. The use of contour plots provides a highly visual representation of the time and frequency localisation of power during impact.
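
    The point about spectral energy and sampling rate can be illustrated with a plain FFT (the authors used a wavelet decomposition, which adds time localization); the impact signal below is synthetic, not measured hoof data:

    ```python
    import numpy as np

    def low_frequency_energy_fraction(signal, fs, cutoff_hz):
        """Fraction of total spectral energy below cutoff_hz, from a
        one-sided FFT. The sampling rate fs must exceed twice the
        highest frequency of interest."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return float(spectrum[freqs < cutoff_hz].sum() / spectrum.sum())

    # Synthetic "impact": a decaying 500 Hz oscillation sampled at
    # 10 kHz, comfortably above the Nyquist requirement for 1250 Hz.
    fs = 10_000
    t = np.arange(0, 0.1, 1.0 / fs)
    impact = np.exp(-40.0 * t) * np.sin(2 * np.pi * 500.0 * t)
    frac = low_frequency_energy_fraction(impact, fs, 1250.0)
    ```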

  13. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post-processor was examined in detail, and simulation techniques relevant to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single-error-source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified by (a) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the data computed by the LEA processor, and (b) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
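
    The RSS combination and covariance construction can be sketched directly; the deviation values below are hypothetical, not SVDS output:

    ```python
    import math

    def rss(deviations):
        """Root-sum-square of independent single-error-source deviations."""
        return math.sqrt(sum(d * d for d in deviations))

    def covariance_matrix(runs):
        """Covariance of trajectory deviations built from single-error-
        source cases: runs[k][i] is the deviation of parameter i in run k.
        The diagonal recovers the squared RSS values."""
        n = len(runs[0])
        return [[sum(run[i] * run[j] for run in runs) for j in range(n)]
                for i in range(n)]

    # Hypothetical (position, velocity) deviations at MECO from three
    # single-error-source runs:
    runs = [(3.0, 0.3), (4.0, 0.0), (12.0, 0.4)]
    combined = rss([r[0] for r in runs])   # RSS of position deviations
    cov = covariance_matrix(runs)
    ```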

  14. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased-mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to ensure that the fault tree technique is not used beyond its valid range. To this end, a critical review of the mathematical foundations of fault tree reliability analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of integrated software (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems
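
    For coherent fault trees with independent basic events, i.e. within the classical validity range the review discusses, the top-event probability follows from the minimal cut sets by inclusion-exclusion; the cut sets and event probabilities below are illustrative:

    ```python
    from itertools import combinations

    def top_event_probability(cut_sets, p):
        """Exact top-event probability by inclusion-exclusion over
        minimal cut sets, assuming independent basic events and a
        coherent (monotonic) structure function."""
        total = 0.0
        for k in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, k):
                events = set().union(*combo)
                prob = 1.0
                for e in events:
                    prob *= p[e]
                total += (-1) ** (k + 1) * prob
        return total

    # Illustrative tree with two minimal cut sets: {A, B} and {C}.
    p = {"A": 0.01, "B": 0.02, "C": 0.05}
    prob = top_event_probability([{"A", "B"}, {"C"}], p)
    # Exact value: P(AB) + P(C) - P(ABC) = 0.0002 + 0.05 - 0.00001
    ```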

  15. An optimized InCell Western screening technique identifies hexachlorophene as a novel potent TDP43 targeting drug.

    Science.gov (United States)

    Narayan, Malathi; Peralta, Diego A; Gibson, Chelsea; Zitnyar, Ashley; Jinwal, Umesh K

    2015-08-10

    TAR DNA-binding protein (TDP43) is a DNA- and RNA-binding protein that is implicated in several neurodegenerative disorders, termed "TDP43 proteinopathies", including Alzheimer's disease (AD), amyotrophic lateral sclerosis (ALS) and fronto-temporal lobe dementia (FTLD). We have developed an InCell Western (ICW) technique for screening TDP43-targeting drugs in 96-well plates. We tested 281 compounds and identified a novel compound, hexachlorophene (referred to as B10), that showed a potent reduction in TDP43 levels. The effect of B10 on TDP43 protein levels was validated in two different cellular models: N9 microglial cells expressing endogenous TDP43, and TDP43-over-expressing HEK293 and HeLa cells. We also analyzed the effect of B10 on various pathological forms of TDP43, such as the C25 cleaved fragment that localizes to the cytosol, insoluble high-molecular-weight species, and ALS-linked mutants. Our data suggest that B10 effectively reduces all forms of TDP43 and could serve as a potential drug molecule for the treatment of AD, ALS and other TDP43 proteinopathies. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production

    Directory of Open Access Journals (Sweden)

    Claudia Conesa

    2015-09-01

    Full Text Available Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode connected to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical treatment of the data then enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their combinations. The obtained prediction models are robust and reliable, and they are considered statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for monitoring enzymatic hydrolysis.

  17. Analysis of diatomaceous earth by x-ray fluorescence techniques

    International Nuclear Information System (INIS)

    Parker, J.

    1985-01-01

    The use of diatomaceous earth in industry as filtering aids, mineral fillers, catalyst carriers, chromatographic supports, and paint additives is well documented. The diatomite matrix is well suited to x-ray analysis, but this application has not been cited in the literature. In our laboratory, x-ray fluorescence spectrometry has been used to support the analytical needs of diatomite product development. Lithium borate fusion and pressed powder techniques have been used to determine major, minor, and trace elements in diatomite and synthetic silicate samples. Conventional matrix correction models and fundamental parameters have been used to reduce x-ray measurements to accurate chemical analyses. Described are sample and standard preparation techniques, data reduction methods, applications, and results

  18. Some problems of calibration technique in charged particle activation analysis

    International Nuclear Information System (INIS)

    Krasnov, N.N.; Zatolokin, B.V.; Konstantinov, I.O.

    1977-01-01

    It is shown that three different approaches to calibration, based on the average cross-section, the equivalent target thickness and the thick-target yield, are adequate. Using the concept of thick-target yield, a convenient charged-particle activation equation is obtained. The possibility of simultaneously determining two impurities from which the same isotope is formed is pointed out. The thick-target yield concept also facilitates the derivation of a simple formula for both absolute and comparative methods of analysis. The methodical error does not exceed 10%. Calibration and determination of the expected sensitivity based on the thick-target yield concept are likewise very convenient, because experimental determination of thick-target yield values is a much simpler procedure than obtaining an activation curve or excitation function. (T.G.)
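
    A sketch of the comparative method built on the thick-target yield concept: when sample and standard are irradiated and counted under identical conditions, the thick-target yield cancels in the activity ratio. All numerical values below are hypothetical.

    ```python
    import math

    def saturation_factor(half_life, t_irradiation):
        """1 - exp(-lambda * t): fraction of the saturation activity
        reached after irradiating for t_irradiation (same time units
        as the half-life)."""
        lam = math.log(2) / half_life
        return 1.0 - math.exp(-lam * t_irradiation)

    def impurity_concentration(activity_sample, activity_standard,
                               concentration_standard):
        """Comparative method: for thick targets irradiated and counted
        identically, induced activity is proportional to impurity
        concentration via the thick-target yield, which cancels in
        the ratio to the standard."""
        return concentration_standard * activity_sample / activity_standard

    # Hypothetical: a 20 ppm standard gives 1200 counts/s, the sample 300.
    c_sample = impurity_concentration(300.0, 1200.0, 20.0)
    ```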

  19. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take several requirements into account. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, because artworks are precious, it is necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportions of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr]

  20. Single Particle Tracking: Analysis Techniques for Live Cell Nanoscopy

    Science.gov (United States)

    Relich, Peter Kristopher, II

    Single-molecule experiments are a set of experiments designed specifically to study the properties of individual molecules. It has only been in the last three decades that single-molecule experiments have been applied to the life sciences, where they have been successfully implemented in systems biology for probing the behaviors of sub-cellular mechanisms. The advent and growth of super-resolution techniques in single-molecule experiments has made the fundamental behaviors of light and the associated nano-probes a necessary concern amongst life scientists wishing to advance the state of human knowledge in biology. This dissertation disseminates some of the practices learned in experimental live cell microscopy. The topic of single particle tracking is addressed here in a format that is designed for the physicist who embarks upon single-molecule studies. Specifically, the focus is on the procedures necessary to develop single particle tracking analysis techniques that can be implemented to answer biological questions. These analysis techniques range from designing and testing a particle tracking algorithm to inferring model parameters once an image has been processed. The intellectual contributions of the author include techniques in diffusion estimation, localization filtering, and trajectory association for tracking, all of which are discussed in detail in later chapters. The author of this thesis has also contributed to the software development of automated gain calibration, live cell particle simulations, and various single particle tracking packages. Future work includes further evaluation of this laboratory's single particle tracking software, entropy-based approaches to hypothesis validation, and the uncertainty quantification of gain calibration.
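
    One of the analysis steps named above, inferring a diffusion coefficient from a processed trajectory, can be sketched via the mean squared displacement; this minimal version ignores localization error and motion blur, which the dissertation treats:

    ```python
    import numpy as np

    def msd(track, max_lag):
        """Mean squared displacement of an (N, 2) trajectory for
        lags 1..max_lag (in frames)."""
        out = []
        for lag in range(1, max_lag + 1):
            d = track[lag:] - track[:-lag]
            out.append(float((d ** 2).sum(axis=1).mean()))
        return np.array(out)

    def estimate_diffusion(track, dt, max_lag=4):
        """For free 2-D Brownian motion, MSD(t) = 4*D*t, so D is the
        fitted slope of MSD vs lag time divided by 4."""
        lags = np.arange(1, max_lag + 1) * dt
        slope = np.polyfit(lags, msd(track, max_lag), 1)[0]
        return slope / 4.0

    # Simulated track: 10,000 Brownian steps with D = 0.5 um^2/s.
    rng = np.random.default_rng(1)
    D_true, dt = 0.5, 0.01
    steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(10_000, 2))
    track = np.cumsum(steps, axis=0)
    D_hat = estimate_diffusion(track, dt)
    ```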

  1. Analysis of Lipoaspirate Following Centrifugation: Wet Versus Dry Harvesting Technique.

    Science.gov (United States)

    Agostini, Tommaso; Spinelli, Giuseppe; Perello, Raffella; Bani, Daniele; Boccalini, Giulia

    2016-09-01

    The success of lipotransfer strongly depends on the harvesting, processing, and placement of the lipoaspirated samples. This study was designed to assess the histomorphometric characteristics and viability of fat harvested using different techniques (wet and dry) following centrifugation, as described by Coleman. The study enrolled 85 consecutive, nonrandomized, healthy patients from March 2010 to December 2014 (45 males and 40 females). The mean age was 40 years (range, 18-59 years), and the mean body mass index was 25.8 (range, 24-32). The authors performed a histological analysis (hematoxylin/eosin), morphometry (ImageJ 1.33 free-share image analysis software), and a viability assessment (Trypan Blue exclusion test; Sigma-Aldrich, Milan, Italy) of the lipoaspirated samples. The hematoxylin and eosin-stained sections exhibited similar features; in particular, clear-cut morphological signs of adipocyte disruption, apoptosis, or necrosis were not detected in the examined samples. Morphometry confirmed the visual findings, and the values of the mean surface area of the adipocyte vacuoles were not significantly different. Additionally, adipocyte viability was not significantly different among the analyzed fat tissue samples. The results of this study showed, for the first time, that there is no reduction in the viability of fat grafts harvested with the dry or wet technique following centrifugation according to the Coleman technique. Both methods of fat harvesting collect viable cells, which are not affected by standard centrifugation. Fat grafts harvested and processed by this technique could be used in clinical settings without increasing the reabsorption rate. Level of Evidence: V.

  2. Generic Meal Patterns Identified by Latent Class Analysis: Insights from NANS (National Adult Nutrition Survey)

    Directory of Open Access Journals (Sweden)

    Irina Uzhova

    2018-03-01

    Full Text Available Nutritional data reduction methods are widely applied in nutrition epidemiology in order to classify individuals into meaningful groups with similar dietary patterns. To date, none of the existing studies have applied latent class analysis to examine dietary patterns that include the meal types consumed throughout a day. We investigated the main meal patterns followed on weekends and weekdays, and evaluated their associations with cardio-metabolic biomarkers. The analyses were performed within the NANS (National Adult Nutrition Survey), a cross-sectional national food consumption survey of 1500 nationally representative Irish adults. A total of seven dietary patterns were identified using latent class analysis. The typical meal pattern followed by the majority of the population was characterized by consumption of cereal or toast for breakfast, skipping the light meal or consuming a sandwich, and meat or fish with potatoes, pasta or vegetables for the main meal. Eating patterns differed on weekends, and participants who consumed meat and eggs for breakfast instead of breakfast cereal and skipped the light meal were more likely to have an unhealthier dietary pattern, a higher diastolic blood pressure, and increased serum ferritin. The application of data reduction techniques to simplify the multifaceted nature of dietary data is a useful approach to derive patterns, which might shed further light on the typical dietary patterns followed by populations.

  3. Generic Meal Patterns Identified by Latent Class Analysis: Insights from NANS (National Adult Nutrition Survey).

    Science.gov (United States)

    Uzhova, Irina; Woolhead, Clara; Timon, Claire M; O'Sullivan, Aifric; Brennan, Lorraine; Peñalvo, José L; Gibney, Eileen R

    2018-03-06

    Nutritional data reduction methods are widely applied in nutrition epidemiology in order to classify individuals into meaningful groups with similar dietary patterns. To date, none of the existing studies have applied latent class analysis to examine dietary patterns that include the meal types consumed throughout a day. We investigated the main meal patterns followed on weekends and weekdays, and evaluated their associations with cardio-metabolic biomarkers. The analyses were performed within the NANS (National Adult Nutrition Survey), a cross-sectional national food consumption survey of 1500 nationally representative Irish adults. A total of seven dietary patterns were identified using latent class analysis. The typical meal pattern followed by the majority of the population was characterized by consumption of cereal or toast for breakfast, skipping the light meal or consuming a sandwich, and meat or fish with potatoes, pasta or vegetables for the main meal. Eating patterns differed on weekends, and participants who consumed meat and eggs for breakfast instead of breakfast cereal and skipped the light meal were more likely to have an unhealthier dietary pattern, a higher diastolic blood pressure, and increased serum ferritin. The application of data reduction techniques to simplify the multifaceted nature of dietary data is a useful approach to derive patterns, which might shed further light on the typical dietary patterns followed by populations.

  4. Identifying the Role of the National Digital Cadastral Database (NDCDB) in Malaysia and for Land-Based Analysis

    Science.gov (United States)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Yusof, O. M.; Wazir, M. A. M.; Adimin, M. K.

    2017-10-01

    This paper explains the process carried out in identifying the significant role of the NDCDB in Malaysia, specifically in land-based analysis. The research was initially part of a larger research exercise to identify the significance of the NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of the NDCDB from the role standpoint. Seven statements pertaining to the significant role of the NDCDB in Malaysia and land-based analysis were established after three rounds of consensus building. The agreed statements provided a clear definition of the important role of the NDCDB in Malaysia and for land-based analysis, which had previously been studied only to a limited extent, leading to an unclear perception among the general public and even the geospatial community. The connection of the statements with disaster management is discussed concisely at the end of the research.

  5. Comparison of chromosome analysis using cell culture by coverslip technique with flask technique.

    Science.gov (United States)

    Sajapala, Suraphan; Buranawut, Kitti; Niwat Arunyakasemsuk, MD

    2014-02-01

    To determine the accuracy rate of chromosome study from amniotic cell culture by the coverslip technique compared with the flask technique, and to compare the culture time, the amount of culture media, and the cost of amniotic cell culture. Cross-sectional study. Department of Obstetrics and Gynecology, Phramongkutklao Hospital. Subjects: 70 pregnant women who underwent amniocentesis at Phramongkutklao Hospital during November 1, 2007 to February 29, 2008. Amniotic cell culture by the flask technique and the coverslip technique. Accuracy of amniotic cell culture for chromosome study by the coverslip technique compared with the flask technique. In total, 70 pregnant women underwent amniocentesis, and the amniotic fluid was divided for cell culture by the flask technique and the coverslip technique. 69 samples gave the same result from both techniques. The one remaining sample failed to culture by both methods due to blood contamination. Accuracy of the coverslip technique was 100% compared with the flask technique. In culture time, amount of culture media, and cost, the coverslip technique required less than the flask technique. There was statistically significant agreement in chromosome results between the coverslip and flask techniques, and the coverslip technique required less time, less amniotic cell culture media, and lower costs than the flask technique.

  6. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of the fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was used to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density was calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.
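
    The normalized radial density described above can be sketched as a count fraction over annular bins of perimeter-to-center distance (the paper's actual density calculation may differ, e.g. by area normalization); the feature distances below are hypothetical:

    ```python
    import numpy as np

    def radial_density(distances, n_bins=4):
        """Fraction of bright features per bin of normalized distance
        from the nuclear perimeter (0.0) to the centre (1.0). A full
        density estimate would also divide by annular area; this
        count-fraction form keeps the sketch minimal."""
        hist, _ = np.histogram(distances, bins=n_bins, range=(0.0, 1.0))
        return hist / hist.sum()

    # Hypothetical normalized feature distances for two nuclei:
    uniform = radial_density(np.array([0.1, 0.3, 0.6, 0.9]))
    peripheral = radial_density(np.array([0.05, 0.1, 0.2, 0.9]))
    ```

    Comparing such profiles between phenotypes is the kind of quantitative readout the study builds on.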

  7. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

    Flow injection analysis is increasingly used as a process-control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and their gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. With certain modifications, this technique is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab
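
    The colorimetric determination reduces to a linear calibration plus a precision (relative standard deviation) check; the absorbance values below are hypothetical, not the paper's data:

    ```python
    def fit_line(x, y):
        """Ordinary least-squares slope and intercept for a calibration
        curve of detector response vs concentration."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
                 / sum((a - mx) ** 2 for a in x))
        return slope, my - slope * mx

    def percent_rsd(values):
        """Relative standard deviation (%), the precision figure quoted
        for replicate injections."""
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        return 100.0 * var ** 0.5 / mean

    # Hypothetical absorbance readings at 410 nm for U(VI) standards:
    conc = [50.0, 100.0, 200.0, 300.0]        # mg/ml
    absorbance = [0.10, 0.20, 0.40, 0.60]
    slope, intercept = fit_line(conc, absorbance)
    unknown = (0.25 - intercept) / slope      # concentration of an unknown
    ```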

  8. Acceleration of multivariate analysis techniques in TMVA using GPUs

    CERN Document Server

    Hoecker, A; Therhaag, J; Washbrook, A

    2012-01-01

    A feasibility study into the acceleration of multivariate analysis techniques using Graphics Processing Units (GPUs) will be presented. The MLP-based Artificial Neural Network method contained in the TMVA framework has been chosen as a focus for investigation. It was found that the network training time on a GPU was lower than for CPU execution as the complexity of the network was increased. In addition, multiple neural networks can be trained simultaneously on a GPU within the same time taken for single network training on a CPU. This could be potentially leveraged to provide a qualitative performance gain in data classification.

  9. Reduction and analysis techniques for infrared imaging data

    Science.gov (United States)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. How to turn pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages: acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near infrared imaging.

  10. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  11. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction processes of nuclear and X-ray radiation with a sample, leading to their absorption and backscattering, to the ionization of gases or excitation of fluorescent X-ray by radiation, but not to the activation of determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence by photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles and backscattering of beta radiation. The significant advantage of these methods is that they are nondestructive. (author)

  12. Human errors identification using the human factors analysis and classification system technique (HFACS)

    Directory of Open Access Journals (Sweden)

    G. A. Shirali

    2013-12-01

    Result: In this study, 158 accident reports from the Ahvaz steel industry were analyzed using the HFACS technique. The analysis showed that most human errors were related to skill-based errors at the first level, to the physical environment at the second level, to inadequate supervision at the third level, and to the management of resources at the fourth level. Conclusion: Studying and analyzing past events using the HFACS technique can identify the major and root causes of accidents and can be effective in preventing the repetition of such mishaps. It can also be used as a basis for developing strategies to prevent future events in steel industries.

  13. Using Quantitative Data Analysis Techniques for Bankruptcy Risk Estimation for Corporations

    Directory of Open Access Journals (Sweden)

    Ştefan Daniel ARMEANU

    2012-01-01

    Full Text Available Diversification of methods and techniques for the quantification and management of risk has led to the development of many mathematical models, a large part of which focus on measuring bankruptcy risk for businesses. In financial analysis there are many indicators which can be used to assess the risk of bankruptcy of enterprises, but to make an assessment the number of indicators must be reduced, and this can be achieved through principal component, cluster, and discriminant analysis techniques. In this context, the article aims to build a scoring function used to identify bankrupt companies, using a sample of companies listed on the Bucharest Stock Exchange.
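The scoring-function idea in this record reduces many financial ratios to a single discriminant value compared against a cutoff. A minimal sketch of such a classifier; the weights, cutoff, and ratios here are hypothetical illustrations, not the values estimated in the cited article:

```python
# Hypothetical weights and cutoff; a real model would estimate them via
# principal component / discriminant analysis on listed-company data.
WEIGHTS = {"liquidity": 1.2, "profitability": 3.3, "leverage": -0.8}
CUTOFF = 1.0

def z_score(ratios):
    """Linear discriminant score: weighted sum of financial ratios."""
    return sum(WEIGHTS[k] * ratios[k] for k in WEIGHTS)

def at_risk(ratios):
    """Classify a firm as bankruptcy-prone when its score falls below the cutoff."""
    return z_score(ratios) < CUTOFF

firm = {"liquidity": 0.4, "profitability": 0.05, "leverage": 1.5}
print(z_score(firm), at_risk(firm))
```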

  14. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
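The interval estimate described above rests on combining elemental uncertainties into an expanded interval believed to contain the true value. A minimal root-sum-square sketch; the component values and coverage factor are hypothetical, not from the cited presentation:

```python
import math

def combined_uncertainty(random_components, systematic_components):
    """Root-sum-square combination of elemental uncertainty components."""
    return math.sqrt(sum(u * u for u in random_components + systematic_components))

def coverage_interval(value, u_combined, k=2.0):
    """Expanded interval believed to contain the true value (k=2 is roughly 95%)."""
    return value - k * u_combined, value + k * u_combined

# Hypothetical components for a module power measurement, in watts
u = combined_uncertainty([0.3, 0.4], [0.5])
print(round(u, 4), coverage_interval(100.0, u))
```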

  16. Pattern recognition and data mining techniques to identify factors in wafer processing and control determining overlay error

    Science.gov (United States)

    Lam, Auguste; Ypma, Alexander; Gatefait, Maxime; Deckers, David; Koopman, Arne; van Haren, Richard; Beltman, Jan

    2015-03-01

    On-product overlay can be improved through the use of context data from the fab and the scanner. Continuous improvements in lithography and processing performance over the past years have resulted in corresponding overlay performance improvements for critical layers. Identification of the remaining factors causing systematic disturbances and inefficiencies will further reduce overlay. By building a context database, mappings between context, fingerprints, and alignment & overlay metrology can be learned through techniques from pattern recognition and data mining. We relate structure ('patterns') in the metrology data to relevant contextual factors. Once understood, these factors can be moved to the known effects (e.g. the presence of systematic fingerprints from reticle writing error or lens and reticle heating). Hence, we build up a knowledge base of known effects based on data. Outcomes from such an integral ('holistic') approach to lithography data analysis may be exploited in a model-based predictive overlay controller that combines feedback and feedforward control [1]. In this way, the available measurements from scanner, fab, and metrology equipment are combined to reveal opportunities for further overlay improvement which would otherwise go unnoticed.

  17. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis technique compared with other detection methods. Thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium are important nutrients required by the human body for optimal physiological function. Therefore, the Ca and K content of various foods needs to be determined. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison of methods served to cross-check the analysis results and to overcome the limitations of the three methods. The results showed that Ca found in food using EDXRF and AAS was not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
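Method comparisons like the Ca and K cross-checks above typically report a Pearson correlation between paired results from the two techniques. A minimal sketch of that statistic; the paired values below are hypothetical, not the study's data:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired Ca results (mg/100 g) from two methods on the same samples
edxrf = [120.0, 95.0, 150.0, 80.0, 110.0]
aas = [118.0, 97.0, 148.0, 83.0, 108.0]
print(round(pearson(edxrf, aas), 4))
```

Values near 1, as reported in the record, indicate the two methods rank and scale the samples almost identically.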

  18. Envelopment technique and topographic overlays in bite mark analysis.

    Science.gov (United States)

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays, and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and the inter- and intraoperator reliability was found to be statistically significant at the 5% level, i.e., 95% confidence (P < 0.05).
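Interoperator reliability as assessed above uses Spearman's rank correlation on paired overlay scores. A minimal tie-aware sketch; the operator scores below are hypothetical, not the study's data:

```python
def ranks(values):
    """Assign ranks, giving tied values the average of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of rank positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical matching-accuracy scores from two operators on the same overlays
op1 = [4, 3, 5, 2, 4, 5]
op2 = [4, 2, 5, 2, 3, 5]
print(round(spearman(op1, op2), 3))
```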

  19. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques for quantifying caloric-induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems; the relevance is that if slow phase velocity measurements changed between ENG and VNG testing, normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation concerning the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways: each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor, and the computer algorithm, and the data were compared. All four systems made similar measurements, but our inexperienced assessor failed to recognize responses as sporadic or scant; we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  20. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization could support interaction. We have analyzed service interaction cases in a context of technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model the situational service interaction. Our contribution regarding technology-mediated service interaction design is twofold: first, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces...

  1. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
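A global method of the kind favored above samples all inputs jointly, so that interactions between inputs contribute to the measured sensitivity. A crude correlation-based sketch on a toy nonlinear model; the model and the resulting indices are illustrative only, not the methods or results of the cited report:

```python
import random

def model(x1, x2, x3):
    # Toy stand-in for a behavioral model: nonlinear, with an interaction term.
    return x1 + x1 * x2 + x3 ** 2

def correlation_sensitivity(n=2000, seed=0):
    """Crude global sensitivity: correlation of each input with the output
    under joint random sampling, so interaction effects are not excluded."""
    rng = random.Random(seed)
    samples = [[rng.uniform(0, 1) for _ in range(3)] for _ in range(n)]
    ys = [model(*s) for s in samples]
    my = sum(ys) / n
    var_y = sum((b - my) ** 2 for b in ys) / n
    indices = []
    for i in range(3):
        xs = [s[i] for s in samples]
        mx = sum(xs) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
        var_x = sum((a - mx) ** 2 for a in xs) / n
        indices.append(cov / (var_x * var_y) ** 0.5)
    return indices

print([round(s, 2) for s in correlation_sensitivity()])
```

Variance-based methods (e.g. Sobol indices) refine this idea by decomposing output variance into main and interaction effects.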

  2. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at pressures up to 10^8 Pa using particle diameters of 1.7 µm. This increases the efficiency, the resolution, and the speed of the separation. Four aqueous selenium standards were separated within 1.2 min on a 1.00 id x 50 mm reversed-phase column in an ion-pair chromatographic system using a flow rate of 200 µL min⁻¹. Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection ... the use of short columns. Hence, analysis times could be halved without loss of separation efficiency in this biological sample...

  3. Identifying unrecognized collecting system entry and the integrity of repair during open partial nephrectomy: comparison of two techniques

    Directory of Open Access Journals (Sweden)

    Sandhya R. Rao

    2014-10-01

    Full Text Available Purpose: To compare retrograde dye injection through an externalized ureteral catheter with direct needle injection of dye into the proximal ureter for identification of unrecognized collecting system disruption and the integrity of the subsequent repair during open partial nephrectomy. Materials and Methods: We retrospectively reviewed the records of 259 consecutive patients who underwent open partial nephrectomy. Externalized ureteral catheters were placed preoperatively in 110 patients (Group 1); needle injection of methylene blue directly into the proximal ureter was used in 120 patients (Group 2). No assessment of the collecting system was performed in 29 patients (Group 3). We compared intraoperative parameters, tumor characteristics, collecting system entry, and the incidence of urine leaks among the three groups. Results: The mean tumor diameter was 3.1 cm in Group 1, 3.6 cm in Group 2, and 3.8 cm in Group 3 (p = 0.04); mean EBL was 320 cc, 351 cc, and 376 cc (p = 0.5); mean operative time was 193.5 minutes, 221 minutes, and 290 minutes (p < 0.001). Collecting system entry was recognized in 63%, 76%, and 38% of cases in Groups 1, 2, and 3, respectively (p = 0.07). Postoperative urine leaks requiring some form of management occurred in 11 patients from Group 1 and 6 from Group 2 (p = 0.2). No patient in Group 3 developed a urinary leak. Conclusions: Identification of unrecognized collecting system disruption as well as the postoperative urine leak rate in patients undergoing partial nephrectomy were not influenced by the intraoperative technique of identifying unrecognized collecting system entry. Postoperative urine leaks are uncommon despite recognized collecting system disruption in the majority of patients.

  4. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, how to predict grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with spectral technology. Remote sensing has been applied in monitoring the living, growing, and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing techniques, which can monitor grasshoppers more exactly, have advantages in measuring the damage degree and classifying damage areas of grasshoppers, so they can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyperspectra and the leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences, and monitoring hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that the spectral analysis technique could be used as a quick and exact tool in monitoring and forecasting grasshopper infestation, and will become an important means in such research for its advantages in determining spatial orientation, information extraction, and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring...

  5. Analysis of minor phase with neutron diffraction technique

    International Nuclear Information System (INIS)

    Engkir Sukirman; Herry Mugirahardjo

    2014-01-01

    The presence of minor phases in a sample has been analyzed with the neutron diffraction technique. In this research, a sample of Fe nanoparticles (FNP) was selected as the object of a case study. The first step was to prepare the FNP sample by the ball milling technique; the milled sample was referred to as FIC2. The phases formed in FIC2 were analyzed qualitatively and quantitatively using high resolution neutron diffraction (HRPD) and X-ray diffraction (XRD) techniques. The diffraction data were analyzed by means of the Rietveld method using the FullProf computer code, with reference to supporting data, namely the particle size and magnetic properties of the materials, obtained from a Particle Size Analyzer (PSA) and a Vibrating Sample Magnetometer (VSM), respectively. The analysis shows that the quality of fitting for the neutron diffraction pattern is better than that for the X-ray diffraction pattern. The HRPD data revealed that FIC2 consists of Fe, γ-Fe2O3, and Fe3O4 phases in amounts of 78.62, 21.37, and 0.01%, respectively. From the XRD data, FIC2 consists of Fe and γ-Fe2O3 phases in amounts of 99.96 and 0.04%, respectively; the presence of the Fe3O4 phase was not observed. With the neutron diffraction technique, the presence of a minor phase can be determined accurately. (author)

  6. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Full Text Available Data mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information which may be essential to an organization. The extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but few efforts have been made in the field of criminology, and fewer still have compared the information all these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity in society. Criminals can also be predicted based on crime data. The main aim of this work is to survey the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.

  7. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analysis that was capable of phenomenological prediction but did not represent physiologic parameters well. In this paper, the basis of fluid dynamics at the physiologic state is explained using established equations of transport phenomena. Then, microscopic- and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory, and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.

  8. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS 18 and DEAP 2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total factor productivity (TFP) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale, and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. The productivity of the hospitals generally had an increasing trend; however, the total average productivity decreased, and among the components of total productivity, variation in technological efficiency had the highest impact on the reduction of the total average productivity.
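The Malmquist index used above compares each unit's efficiency against the frontiers of two periods. In the degenerate single-input, single-output case, DEA-CCR efficiency reduces to a productivity ratio, which keeps the sketch short; all data here are hypothetical, and whether values above or below 1 denote improvement depends on the orientation convention adopted:

```python
def efficiency(unit, frontier):
    """Input-oriented CCR efficiency for one input and one output:
    the unit's output/input ratio relative to the best ratio observed."""
    best = max(o / i for i, o in frontier)
    i, o = unit
    return (o / i) / best

def malmquist(unit_t, unit_t1, frontier_t, frontier_t1):
    """Malmquist TFP index between periods t and t+1: the geometric mean
    of the efficiency ratios measured against each period's frontier."""
    e_t_t = efficiency(unit_t, frontier_t)
    e_t1_t = efficiency(unit_t1, frontier_t)
    e_t_t1 = efficiency(unit_t, frontier_t1)
    e_t1_t1 = efficiency(unit_t1, frontier_t1)
    return ((e_t1_t / e_t_t) * (e_t1_t1 / e_t_t1)) ** 0.5

# One hypothetical hospital (input: beds, output: treated patients, in hundreds)
# measured against hypothetical frontiers for two years.
frontier_2007 = [(10, 8), (12, 9), (9, 9)]
frontier_2008 = [(10, 9), (12, 10), (9, 9)]
print(malmquist((10, 8), (10, 9), frontier_2007, frontier_2008))
```

With many inputs and outputs, each efficiency score instead comes from solving a linear program per unit, as DEA software such as DEAP does.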

  9. Maintenance Audit through Value Analysis Technique: A Case Study

    Science.gov (United States)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes, and rising requirements for quality and service have forced a change in the design and application of maintenance, as well as in the way it is considered within managerial strategy. There are numerous maintenance activities that must be developed in a service company, and as a result the maintenance functions as a whole are often outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibility, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology is based on expert systems: the expert system uses rules together with the SMART weighting technique and value analysis to obtain the weighting between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, within rule-based expert systems.

  10. Geospatial techniques to Identify the Location of Farmers Markets and Community Gardens within Food Deserts in Virginia

    Science.gov (United States)

    Sriharan, S.; Meekins, D.; Comar, M.; Bradshaw, S.; Jackson, L.

    2017-12-01

    Specifically, a food desert is defined as an area where populations live more than one mile from a supermarket or large grocery store if in an urban area, or more than 10 miles from a supermarket or large grocery store if in a rural area (Ver Ploeg et al. 2012). According to the U.S. Department of Agriculture, a food desert is "an area in the United States with limited access to affordable and nutritious food, particularly such an area composed of predominately lower-income neighborhoods and communities" (110th Congress 2008). Three fourths of these food deserts are urban. In the Commonwealth of Virginia, Petersburg City is among the eight primary localities whose population is living in a food desert. This project will compare the identified food deserts in Virginia (areas around Virginia State University) with a focus on where farmers markets and community gardens are being established. The hypothesis of this study is that these minority groups do not get healthy food due to limited access to grocery stores and superstores. To address this problem, the community development activities should focus on partnering local Petersburg convenience stores with farmers and community gardeners to sell fresh produce. Existing data were collected on convenience stores and community gardens in Petersburg City and Chesterfield County, and new data were generated for Emporia, Lynchburg, and Hopewell. The data were compiled through fieldwork and mapped with ArcGIS to show where markets and gardens are being established and to create a spatial analysis of their locations. The localities reflect both rural and urban areas.
The project provides educational support for students who will find solutions to community problems by developing activities to: (a) define and examine characteristics of food deserts, (b) identify causes and consequences of food deserts and determine if their community is a food desert, (c) research the closest food desert to their school, and (d) design solutions to help

  11. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva, with its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper were developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  12. Determination of minor and trace elements concentration in kidney stones using elemental analysis techniques

    Science.gov (United States)

    Srivastava, Anjali

    The determination of the accurate material composition of a kidney stone is crucial for understanding its formation as well as for preventive therapeutic strategies. Radiation-probing instrumental activation analysis techniques are excellent tools for identifying the materials present in a kidney stone. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different materials detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. Experimental studies were used in conjunction with analytical techniques to determine the exact composition of the kidney stone. The open-source program Python Multi-Channel Analyzer was used to unfold the XRF spectrum. A new type of experimental set-up was developed and utilized for XRF and NAA analysis of the kidney stone. To verify the experimental results against analytical calculation, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y and Zr, and those identified by neutron activation analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb and Zn. This thesis presents a new approach for accurate determination of the material composition of kidney stones using XRF and NAA instrumental activation analysis techniques.

  13. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to their analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
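
    The core idea of merging adjacent spatial elements while they remain statistically consistent within their errors can be illustrated with a simplified one-dimensional sketch. This is an illustrative analogue only, not BaTMAn's actual algorithm; the greedy left-to-right merging and the k-sigma threshold are assumptions:

```python
import numpy as np

def merge_consistent(signal, error, k=3.0):
    """Greedy 1-D analogue of error-aware tessellation: merge each element
    into the current segment while its value agrees with the running segment
    mean within k sigma of the combined error. Returns per-element labels."""
    labels = np.zeros(len(signal), dtype=int)
    seg_sum, seg_var, seg_n, cur = signal[0], error[0] ** 2, 1, 0
    for i in range(1, len(signal)):
        mean = seg_sum / seg_n
        mean_err = np.sqrt(seg_var) / seg_n      # error on the running mean
        if abs(signal[i] - mean) < k * np.sqrt(mean_err ** 2 + error[i] ** 2):
            seg_sum += signal[i]; seg_var += error[i] ** 2; seg_n += 1
        else:                                     # inconsistent: start new segment
            cur += 1
            seg_sum, seg_var, seg_n = signal[i], error[i] ** 2, 1
        labels[i] = cur
    return labels

signal = np.array([1.0, 1.1, 0.9, 5.0, 5.2, 4.9])
errors = np.full(6, 0.2)
labels = merge_consistent(signal, errors)  # two segments: low then high
```

The real algorithm works on 2-D tessellations and a full Bayesian consistency test; the sketch only shows why noisy but compatible elements collapse into one region while a genuine signal jump starts a new one.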

  14. Coexpression analysis identifies nuclear reprogramming barriers of somatic cell nuclear transfer embryos.

    Science.gov (United States)

    Zuo, Yongchun; Su, Guanghua; Cheng, Lei; Liu, Kun; Feng, Yu; Wei, Zhuying; Bai, Chunling; Cao, Guifang; Li, Guangpeng

    2017-09-12

    The success of the cloned sheep "Dolly" demonstrated that the somatic cell nuclear transfer (SCNT) technique holds huge potential for mammalian asexual reproduction. However, the extremely poor development of SCNT embryos indicates that the underlying molecular mechanisms remain largely unexplored. Deciphering the spatiotemporal patterns of gene expression in SCNT embryos is a crucial step toward understanding the mechanisms associated with nuclear reprogramming. In this study, a valuable transcriptome resource of SCNT embryos derived from different inter- and intra-species donor cells was first established. The gene co-expression analysis identified 26 cell-specific modules, and a series of regulatory pathways related to reprogramming barriers were further enriched. Compared to the intra-SCNT embryos, the inter-SCNT embryos underwent only partial reprogramming. The transcripts of master genome-trigger genes, those related to the TFIID subunits, RNA polymerases and mediators, were incompletely activated in inter-SCNT embryos. The inter-SCNT embryos merely consumed the stored maternal mRNA of these master regulators but failed to activate the self-sustained pathway of RNA polymerases. The KDM family of epigenetic regulators was also seriously delayed in the inter-SCNT embryo reprogramming process. Our study provides new insight into the mechanisms of nuclear reprogramming.

  15. Probabilistic Latent Semantic Analysis Applied to Whole Bacterial Genomes Identifies Common Genomic Features

    Directory of Open Access Journals (Sweden)

    Rusakovica J.

    2014-06-01

    Full Text Available The spread of drug resistance amongst clinically-important bacteria is a serious, and growing, problem [1]. However, the analysis of entire genomes requires considerable computational effort, usually including the assembly of the genome and subsequent identification of genes known to be important in pathology. An alternative approach is to use computational algorithms to identify genomic differences between pathogenic and non-pathogenic bacteria, even without knowing the biological meaning of those differences. To overcome this problem, a range of techniques for dimensionality reduction have been developed. One such approach is known as latent-variable models [2]. In latent-variable models, dimensionality reduction is achieved by representing high-dimensional data by a few hidden or latent variables, which are not directly observed but inferred from the observed variables present in the model. Probabilistic Latent Semantic Indexing (PLSA) is an extension of LSA [3]. PLSA is based on a mixture decomposition derived from a latent class model. The main objective of the algorithm, as in LSA, is to represent high-dimensional co-occurrence information in a lower-dimensional way in order to discover the hidden semantic structure of the data using a probabilistic framework.
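
    The mixture decomposition behind PLSA, P(d,w) = sum_z P(z)P(d|z)P(w|z), is typically fitted by expectation-maximisation. A minimal NumPy sketch on a toy document-word count matrix follows; the initialisation, iteration count and smoothing constant are arbitrary illustrative choices, not taken from the article:

```python
import numpy as np

def plsa(counts, n_topics, n_iter=100, seed=0):
    """Fit PLSA by EM on a document-word count matrix n(d, w).
    Returns (P(z), P(d|z), P(w|z))."""
    rng = np.random.default_rng(seed)
    n_docs, n_words = counts.shape
    p_z = np.full(n_topics, 1.0 / n_topics)
    p_d_z = rng.random((n_topics, n_docs)); p_d_z /= p_d_z.sum(1, keepdims=True)
    p_w_z = rng.random((n_topics, n_words)); p_w_z /= p_w_z.sum(1, keepdims=True)
    for _ in range(n_iter):
        # E-step: P(z|d,w) proportional to P(z) P(d|z) P(w|z)
        joint = p_z[:, None, None] * p_d_z[:, :, None] * p_w_z[:, None, :]
        post = joint / (joint.sum(0, keepdims=True) + 1e-12)   # shape (z, d, w)
        # M-step: re-estimate factors from expected counts n(d,w) P(z|d,w)
        nz = counts[None, :, :] * post
        p_d_z = nz.sum(2); p_d_z /= p_d_z.sum(1, keepdims=True) + 1e-12
        p_w_z = nz.sum(1); p_w_z /= p_w_z.sum(1, keepdims=True) + 1e-12
        p_z = nz.sum((1, 2)); p_z /= p_z.sum()
    return p_z, p_d_z, p_w_z

# Toy corpus: two word blocks split across four documents
counts = np.array([[10, 10, 0, 0],
                   [8, 12, 0, 0],
                   [0, 0, 10, 10],
                   [0, 0, 12, 8]], dtype=float)
p_z, p_d_z, p_w_z = plsa(counts, n_topics=2, n_iter=200)
```

On genomic data the "documents" would be genomes and the "words" sequence features; the low-dimensional topics then summarise co-occurrence structure without any biological labels.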

  16. DNA barcoding techniques used to identify the shared ichthyofauna between the Pantanal floodplain and Upper Parana River.

    Science.gov (United States)

    da Costa-Silva, Guilherme J; Yuldi Ashikaga, Fernando; Kioko Shimabukuro Dias, Cristiane; Garcia Pereira, Luiz Henrique; Foresti, Fausto; Oliveira, Claudio

    2017-11-20

    The biological invasion process is a widely debated topic, as the population depletion of some species and the extinction of others are related to this process. To accelerate the identification of species and to detect non-native forms, new tools are being developed, such as those based on genetic markers. This study aimed to use DNA barcoding methodology to identify fish species that had translocated between the Parana and Paraguay River Basins. Based on a database of two studies that were conducted in these regions, 289 sequences of Cytochrome c Oxidase subunit I (COI) were used for Generalized Mixed Yule Coalescent (GMYC) analysis, including 29 morphospecies that were sampled in both river basins. As a result, we observed that while some morphospecies have low variation, demonstrating a recent occupation of the basins, other morphospecies probably represent species complexes. A third of the morphospecies had well-defined lineages, but not differentiated enough to be treated as different Molecular Operational Taxonomic Units (MOTUs). These results demonstrate that human interventions possibly participated in the distribution of some lineages. However, biogeographical historical processes are also important for the morphospecies distribution. The data suggest that the number of species present in these two basins is underestimated and that human actions can irreversibly affect the natural history of the species in these regions.

  17. Motor current and leakage flux signature analysis technique for condition monitoring

    International Nuclear Information System (INIS)

    Pillai, M.V.; Moorthy, R.I.K.; Mahajan, S.C.

    1994-01-01

    Until recently, analysis of vibration signals was the only means available to predict the state of health of plant equipment. Motor current and leakage magnetic flux signature analysis is acquiring importance as a technique for detecting incipient damage in electrical machines and as a supplementary technique for diagnostics of driven equipment such as centrifugal and reciprocating pumps. The state of health of the driven equipment is assessed by analysing the time signal, the frequency spectrum and trends. For example, the pump vane frequency, piston stroke frequency, gear frequency and bearing frequencies are indicated in the current and flux spectra. By maintaining a periodic record of the amplitudes of the various frequency lines in the spectra, it is possible to follow the trend of deterioration of parts and components of the pump. All problems arising out of inappropriate mechanical alignment of vertical pumps are easily identified by a combined analysis of current, flux and vibration signals. It is found that the current signature analysis technique is in itself a sufficient method for analysing the state of health of reciprocating pumps and compressors. (author). 10 refs., 4 figs
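
    Tracking the amplitude of characteristic frequency lines in the current spectrum is straightforward with an FFT. The sketch below measures a chosen spectral line in a simulated current signal; the 50 Hz supply and the 127 Hz "fault" line are illustrative assumed values, not figures from the paper:

```python
import numpy as np

def line_amplitude(signal, fs, target_hz, bw=1.0):
    """Amplitude of the strongest spectral line within target_hz +/- bw Hz,
    from a one-sided FFT amplitude spectrum."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) * 2.0 / n      # scale to sine amplitude
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = (freqs >= target_hz - bw) & (freqs <= target_hz + bw)
    return spec[band].max()

fs = 5000.0                        # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)      # 2 s record
# Simulated current: supply line plus a small fault-related sideband
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 127 * t)
a_supply = line_amplitude(current, fs, 50.0)    # close to 1.0
a_fault = line_amplitude(current, fs, 127.0)    # close to 0.05
```

Logging `a_fault` periodically gives exactly the kind of amplitude trend the abstract describes for vane, stroke, gear and bearing frequency lines.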

  18. Statistical techniques to construct assays for identifying likely responders to a treatment under evaluation from cell line genomic data

    International Nuclear Information System (INIS)

    Huang, Erich P; Fridlyand, Jane; Lewin-Koh, Nicholas; Yue, Peng; Shi, Xiaoyan; Dornan, David; Burington, Bart

    2010-01-01

    Developing the right drugs for the right patients has become a mantra of drug development. In practice, it is very difficult to identify subsets of patients who will respond to a drug under evaluation. Most of the time, no single diagnostic will be available, and more complex decision rules will be required to define a sensitive population, using, for instance, mRNA expression, protein expression or DNA copy number. Moreover, diagnostic development will often begin with in-vitro cell-line data and a high-dimensional exploratory platform, only later to be transferred to a diagnostic assay for use with patient samples. In this manuscript, we present a novel approach to developing robust genomic predictors that are not only capable of generalizing from in-vitro to patient, but are also amenable to clinically validated assays such as qRT-PCR. Using our approach, we constructed a predictor of sensitivity to dacetuzumab, an investigational drug for CD40-expressing malignancies such as lymphoma, using genomic measurements of cell lines treated with dacetuzumab. Additionally, we evaluated several state-of-the-art prediction methods by independently pairing the feature selection and classification components of the predictor. In this way, we constructed several predictors that we validated on an independent DLBCL patient dataset. Similar analyses were performed on genomic measurements of breast cancer cell lines and patients to construct a predictor of estrogen receptor (ER) status. The best dacetuzumab sensitivity predictors involved ten or fewer genes and accurately classified lymphoma patients by their survival and known prognostic subtypes. The best ER status classifiers involved one or two genes and led to accurate ER status predictions more than 85% of the time. The novel method we proposed performed as well as or better than the other methods evaluated. We demonstrated the feasibility of combining feature selection techniques with classification methods to develop assays
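
    The general pattern of independently pairing a feature-selection step with a classification step can be sketched as follows. This is a toy illustration on synthetic data, using a simple t-like filter and a nearest-centroid classifier as stand-ins for the several state-of-the-art methods the manuscript actually evaluates:

```python
import numpy as np

def select_features(X, y, k):
    """Rank features by an absolute two-class t-like statistic; keep the top k."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    spread = X[y == 0].std(0) + X[y == 1].std(0) + 1e-9
    return np.argsort(-np.abs(m1 - m0) / spread)[:k]

def nearest_centroid_fit(X, y):
    """One centroid per class (labels 0 and 1)."""
    return np.vstack([X[y == c].mean(0) for c in (0, 1)])

def nearest_centroid_predict(centroids, X):
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d.argmin(1)

# Train on synthetic "cell line" data with 5 truly predictive genes
rng = np.random.default_rng(1)
n_cell_lines, n_genes = 60, 200
y_train = np.repeat([0, 1], n_cell_lines // 2)
X_train = rng.normal(size=(n_cell_lines, n_genes))
X_train[y_train == 1, :5] += 2.0
genes = select_features(X_train, y_train, k=3)       # small, assay-friendly panel
centroids = nearest_centroid_fit(X_train[:, genes], y_train)

# Apply the frozen panel + classifier to held-out "patient" samples
y_test = np.repeat([0, 1], 10)
X_test = rng.normal(size=(20, n_genes))
X_test[y_test == 1, :5] += 2.0
accuracy = (nearest_centroid_predict(centroids, X_test[:, genes]) == y_test).mean()
```

Keeping the panel to a handful of genes mirrors the abstract's point that small predictors transfer more readily to clinically validated platforms such as qRT-PCR.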

  19. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created ever more accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  20. Alimentary tract bacteria isolated and identified with API-20E and molecular cloning techniques from Australian tropical fruit flies, Bactrocera cacuminata and B. tryoni.

    Science.gov (United States)

    Thaochan, N; Drew, R A I; Hughes, J M; Vijaysegaran, S; Chinajariyawong, A

    2010-01-01

    Bacteria were isolated from the crop and midgut of field-collected Bactrocera cacuminata (Hering) and Bactrocera tryoni (Froggatt) (Diptera: Tephritidae). Two methods were used: firstly, isolation onto two types of bacteriological culture media (PYEA and TSA) and identification using the API-20E diagnostic kit, and secondly, analysis of samples using the 16S rRNA gene molecular diagnostic method. Using the API-20E method, 10 genera and 17 species of bacteria in the family Enterobacteriaceae were identified from cultures growing on the nutrient agar. The dominant species in both the crop and midgut were Citrobacter freundii, Enterobacter cloacae and Klebsiella oxytoca. Providencia rettgeri, Klebsiella pneumoniae ssp ozaenae and Serratia marcescens were isolated from B. tryoni only. Using the molecular cloning technique based on 16S rRNA gene sequences, five bacterial classes were diagnosed — Alpha-, Beta-, Gamma- and Delta-Proteobacteria and Firmicutes — including five families: Leuconostocaceae, Enterococcaceae, Acetobacteriaceae, Comamonadaceae and Enterobacteriaceae. The bacteria affiliated with Firmicutes were found mainly in the crop, while the Gammaproteobacteria, especially the family Enterobacteriaceae, were dominant in the midgut. This paper presents results from the first known application of molecular cloning techniques to study bacteria within tephritid species and the first record of Firmicutes bacteria in these flies.

  1. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Directory of Open Access Journals (Sweden)

    Richard E.A. van Emmerik

    2016-03-01

    Full Text Available Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
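
    The maximal Lyapunov exponent mentioned above can be illustrated on a simple system where it is computable directly. The sketch uses the logistic map as a stand-in; for experimental movement data one would instead apply an algorithm such as Rosenstein's to a reconstructed state space:

```python
import math

def logistic_lyapunov(r, x0=0.1, n_transient=1000, n_iter=50000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1 - 2*x)|."""
    x = x0
    for _ in range(n_transient):      # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n_iter

lam_periodic = logistic_lyapunov(3.2)   # stable period-2 orbit: negative
lam_chaotic = logistic_lyapunov(4.0)    # fully chaotic: positive, near ln 2
```

A negative exponent marks locally stable dynamics (perturbations shrink), a positive one marks local instability (perturbations grow), which is precisely the distinction the abstract draws between local dynamic stability and global stability against perturbations.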

  2. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which is principally concerned with time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which the time-dependent behavior is significant, it is desirable to incorporate a procedure that is workable with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures over long periods of time. Special considerations are crucial if the analysis is to be extended to the large strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  3. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from the single-nucleotide to the global level, depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments in epigenetic technologies are yielding promising results, measuring DNA methylation levels at single-base resolution and providing the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation, such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profiling by microarray- or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the method best fitting their nutritional research interests. Copyright © 2013 S. Karger AG, Basel.

  4. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture, even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate constant and thermodynamic data do not allow us to predict a priori the products of mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  5. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. It is complete, so that in its execution the transition from phase to phase is not noticed. In the aforementioned and described phases of the O'Brian shot put technique, a large gap and disconnection appear between the initial-position phase and the phase of overtaking the device, which in training methods and technique instruction in primary and secondary education, as well as for students and athletes who are beginners in shot put, represents a major problem in connecting, training and advancing the technique. Therefore, this work is aimed at facilitating the methods of training the shot put technique by extending the analysis from four to six phases, which have been described and include the complete O'Brian technique.

  6. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C [University of Washington, Seattle; Abgrall, N. [Lawrence Berkeley National Laboratory (LBNL); Arnquist, I. J. [Pacific Northwest National Laboratory (PNNL); Avignone, III, F. T. [University of South Carolina/Oak Ridge National Laboratory (ORNL); Baldenegro-Barrera, C. X. [Oak Ridge National Laboratory (ORNL); Barabash, A.S. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Bertrand, F. E. [Oak Ridge National Laboratory (ORNL); Bradley, A. W. [Lawrence Berkeley National Laboratory (LBNL); Brudanin, V. [Joint Institute for Nuclear Research, Dubna, Russia; Busch, M. [Duke University/TUNL; Buuck, M. [University of Washington, Seattle; Byram, D. [University of South Dakota; Caldwell, A. S. [South Dakota School of Mines and Technology; Chan, Y-D [Lawrence Berkeley National Laboratory (LBNL); Christofferson, C. D. [South Dakota School of Mines and Technology; Detwiler, J. A. [University of Washington, Seattle; Efremenko, Yu. [University of Tennessee, Knoxville (UTK); Ejiri, H. [Osaka University, Japan; Elliott, S. R. [Los Alamos National Laboratory (LANL); Galindo-Uribarri, A. [Oak Ridge National Laboratory (ORNL); Gilliss, T. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Giovanetti, G. K. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Goett, J [Los Alamos National Laboratory (LANL); Green, M. P. [Oak Ridge National Laboratory (ORNL); Gruszko, J [University of Washington, Seattle; Guinn, I S [University of Washington, Seattle; Guiseppe, V E [University of South Carolina, Columbia; Henning, R. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Hoppe, E.W. [Pacific Northwest National Laboratory (PNNL); Howard, S. [South Dakota School of Mines and Technology; Howe, M. A. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Jasinski, B R [University of South Dakota; Keeter, K.J. 
[Black Hills State University, Spearfish, South Dakota; Kidd, M. F. [Tennessee Technological University (TTU); Konovalov, S.I. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Kouzes, R. T. [Pacific Northwest National Laboratory (PNNL); LaFerriere, B. D. [Pacific Northwest National Laboratory (PNNL); Leon, J. [University of Washington, Seattle; MacMullin, J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Martin, R. D. [University of South Dakota; Meijer, S. J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Mertens, S. [Lawrence Berkeley National Laboratory (LBNL); Orrell, J. L. [Pacific Northwest National Laboratory (PNNL); O' Shaughnessy, C. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Poon, A.W.P. [Lawrence Berkeley National Laboratory (LBNL); Radford, D. C. [Oak Ridge National Laboratory (ORNL); Rager, J. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Rielage, K. [Los Alamos National Laboratory (LANL); Robertson, R.G.H. [University of Washington, Seattle; Romero-Romero, E. [University of Tennessee, Knoxville, (UTK)/Oak Ridge National Lab (ORNL); Shanks, B. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Shirchenko, M. [Joint Institute for Nuclear Research, Dubna, Russia; Snyder, N [University of South Dakota; Suriano, A. M. [South Dakota School of Mines and Technology; Tedeschi, D [University of South Carolina, Columbia; Trimble, J. E. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Varner, R. L. [Oak Ridge National Laboratory (ORNL); Vasilyev, S. [Joint Institute for Nuclear Research, Dubna, Russia; Vetter, K. [University of California/Lawrence Berkeley National Laboratory (LBNL); et al.

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in Ge-76. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  7. Evaluation of tritium analysis techniques for a continuous tritium monitor

    International Nuclear Information System (INIS)

    Fernandez, S.J.; Girton, R.C.

    1978-04-01

    Present methods for tritium monitoring are evaluated and a program is proposed to modify the existing methods or develop new instrumentation to establish a state-of-the-art monitoring capability for nuclear fuel reprocessing plants. The capabilities, advantages, and disadvantages of the most popular counting and separation techniques are described. The following criteria were used to evaluate present methods: specificity, selectivity, precision, insensitivity to gamma radiation, and economy. A novel approach is explored to continuously separate the tritium from a complex mixture of stack gases. This approach, based on the different permeabilities of the stack gas constituents, is integrated into a complete monitoring system designed to perform real-time tritium analysis. A schedule is presented for development and demonstration of the completed system

  8. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  9. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity on gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, according to the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been developed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes in electrophoretic gel supports and the variations of the method. The new advances of this method are basically focused on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they basically coincide in the study of matrix metalloproteases. The tendency is foreseen to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  10. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
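
    The convergent k-means step at the heart of this record can be sketched in a few lines. This is a generic Lloyd's-iteration illustration (with deterministic seeding for brevity, and invented data), not the composite NURE procedure:

    ```python
    def kmeans(points, k, iters=100):
        """Plain convergent (Lloyd's) k-means; points are equal-length feature vectors."""
        centroids = [list(p) for p in points[:k]]  # deterministic seeding for clarity
        clusters = []
        for _ in range(iters):
            # Assign each observation to its nearest centroid (squared Euclidean distance).
            clusters = [[] for _ in range(k)]
            for p in points:
                d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
                clusters[d.index(min(d))].append(p)
            # Recompute each centroid as the mean of its cluster; no change = converged.
            new = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centroids[i]
                   for i, cl in enumerate(clusters)]
            if new == centroids:
                break
            centroids = new
        return centroids, clusters
    ```

    For survey-scale data sets, a hierarchical pass over the k-means cluster centres (as the record describes) keeps the expensive step on a small number of representatives rather than all observations.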

  11. Use of nuclear techniques for coal analysis in exploration, mining and processing

    International Nuclear Information System (INIS)

    Clayton, C.G.; Wormald, M.R.

    1982-01-01

    Nuclear techniques have a long history of application in the coal industry, during exploration and especially during coal preparation, for the measurement of ash content. The preferred techniques are based on X- and gamma-ray scattering and borehole logging, and on-line equipment incorporating these techniques is now in worldwide routine use. However, gamma-ray techniques are mainly restricted to density measurement and X-ray techniques are principally used for ash determinations. They have a limited range, and when used on-line some size reduction of the coal is usually required and a full elemental analysis is not possible. In particular, X- and gamma-ray techniques are insensitive to the principal elements in the combustible component and to many of the important elements in the mineral fraction. Neutron techniques, on the other hand, have a range which is compatible with on-line requirements, and all elements in the combustible component and virtually all elements in the mineral component can be observed. A complete elemental analysis of coal then allows the ash content and the calorific value to be determined on-line. This paper surveys the various nuclear techniques now in use and gives particular attention to the present state of development of neutron methods and to their advantages and limitations. Although it is shown that considerable further development and operational experience are still required, equipment now being introduced has a performance which matches many of the identified requirements and an early improvement in specification can be anticipated

  12. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  13. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  14. Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics

    Science.gov (United States)

    Chan, Julia Y. K.; Bauer, Christopher F.

    2014-01-01

    The purpose of this study is to identify academically at-risk students in first-semester general chemistry using affective characteristics via cluster analysis. Through the clustering of six preselected affective variables, three distinct affective groups were identified: low (at-risk), medium, and high. Students in the low affective group…

  15. Network analysis of translocated Takahe populations to identify disease surveillance targets.

    Science.gov (United States)

    Grange, Zoë L; van Andel, Mary; French, Nigel P; Gartrell, Brett D

    2014-04-01

    Social network analysis is being increasingly used in epidemiology and disease modeling in humans, domestic animals, and wildlife. We investigated this tool in describing a translocation network (an arrangement that allows movement of animals between geographically isolated locations) used for the conservation of an endangered flightless rail, the Takahe (Porphyrio hochstetteri). We collated records of Takahe translocations within New Zealand and used social network principles to describe the connectivity of the translocation network. That is, networks were constructed and analyzed using adjacency matrices with values based on the tie weights between nodes. Five annual network matrices were created using the Takahe data set; each incremental year included the records of previous years. Weights of movements between connected locations were assigned by the number of Takahe moved. We calculated the number of nodes (i_total) and the number of ties (t_total) between the nodes. To quantify the small-world character of the networks, we compared the real networks to random graphs of equivalent size, weighting, and node strength. Descriptive analysis of cumulative annual Takahe movement networks involved determination of node-level characteristics, including centrality descriptors of relevance to disease modeling such as weighted measures of in degree (k_i^in), out degree (k_i^out), and betweenness (B_i). Key players were assigned according to the highest node measure of k_i^in, k_i^out, and B_i per network. Networks increased in size throughout the time frame considered. The network had some degree of small-world character. The nodes with the highest cumulative tie weights connecting them were the captive breeding center, the Murchison Mountains, and 2 offshore islands. The key player fluctuated between the captive breeding center and the Murchison Mountains. The cumulative networks identified the captive breeding center every year as the hub of the network until the final
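
    The weighted in-degree and out-degree measures this record uses to pick surveillance targets reduce to simple sums over the translocation records; a minimal sketch (node names and bird counts are invented, and betweenness is omitted for brevity):

    ```python
    def degree_centrality(moves):
        """moves: (source, destination, birds_moved) translocation records.
        Returns weighted out-degree (birds sent) and in-degree (birds received)."""
        k_out, k_in = {}, {}
        for src, dst, w in moves:
            k_out[src] = k_out.get(src, 0) + w
            k_in[dst] = k_in.get(dst, 0) + w
            k_out.setdefault(dst, 0)  # ensure every node appears in both maps
            k_in.setdefault(src, 0)
        return k_out, k_in

    # Invented example records (not the real Takahe data set).
    moves = [
        ("captive_centre", "island_A", 8),
        ("captive_centre", "island_B", 5),
        ("murchison", "captive_centre", 6),
        ("island_A", "murchison", 2),
    ]
    k_out, k_in = degree_centrality(moves)
    hub = max(k_out, key=k_out.get)  # candidate key player / surveillance target
    ```

    Nodes with high out-degree (here the hypothetical captive centre) are natural disease-surveillance targets, since an outbreak there can propagate to the most destinations.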

  16. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    2010-11-01

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  17. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs, and calculations such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized, in order to be able to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during the inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  18. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    Full Text Available A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short term (3 and 6 months) and the long term (12 and 24 months) SPI were estimated, and then, possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
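
    The ITA technique itself is graphically simple: split the series into two equal halves, sort each, and compare the paired order statistics against the 1:1 line. A minimal sketch (returning a single summary number rather than the plot):

    ```python
    def ita_trend(series):
        """Innovative Trend Analysis (Sen-type): sort the two halves of a series
        and pair their order statistics. Plotting the pairs against the 1:1 line
        reveals trends separately in low, medium and high values; here we return
        the mean vertical distance from that line (positive => increasing trend)."""
        n = len(series) // 2
        first, second = sorted(series[:n]), sorted(series[n:2 * n])
        return sum(y - x for x, y in zip(first, second)) / n
    ```

    In practice one inspects the scatter of (first-half, second-half) pairs, since the low, medium and high portions of the series can trend in different directions, which is exactly the property the record exploits for SPI values.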

  19. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    International Nuclear Information System (INIS)

    Artioli, G.

    2007-01-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail. (orig.)

  20. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    Science.gov (United States)

    Artioli, G.

    2007-12-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail.

  1. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, cokriging, which is an extension of kriging, is proposed to calculate the structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
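
    As an illustration of the kriging family of interpolators this record builds on, here is a minimal simple-kriging sketch (zero mean, Gaussian covariance, 1D, no gradient information), not the authors' cokriging formulation:

    ```python
    import math

    def solve(A, b):
        """Gaussian elimination with partial pivoting for small dense systems."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    def gauss_cov(h, ell=1.0):
        """Gaussian covariance model as a function of separation distance h."""
        return math.exp(-(h / ell) ** 2)

    def simple_krige(xs, ys, x0, ell=1.0):
        """Simple kriging predictor (known zero mean): weights w solve K w = k."""
        K = [[gauss_cov(abs(xi - xj), ell) for xj in xs] for xi in xs]
        k = [gauss_cov(abs(xi - x0), ell) for xi in xs]
        w = solve(K, k)
        return sum(wi * yi for wi, yi in zip(w, ys))
    ```

    Cokriging extends the covariance system with cross-covariances between function values and their gradients, so each sample contributes more information, which is the source of the efficiency gain the abstract reports.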

  2. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan

    2008-01-01

    The paper presents some results of a case study on 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was carried out in the framework of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches to identifying component ageing through operational data analysis. Engineering considerations are outside the scope of the present study
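
    One standard parametric trend test of the kind applied in such ageing studies is the Laplace test on recorded failure times (whether this exact statistic was among those the case study used is our assumption):

    ```python
    import math

    def laplace_u(times, total_time):
        """Laplace trend statistic for event times observed on (0, total_time].
        Under a constant-rate (no ageing) Poisson process U is approximately
        standard normal; U >> 0 means events cluster late in the window,
        i.e. an increasing failure intensity (ageing)."""
        n = len(times)
        return (sum(times) / n - total_time / 2) / (total_time * math.sqrt(1 / (12 * n)))
    ```

    For example, failures recorded late in the observation window give a positive U, while evenly spread failures give U near zero; comparing U against normal quantiles turns visual evaluation into a formal hypothesis test.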

  3. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. 
The results showed that

  4. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks individually process each hemodynamic parameter and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used in the data collection from 50 subjects; (2) the acquired position and amplitude of onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluate the trained models, using the majority-voting system, comparatively to the respective calculated Augmentation Index (AIx). Classification algorithms proved to be efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels, retrieved from the multi-parametric approach, and positive AIx values was verified. This approach allows for designing new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving arterial pulse understanding, especially when compared to traditional single-parameter analysis, where the failure of one parameter measurement component, such as Pi, can jeopardize the whole evaluation.
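
    The majority-voting step in (5) is the simplest ensemble combiner; a generic sketch (labels and classifier outputs invented for illustration, not the study's data):

    ```python
    from collections import Counter

    def majority_vote(model_predictions):
        """model_predictions: one list of labels per classifier, aligned by sample.
        Returns the per-sample majority label (ties resolved by first vote seen)."""
        return [Counter(votes).most_common(1)[0][0]
                for votes in zip(*model_predictions)]
    ```

    With four base classifiers as in the record (Random Forest, BayesNet, J48, RIPPER), each sample's final risk label is simply the most frequent of its four predicted labels.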

  5. Neutron activation analysis techniques for identifying elemental status in Alzheimer's disease

    International Nuclear Information System (INIS)

    Ward, N.I.

    1987-01-01

    Brain tissue (hippocampus and cerebral cortex) from Alzheimer's disease and control individuals sampled from Eastern Canada and the United Kingdom was analyzed for Ag, Al, As, B, Br, Ca, Cd, Co, Cr, Cs, Cu, Fe, Hg, I, K, La, Mg, Mn, Mo, Ni, Rb, S, Sb, Sc, Se, Si, Sn, Sr, Ti, V and Zn. NAA (thermal and prompt gamma-ray) methods were used. Highly significant differences (probability less than 0.005) between Alzheimer's disease and control individuals were shown for both study areas. No statistical evidence of aluminium accumulation with age was noted. Possible zinc deficiency was observed. (author) 21 refs.; 5 tables

  6. Large-scale association analysis identifies 13 new susceptibility loci for coronary artery disease

    NARCIS (Netherlands)

    Schunkert, Heribert; König, Inke R.; Kathiresan, Sekar; Reilly, Muredach P.; Assimes, Themistocles L.; Holm, Hilma; Preuss, Michael; Stewart, Alexandre F. R.; Barbalic, Maja; Gieger, Christian; Absher, Devin; Aherrahrou, Zouhair; Allayee, Hooman; Altshuler, David; Anand, Sonia S.; Andersen, Karl; Anderson, Jeffrey L.; Ardissino, Diego; Ball, Stephen G.; Balmforth, Anthony J.; Barnes, Timothy A.; Becker, Diane M.; Becker, Lewis C.; Berger, Klaus; Bis, Joshua C.; Boekholdt, S. Matthijs; Boerwinkle, Eric; Braund, Peter S.; Brown, Morris J.; Burnett, Mary Susan; Buysschaert, Ian; Carlquist, John F.; Chen, Li; Cichon, Sven; Codd, Veryan; Davies, Robert W.; Dedoussis, George; Dehghan, Abbas; Demissie, Serkalem; Devaney, Joseph M.; Diemert, Patrick; Do, Ron; Doering, Angela; Eifert, Sandra; Mokhtari, Nour Eddine El; Ellis, Stephen G.; Elosua, Roberto; Engert, James C.; Epstein, Stephen E.; de Faire, Ulf; Fischer, Marcus; Folsom, Aaron R.; Freyer, Jennifer; Gigante, Bruna; Girelli, Domenico; Gretarsdottir, Solveig; Gudnason, Vilmundur; Gulcher, Jeffrey R.; Halperin, Eran; Hammond, Naomi; Hazen, Stanley L.; Hofman, Albert; Horne, Benjamin D.; Illig, Thomas; Iribarren, Carlos; Jones, Gregory T.; Jukema, J. Wouter; Kaiser, Michael A.; Kaplan, Lee M.; Kastelein, John J. P.; Khaw, Kay-Tee; Knowles, Joshua W.; Kolovou, Genovefa; Kong, Augustine; Laaksonen, Reijo; Lambrechts, Diether; Leander, Karin; Lettre, Guillaume; Li, Mingyao; Lieb, Wolfgang; Loley, Christina; Lotery, Andrew J.; Mannucci, Pier M.; Maouche, Seraya; Martinelli, Nicola; McKeown, Pascal P.; Meisinger, Christa; Meitinger, Thomas; Melander, Olle; Merlini, Pier Angelica; Mooser, Vincent; Morgan, Thomas; Mühleisen, Thomas W.; Muhlestein, Joseph B.; Münzel, Thomas; Musunuru, Kiran; Nahrstaedt, Janja; Nelson, Christopher P.; Nöthen, Markus M.; Olivieri, Oliviero; Patel, Riyaz S.; Patterson, Chris C.; Peters, Annette; Peyvandi, Flora; Qu, Liming; Quyyumi, Arshed A.; Rader, Daniel J.; Rallidis, Loukianos S.; Rice, Catherine; Rosendaal, Frits R.; Rubin, Diana; Salomaa, Veikko; Sampietro, M. Lourdes; Sandhu, Manj S.; Schadt, Eric; Schäfer, Arne; Schillert, Arne; Schreiber, Stefan; Schrezenmeir, Jürgen; Schwartz, Stephen M.; Siscovick, David S.; Sivananthan, Mohan; Sivapalaratnam, Suthesh; Smith, Albert; Smith, Tamara B.; Snoep, Jaapjan D.; Soranzo, Nicole; Spertus, John A.; Stark, Klaus; Stirrups, Kathy; Stoll, Monika; Tang, W. H. Wilson; Tennstedt, Stephanie; Thorgeirsson, Gudmundur; Thorleifsson, Gudmar; Tomaszewski, Maciej; Uitterlinden, Andre G.; van Rij, Andre M.; Voight, Benjamin F.; Wareham, Nick J.; Wells, George A.; Wichmann, H.-Erich; Wild, Philipp S.; Willenborg, Christina; Witteman, Jaqueline C. M.; Wright, Benjamin J.; Ye, Shu; Zeller, Tanja; Ziegler, Andreas; Cambien, Francois; Goodall, Alison H.; Cupples, L. Adrienne; Quertermous, Thomas; März, Winfried; Hengstenberg, Christian; Blankenberg, Stefan; Ouwehand, Willem H.; Hall, Alistair S.; Deloukas, Panos; Thompson, John R.; Stefansson, Kari; Roberts, Robert; Thorsteinsdottir, Unnur; O'Donnell, Christopher J.; McPherson, Ruth; Erdmann, Jeanette; Samani, Nilesh J.

    2011-01-01

    We performed a meta-analysis of 14 genome-wide association studies of coronary artery disease (CAD) comprising 22,233 individuals with CAD (cases) and 64,762 controls of European descent, followed by genotyping of top association signals in 56,682 additional individuals. This analysis identified 13 new susceptibility loci for CAD.

  7. A dynamic mechanical analysis technique for porous media.

    Science.gov (United States)

    Pattison, Adam Jeffry; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-02-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite-element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a nonlinear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1-14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
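
    The viscoelastic quantities that conventional DMA reports follow directly from the stress/strain amplitude ratio and the measured phase lag; a minimal sketch of those textbook relations (not the authors' poroelastic finite-element inversion):

    ```python
    import math

    def complex_modulus(stress_amp, strain_amp, delta):
        """Viscoelastic DMA relations: |E*| = stress/strain amplitude ratio,
        storage modulus E' = |E*| cos(delta), loss modulus E'' = |E*| sin(delta),
        where delta is the measured stress-strain phase lag in radians."""
        e_star = stress_amp / strain_amp
        return e_star * math.cos(delta), e_star * math.sin(delta)
    ```

    A purely elastic sample has delta = 0 (all storage, no loss); the tangent of delta, E''/E', is the damping factor usually plotted against frequency. The record's point is that these analytical formulas break down for soft poroelastic media, where inertial and fluid-solid effects require the finite-element treatment instead.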

  8. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR), measuring target motion to high precision, have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (≈0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF related targets (polycrystalline diamond, Be), and in Si and Al.
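
    The fringe-to-velocity conversion underlying any VISAR analysis, 1d or 2d, is the standard velocity-per-fringe relation v = F·λ/(2τ(1+δ)); a sketch with illustrative numbers (not values from this record):

    ```python
    def visar_velocity(fringe_shift, wavelength, tau, delta=0.0):
        """Standard VISAR velocity-per-fringe relation:
        v = F * lambda / (2 * tau * (1 + delta)),
        where F is the fringe shift, lambda the laser wavelength, tau the
        interferometer delay, and delta the etalon dispersion correction
        (taken as 0 in this simplified sketch)."""
        return fringe_shift * wavelength / (2 * tau * (1 + delta))

    # Illustrative numbers: 532 nm laser, 0.1 ns delay, one full fringe.
    v = visar_velocity(1.0, 532e-9, 0.1e-9)  # velocity in m/s
    ```

    In the 2d instrument the same relation is applied per pixel, turning the recorded fringe-shift image into the velocity map described above.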

  9. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques, and then examines the model order reduction approach and its significance. Traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. Bond Graphs are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a technique for generating a genetic design from the tree-structured transfer function obtained from a Bond Graph. The work combines Bond Graphs for model representation with Genetic Programming for exploring the design space; the tree-structured transfer function results from replacing typical Bond Graph elements with their impedance equivalents, specifying impedance laws for the Bond Graph multiports. The tree structure thus obtained from the Bond Graph is used to generate the genetic tree. Application studies identify key issues for advancing this approach towards becoming an effective and efficient design tool for synthesizing electrical system designs. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function are analyzed with both the conventional and the Bond Graph method, and an approach to model order reduction is then examined. The suggested algorithm and other known modern model order reduction techniques are applied to an 11th-order high-pass filter [1], with different approaches. The model order reduction technique developed in this paper has the least reduction errors, and the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and by the Bond Graph method are compared and
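    Balanced truncation is one standard model-order-reduction technique of the kind benchmarked here; as an illustration, a minimal square-root implementation for a stable state-space model (a generic sketch, not the paper's algorithm):

```python
import numpy as np
from scipy import linalg

def balanced_truncation(A, B, C, r):
    """Reduce a stable LTI system (A, B, C) to order r by balanced truncation."""
    # controllability and observability Gramians
    Wc = linalg.solve_continuous_lyapunov(A, -B @ B.T)
    Wo = linalg.solve_continuous_lyapunov(A.T, -C.T @ C)
    # square-root balancing: Cholesky factors + SVD give the Hankel values
    Lc = linalg.cholesky(Wc, lower=True)
    Lo = linalg.cholesky(Wo, lower=True)
    U, s, Vt = linalg.svd(Lo.T @ Lc)
    S = np.diag(s ** -0.5)
    T = Lc @ Vt.T @ S            # balancing transformation
    Tinv = S @ U.T @ Lo.T
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r], Cb[:, :r]

# toy stable 4th-order system reduced to 2nd order
A = np.diag([-1.0, -2.0, -3.0, -4.0])
B = np.ones((4, 1))
C = np.ones((1, 4))
Ar, Br, Cr = balanced_truncation(A, B, C, 2)
```

Truncating the balanced realization keeps the states with the largest Hankel singular values, which is why the reduced model retains the dominant input-output behavior.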

  10. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  11. Transcriptome Analysis of Syringa oblata Lindl. Inflorescence Identifies Genes Associated with Pigment Biosynthesis and Scent Metabolism.

    Directory of Open Access Journals (Sweden)

    Jian Zheng

    Full Text Available Syringa oblata Lindl. is a woody ornamental plant with high economic value and characteristics that include early flowering, multiple flower colors, and strong fragrance. Despite a long history of cultivation, the genetics and molecular biology of S. oblata are poorly understood. Transcriptome and expression profiling data are needed to identify genes and to better understand the biological mechanisms of floral pigments and scents in this species. Nine cDNA libraries were obtained from three replicates of three developmental stages: inflorescence with enlarged flower buds not protruded, inflorescence with corolla lobes not displayed, and inflorescence with flowers fully opened and emitting strong fragrance. Using the Illumina RNA-Seq technique, 319,425,972 clean reads were obtained and were assembled into 104,691 final unigenes (average length of 853 bp), 41.75% of which were annotated in the NCBI non-redundant protein database. Among the annotated unigenes, 36,967 were assigned to gene ontology categories and 19,956 were assigned to eukaryotic orthologous groups. Using the Kyoto Encyclopedia of Genes and Genomes pathway database, 12,388 unigenes were sorted into 286 pathways. Based on these transcriptomic data, we obtained a large number of candidate genes that were differentially expressed at different flower stages and that were related to floral pigment biosynthesis and fragrance metabolism. This comprehensive transcriptomic analysis provides fundamental information on the genes and pathways involved in flower secondary metabolism and development in S. oblata, providing a useful database for further research on S. oblata and other plants of genus Syringa.

  12. Potential Coastal Pumped Hydroelectric Energy Storage Locations Identified using GIS-based Topographic Analysis

    Science.gov (United States)

    Parsons, R.; Barnhart, C. J.; Benson, S. M.

    2013-12-01

    Large-scale electrical energy storage could accommodate variable, weather-dependent energy resources such as wind and solar. Pumped hydroelectric energy storage (PHS) and compressed air energy storage (CAES) have life-cycle energy and financial costs that are an order of magnitude lower than those of conventional electrochemical storage technologies. However, PHS and CAES storage technologies require specific geologic conditions. Conventional PHS requires an upper and a lower reservoir separated by at least 100 m of head, but no more than 10 km in horizontal distance. Conventional PHS also impacts fresh water supplies, riparian ecosystems, and hydrologic environments. A PHS facility that uses the ocean as the lower reservoir benefits from a smaller footprint, minimal freshwater impact, and the potential to be located near offshore wind resources and population centers. Although technologically nascent, one coastal PHS facility exists today. The storage potential for coastal PHS is unknown. Can coastal PHS play a significant role in augmenting future power grids with a high fraction of renewable energy supply? In this study we employ GIS-based topographic analysis to quantify the coastal PHS potential of several geographic locations, including California, Chile and Peru. We developed automated techniques that seek local topographic minima in 90 m spatial resolution Shuttle Radar Topography Mission (SRTM) digital elevation models (DEMs) that satisfy the following criteria conducive to PHS: within 10 km of the sea; minimum elevation 150 m; maximum elevation 1000 m. Preliminary results suggest the global potential for coastal PHS could be very significant. For example, in northern Chile we have identified over 60 locations that satisfy the above criteria. Two of these locations could store over 10 million cubic meters of water, or several GWh of energy. We plan to report a global database of candidate coastal PHS locations and to estimate their energy storage capacity.
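    The screening criteria above reduce to a simple raster mask over the DEM; a sketch of the idea, using a Euclidean distance transform for distance-to-sea and the standard gravitational energy estimate E = ρghV (the ocean mask and round-trip efficiency below are assumptions, not from the abstract):

```python
import numpy as np
from scipy import ndimage

def screen_phs_sites(dem, cell_size_m=90.0,
                     min_elev=150.0, max_elev=1000.0, max_dist_m=10_000.0):
    """Flag DEM cells meeting the coastal-PHS criteria listed above."""
    sea = dem <= 0                                   # crude ocean mask
    # distance (m) from each land cell to the nearest sea cell
    dist_to_sea = ndimage.distance_transform_edt(~sea) * cell_size_m
    return (dem >= min_elev) & (dem <= max_elev) & (dist_to_sea <= max_dist_m)

def stored_energy_gwh(head_m, volume_m3, rho=1025.0, g=9.81, efficiency=0.8):
    """Recoverable energy E = rho*g*h*V in joules, converted to GWh."""
    return rho * g * head_m * volume_m3 * efficiency / 3.6e12

# tiny toy DEM: sea in the first column, one cell too high, two candidates
dem = np.array([[0.0, 100.0, 200.0],
                [0.0, 500.0, 1200.0]])
mask = screen_phs_sites(dem)
```

At 500 m of head, 10 million cubic meters of seawater stores roughly 11 GWh at 80% round-trip efficiency, consistent with the "several GWh" figure quoted above.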

  13. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    Science.gov (United States)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps and incidents are attributed to human error. As part of safety within space exploration ground processing operations, the underlying contributors to and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as analysis tools to identify contributing factors, their impact on human error events, and to predict the human error probabilities (HEPs) of future occurrences. The methodology was applied retrospectively to six (6) NASA ground processing operations scenarios and thirty (30) years of launch-vehicle-related mishap data. This modifiable framework can be used and followed by other space and similarly complex operations.
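    The HEP prediction step in HEART multiplies a generic-task nominal probability by a factor for each applicable error-producing condition (EPC). A sketch of the published HEART relation, with invented numbers (none are from this study):

```python
def heart_hep(nominal_hep, epcs):
    """HEART assessed human error probability.

    epcs: list of (max_effect, assessed_proportion) pairs, where
    max_effect is the EPC multiplier from the HEART tables and the
    assessed proportion (0..1) is the analyst's judgement of how much
    of that effect applies to the task.
    """
    hep = nominal_hep
    for max_effect, proportion in epcs:
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)   # a probability cannot exceed 1

# illustrative only: generic-task nominal HEP of 0.003, one EPC with a
# x11 maximum effect judged to apply at 40% strength
p = heart_hep(0.003, [(11.0, 0.4)])
```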

  14. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge of the constituting material's origin, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Although the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  15. Trends in grazing emission x-ray analysis techniques

    International Nuclear Information System (INIS)

    Grieken, R. van; Tsuji, K.; Injuk, J.

    2000-01-01

    then, the detection limits imposed by the semiconductor industry roadmap can probably not be obtained by tube-excited GEXRF. The perspectives for tube-excited GE-XRF are thus rather poor. Future developments imply the combination of GEXRF with synchrotron radiation excitation. Grazing-emission particle-induced X-ray emission (GE-PIXE) suffers from similar quantification problems for material deposited on a carrier, but it makes PIXE a surface-sensitive technique, whereas normally the protons penetrate some tens of μm into the sample. Similarly, grazing-emission electron probe micro-analysis (GE-EPMA) allows selective analysis of particles on a flat carrier, provides surface sensitivities in the nm rather than the μm range, and yields, in principle, a spatial resolution for chemical analysis similar to the size of the impinging electron beam, rather than of the electron-excited volume. Both GE-PIXE and GE-EPMA need to be explored more fully in the near future. (author)

  16. Identifying influential individuals on intensive care units: using cluster analysis to explore culture.

    Science.gov (United States)

    Fong, Allan; Clark, Lindsey; Cheng, Tianyi; Franklin, Ella; Fernandez, Nicole; Ratwani, Raj; Parker, Sarah Henrickson

    2017-07-01

    The objective of this paper is to identify attribute patterns of influential individuals in intensive care units using unsupervised cluster analysis. Despite the acknowledgement that the culture of an organisation is critical to improving patient safety, specific methods to shift culture have not been explicitly identified. A social network analysis survey was conducted, and a total of 100 surveys were gathered. Unsupervised cluster analysis was used to group individuals with similar dimensions, highlighting three general genres of influencers: well-rounded, knowledge and relational. Culture is created locally by individual influencers. Cluster analysis is an effective way to identify common characteristics among members of an intensive care unit team that are noted as highly influential by their peers. To change culture, identifying and then integrating the influencers in intervention development and dissemination may create more sustainable and effective culture change. Additional studies are ongoing to test the effectiveness of utilising these influencers to disseminate patient safety interventions. This study offers an approach that can be helpful in both identifying and understanding influential team members and may be an important aspect of developing methods to change organisational culture. © 2017 John Wiley & Sons Ltd.
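    The grouping step can be illustrated with a minimal k-means clustering of respondents' attribute vectors (a generic sketch; the paper does not state which clustering algorithm was used, and the data here are invented):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means: group respondents with similar attribute profiles."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each respondent to the nearest cluster centre
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# two obviously separated attribute profiles
X = np.vstack([np.zeros((5, 2)), 10.0 * np.ones((5, 2))])
labels, centers = kmeans(X, 2)
```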

  17. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    Directory of Open Access Journals (Sweden)

    Ricardo E. Izzo

    2011-06-01

    Full Text Available Basketball is a complex sport which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. It is therefore important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the types of pass used, the most frequent is the two-handed chest pass, with a frequency of 39.9%. This is followed, in terms of frequency, by one-handed passes (the baseball pass, with 20.9%), by the two-handed over-the-head pass, with 18.2%, and finally by one- or two-handed indirect (bounce) passes, with 11.2% and 9.8%. Considering the most used pass in basketball from the biomechanical point of view, the muscles involved in the correct movement include all the muscles of the upper extremity, plus the shoulder muscles as well as the body fixators (abdominals, hip flexors, knee extensors, and dorsal flexors of the foot). The technical and conditional analysis considers the throwing speed, the throw height and the air resistance. In conclusion, the aim of this study is to give some guidelines to improve the mechanical execution of the movements in training, without neglecting the importance of the harmony of the movements themselves.

  18. Analysis of Program Obfuscation Schemes with Variable Encoding Technique

    Science.gov (United States)

    Fukushima, Kazuhide; Kiyomoto, Shinsaku; Tanaka, Toshiaki; Sakurai, Kouichi

    Program analysis techniques have improved steadily over the past several decades, and software obfuscation schemes have come to be used in many commercial programs. A software obfuscation scheme transforms an original program or a binary file into an obfuscated program that is more complicated and difficult to analyze, while preserving its functionality. However, the security of obfuscation schemes has not been properly evaluated. In this paper, we analyze obfuscation schemes in order to clarify the advantages of our scheme, the XOR-encoding scheme. First, we more clearly define five types of attack models that we defined previously, and define quantitative resistance to these attacks. Then, we compare the security, functionality and efficiency of three obfuscation schemes with encoding variables: (1) Sato et al.'s scheme with linear transformation, (2) our previous scheme with affine transformation, and (3) the XOR-encoding scheme. We show that the XOR-encoding scheme is superior with regard to the following two points: (1) the XOR-encoding scheme is more secure against a data-dependency attack and a brute force attack than our previous scheme, and is as secure against an information-collecting attack and an inverse transformation attack as our previous scheme, (2) the XOR-encoding scheme does not restrict the calculable ranges of programs and the loss of efficiency is less than in our previous scheme.
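    The variable-encoding idea itself is easy to sketch: every value stored in a program variable is XOR-masked with a key, and each use decodes, operates, and re-encodes. Because XOR is its own inverse, the encoding does not restrict the calculable range. A toy illustration (not the authors' implementation; the key is arbitrary):

```python
KEY = 0xC0FFEE          # hypothetical fixed mask

def enc(x):
    """Encode a value before it is stored in a program variable."""
    return x ^ KEY

def dec(x):
    """Decode a stored value before using it (XOR is self-inverse)."""
    return x ^ KEY

# original statement:   total = a + b
# obfuscated version:   every load is decoded, every store re-encoded
a_enc, b_enc = enc(7), enc(35)
total_enc = enc(dec(a_enc) + dec(b_enc))
```

In a real obfuscator the decode/operate/encode sequence is inlined and interleaved with other transformations so the plaintext values never rest in memory in obvious form.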

  19. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed
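    In the fragility-analysis method, the seismic margin of an element is commonly summarized as a HCLPF (High Confidence of Low Probability of Failure) capacity derived from the median capacity and the lognormal randomness and uncertainty variabilities. A sketch of that standard relation, with illustrative numbers:

```python
import math

def hclpf(median_capacity_g, beta_r, beta_u):
    """HCLPF capacity (in g) in the fragility-analysis convention:

        HCLPF = Am * exp(-1.645 * (beta_r + beta_u))

    i.e. roughly 1% conditional probability of failure at 95% confidence,
    with Am the median ground-acceleration capacity and beta_r / beta_u
    the lognormal standard deviations for randomness and uncertainty.
    """
    return median_capacity_g * math.exp(-1.645 * (beta_r + beta_u))

# illustrative: Am = 1.2 g, beta_r = 0.30, beta_u = 0.25
c = hclpf(1.2, 0.30, 0.25)
```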

  20. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions, and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. The correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide a new metric for the failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
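    The consistency metric amounts to correlating density readings across conditions; a minimal sketch using the Pearson correlation coefficient (density values invented for illustration):

```python
import numpy as np

def consistency(density_run_a, density_run_b):
    """Pearson correlation between two ink-density traces; values near 1
    indicate the blanket transfers ink consistently across the two runs."""
    return np.corrcoef(density_run_a, density_run_b)[0, 1]

# hypothetical density readings before smash and after recovery
before_smash = [1.42, 1.38, 1.40, 1.41, 1.39]
after_recovery = [1.40, 1.36, 1.39, 1.40, 1.37]
r = consistency(before_smash, after_recovery)
```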

  1. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  2. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines
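    Quantification in a software FTA combines basic-event probabilities through the tree's AND/OR gates, assuming independent events. A toy sketch with invented probabilities (not from the report):

```python
def p_and(*ps):
    """AND gate: all independent basic events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """OR gate: 1 minus the probability that no event occurs."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: protection function fails if
# (spec defect AND review miss) OR (coding defect AND test miss).
top = p_or(p_and(1e-3, 0.1), p_and(5e-3, 0.05))
```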

  3. Electroencephalographic Data Analysis With Visibility Graph Technique for Quantitative Assessment of Brain Dysfunction.

    Science.gov (United States)

    Bhaduri, Susmita; Ghosh, Dipak

    2015-07-01

    Usual techniques for electroencephalographic (EEG) data analysis lack some of the important properties essential for quantitative assessment of the progress of dysfunction of the human brain. EEG data are essentially nonlinear, and this nonlinear time series has been identified as multifractal in nature, so rigorous techniques are needed for its analysis. In this article, we present the visibility graph as a recent, rigorous technique that can assess the degree of multifractality accurately and reliably. Moreover, this technique has been found to give reliable results with test data of comparatively short length. In this work, the visibility graph algorithm has been used to map a time series (EEG signals) to a graph in order to study the complexity and fractality of the time series. The power of scale-freeness of the visibility graph has been used as an effective method for measuring fractality in the EEG signal. The scale-freeness of the visibility graph has also been observed after averaging statistically independent samples of the signal. Scale-freeness of the visibility graph has been calculated for 5 sets of EEG data patterns varying from normal (eyes closed) to epileptic. The change in the values is analyzed further, and it has been observed that the value reduces uniformly from normal (eyes closed) to epileptic. © EEG and Clinical Neuroscience Society (ECNS) 2014.
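    The natural visibility graph mapping is simple to state: two samples are linked if the straight line between them passes above every intermediate sample. A minimal sketch of the algorithm and the resulting degree sequence:

```python
def visibility_graph(series):
    """Natural visibility graph: samples i and j are linked if every
    intermediate sample lies strictly below the line joining them."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j]
                + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j))
            if visible:
                edges.add((i, j))
    return edges

def degrees(edges, n):
    """Node degrees, whose distribution is tested for scale-freeness."""
    d = [0] * n
    for i, j in edges:
        d[i] += 1
        d[j] += 1
    return d

# toy series: peaks "see" far, valleys are screened by their neighbours
series = [3, 1, 2, 1, 4]
deg = degrees(visibility_graph(series), len(series))
```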

  4. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
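    Of the MVA techniques named, PCA is the simplest to sketch: scores, loadings, and explained variance follow from the SVD of the mean-centred spectra (a generic sketch, not the authors' code):

```python
import numpy as np

def pca(spectra, n_components):
    """Scores, loadings and explained variance via SVD of centred spectra."""
    Xc = spectra - spectra.mean(axis=0)          # rows = samples (spectra)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    loadings = Vt[:n_components]
    explained = s[:n_components] ** 2 / (s ** 2).sum()
    return scores, loadings, explained

# rank-1 toy "spectra": a single component explains all the variance
t = np.array([1.0, 2.0, 3.0, 4.0])
X = np.outer(t, np.array([0.5, 0.2, 0.3]))
scores, loadings, explained = pca(X, 2)
```

Plotting the first few score columns against each other is the usual way such rock spectra are visually separated by type.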

  5. Identifying Effective Spelling Interventions Using a Brief Experimental Analysis and Extended Analysis

    Science.gov (United States)

    McCurdy, Merilee; Clure, Lynne F.; Bleck, Amanda A.; Schmitz, Stephanie L.

    2016-01-01

    Spelling is an important skill that is crucial to effective written communication. In this study, brief experimental analysis procedures were used to examine spelling instruction strategies (e.g., whole word correction; word study strategy; positive practice; and cover, copy, and compare) for four students. In addition, an extended analysis was…

  6. COMPARISON AND ANALYSIS OF VARIOUS HISTOGRAM EQUALIZATION TECHNIQUES

    OpenAIRE

    MADKI.M.R; RUBINA KHAN

    2012-01-01

    The intensity histogram provides information that can be used for contrast enhancement. However, plain histogram equalization can flatten the histogram over fewer levels than the total number available, which can deteriorate the image. This problem can be overcome by various techniques. This paper gives a comparative analysis of the Bi-Histogram Equalization, Recursive Mean-Separate Histogram Equalization, Multipeak Histogram Equalization and Brightness Preserving Dynamic Histogram Equalization techniques by using these techniqu...
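    Plain global histogram equalization, which all of the compared variants refine, can be sketched as a cumulative-histogram lookup table:

```python
import numpy as np

def equalize(img):
    """Global histogram equalization for an 8-bit greyscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[np.nonzero(cdf)][0]        # first occupied grey level
    n = img.size
    lut = np.clip(np.round((cdf - cdf_min) / max(n - cdf_min, 1) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]

# two occupied grey levels get stretched to the full 0..255 range
img = np.array([[50] * 4] * 2 + [[51] * 4] * 2, dtype=np.uint8)
out = equalize(img)
```

The variants compared in the paper split or constrain this lookup table (e.g. per sub-histogram around the mean) to preserve mean brightness.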

  7. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    Energy Technology Data Exchange (ETDEWEB)

    Villavicencio, A.L.C.H.; Mancini-Filho, J.; Delincee, H

    1998-06-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macacar, were irradiated using a 60Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans as compared to non-irradiated beans. Secondly, DNA fragmentation was studied using a microgel electrophoresis. Irradiated cells produced typical comets with DNA fragments migrating towards the anode. DNA of non-irradiated cells exhibited a limited migration. Thirdly, electron spin resonance for detection of cellulose radicals was tested, since it was expected that these free radicals are quite stable in solid and dry foods. However, only in beans irradiated with 10 kGy a small signal could be detected. Fourthly, thermoluminescence, a method to analyze mineral debris adhering to food, turned out to be a good choice to detect irradiation effects in beans, even after 6 months of storage. The results indicate that three of these four techniques proposed, can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy)

  8. Application of different techniques to identify the effects of irradiation on Brazilian beans after six months storage

    International Nuclear Information System (INIS)

    Villavicencio, A.L.C.H.; Mancini-Filho, J.; Delincee, H.

    1998-01-01

    Four different techniques to detect the effect of irradiation in beans were investigated. Two types of Brazilian beans, Phaseolus vulgaris L., var. carioca and Vigna unguiculata (L.) Walp, var. macacar, were irradiated using a 60 Co source with doses ranging from 0, 1.0 to 10.0 kGy. After 6 months storage at ambient temperature the detection tests were carried out. Firstly, germination tests showed markedly reduced root growth and almost totally retarded shoot elongation of irradiated beans as compared to non-irradiated beans. Secondly, DNA fragmentation was studied using a microgel electrophoresis. Irradiated cells produced typical comets with DNA fragments migrating towards the anode. DNA of non-irradiated cells exhibited a limited migration. Thirdly, electron spin resonance for detection of cellulose radicals was tested, since it was expected that these free radicals are quite stable in solid and dry foods. However, only in beans irradiated with 10 kGy a small signal could be detected. Fourthly, thermoluminescence, a method to analyze mineral debris adhering to food, turned out to be a good choice to detect irradiation effects in beans, even after 6 months of storage. The results indicate that three of these four techniques proposed, can be used to detect the effect of irradiation in these two varieties of Brazilian beans at a dose level useful for insect disinfestation (1 kGy)

  9. Comparative analysis of Salmonella genomes identifies a metabolic network for escalating growth in the inflamed gut.

    Science.gov (United States)

    Nuccio, Sean-Paul; Bäumler, Andreas J

    2014-03-18

    The Salmonella genus comprises a group of pathogens associated with illnesses ranging from gastroenteritis to typhoid fever. We performed an in silico analysis of comparatively reannotated Salmonella genomes to identify genomic signatures indicative of disease potential. By removing numerous annotation inconsistencies and inaccuracies, the process of reannotation identified a network of 469 genes involved in central anaerobic metabolism, which was intact in genomes of gastrointestinal pathogens but degrading in genomes of extraintestinal pathogens. This large network contained pathways that enable gastrointestinal pathogens to utilize inflammation-derived nutrients as well as many of the biochemical reactions used for the enrichment and biochemical discrimination of Salmonella serovars. Thus, comparative genome analysis identifies a metabolic network that provides clues about the strategies for nutrient acquisition and utilization that are characteristic of gastrointestinal pathogens. IMPORTANCE While some Salmonella serovars cause infections that remain localized to the gut, others disseminate throughout the body. Here, we compared Salmonella genomes to identify characteristics that distinguish gastrointestinal from extraintestinal pathogens. We identified a large metabolic network that is functional in gastrointestinal pathogens but decaying in extraintestinal pathogens. While taxonomists have used traits from this network empirically for many decades for the enrichment and biochemical discrimination of Salmonella serovars, our findings suggest that it is part of a "business plan" for growth in the inflamed gastrointestinal tract. By identifying a large metabolic network characteristic of Salmonella serovars associated with gastroenteritis, our in silico analysis provides a blueprint for potential strategies to utilize inflammation-derived nutrients and edge out competing gut microbes.

  10. A parameter estimation and identifiability analysis methodology applied to a street canyon air pollution model

    DEFF Research Database (Denmark)

    Ottosen, T. B.; Ketzel, Matthias; Skov, H.

    2016-01-01

    Mathematical models are increasingly used in environmental science, thus increasing the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model – the Operational Street...... of the identifiability analysis showed that some model parameters were significantly more sensitive than others. The application of the determined optimal parameter values was shown to successfully equilibrate the model biases among the individual streets and species. It was also shown that the frequentist approach...
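A minimal sketch of the local sensitivity step of such an identifiability analysis: parameters whose normalized sensitivity is near zero barely move the output and are therefore poorly identifiable from data. The box-model formula and parameter values are invented stand-ins, not the actual OSPM formulation:

```python
def concentration(Q, u, W, H):
    """Toy box model: emission rate Q diluted by wind speed u across a
    street canyon of width W and height H (illustrative only)."""
    return Q / (u * W * H)

def normalized_sensitivity(f, params, name, eps=1e-6):
    """Relative change in model output per relative change in one parameter."""
    base = f(**params)
    bumped = dict(params)
    bumped[name] *= 1 + eps
    return (f(**bumped) - base) / (base * eps)

params = {"Q": 2.0, "u": 3.0, "W": 20.0, "H": 15.0}
sens = {k: normalized_sensitivity(concentration, params, k) for k in params}
# For this toy model every parameter has |sensitivity| ~ 1; in a real model,
# parameters with near-zero sensitivity would be flagged as unidentifiable.
```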

  11. Combining bulk sediment OSL and meteoric 10Be fingerprinting techniques to identify gully initiation sites and erosion depths

    OpenAIRE

    Portenga, E.W.; Bishop, P.; Rood, D.H.; Bierman, P.R.

    2017-01-01

    Deep erosional gullies dissect landscapes around the world. Existing erosion models focus on predicting where gullies might begin to erode, but identifying where existing gullies were initiated and under what conditions is difficult, especially when historical records are unavailable. Here we outline a new approach for fingerprinting alluvium and tracing it back to its source by combining bulk sediment optically stimulated luminescence (bulk OSL) and meteoric 10Be (10Bem) measurements made on...

  12. Comparative Analysis of the Dark Ground Buffy Coat Technique (DG ...

    African Journals Online (AJOL)

    The prevalence of trypanosome infection in 65 cattle reared under an extensive system of management was determined using the dark ground buffy coat (DG) technique and the enzyme-linked immunosorbent assay (ELISA). The DG technique showed that there were 18 positive cases (27.69%) of the total number of animals, made ...

  13. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study

  14. ERROR ANALYSIS FOR THE AIRBORNE DIRECT GEOREFERENCING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    A. S. Elsharkawy

    2016-10-01

    Direct georeferencing has been shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct georeferencing without ground control relies on an extrapolation process only, particular focus has to be laid on the overall system calibration procedure. The accuracy of integrated GPS/inertial systems for direct georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration, including the GPS/inertial component as well as the imaging sensor itself; remaining errors in the system calibration will significantly decrease the quality of object point determination. This paper presents an error analysis for the airborne direct georeferencing technique, in which integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, compared with GPS/INS-supported aerial triangulation (AT), through the implementation of a certain amount of error on the exterior orientation and boresight parameters and a study of the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known through a careful camera calibration procedure, and 37 ground control points are known through a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, firstly, DG and GPS-supported AT have similar accuracy and, compared with the conventional aerial photography method, the two technologies reduce the dependence on ground control (used only for quality control purposes). Secondly, in DG, correcting the overall system calibration including the GPS/inertial component as well as the
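The leverage of an attitude (boresight) error on ground coordinates can be approximated to first order for a near-nadir image as flying height times the tangent of the angular error; the flying height and error magnitudes below are illustrative, not values from the study:

```python
import math

# Horizontal ground displacement caused by a small attitude error for a
# near-nadir image: error ~ H * tan(delta). H and delta are assumed values.
H = 1500.0  # flying height above ground, metres

def ground_error(height_m, angle_arcsec):
    return height_m * math.tan(math.radians(angle_arcsec / 3600.0))

errors = {a: round(ground_error(H, a), 3) for a in (10, 30, 60)}
# A 1-arcminute boresight error already displaces ground points by ~0.44 m
# at this flying height, which is why system calibration dominates DG accuracy.
```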

  15. Qualitative analysis of Orzooiyeh plain groundwater resources using GIS techniques

    Directory of Open Access Journals (Sweden)

    Mohsen Pourkhosravani

    2016-09-01

    Background: Unsustainable development of human societies, especially in arid and semi-arid areas, is one of the most important environmental hazards and requires the preservation of groundwater resources and permanent study of their qualitative and quantitative changes through sampling. Accordingly, this research assesses and analyzes the spatial variation of quantitative and qualitative indicators of Orzooiyeh groundwater resources in Kerman province using a geographic information system (GIS). Methods: This study surveys the spatial analysis of these indices using GIS techniques alongside an evaluation of groundwater resource quality in the study area. For this purpose, water quality indicators such as electrical conductivity, pH, sulphate, total dissolved solids (TDS), sodium, calcium, magnesium, and chlorine from 28 selected wells sampled by the Kerman regional water organization were used. Results: A comparison of the present results with the Iranian national standard and the World Health Organization (WHO) guidelines shows that, among the measured indices, electrical conductivity and TDS in the chosen samples are higher than both standards, whereas the other indices are more favourable. Conclusion: Results showed that the electrical conductivity index of 64.3% of the samples was at an optimal level, 71.4% were at the limit of the Iranian national standard, and only 3.6% met the WHO standard. The TDS index, too, did not meet the national standard in any of the samples, and in 82.1% of the samples this index was at the national standard limit. By this index, only 32.1% of the samples met the WHO standard.
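The screening step — comparing each sampled well against guideline limits — can be sketched as follows; the threshold values and sample data are assumptions for illustration (e.g. the ~1000 mg/L TDS palatability guidance), not the study's figures:

```python
# Assumed guideline limits, for illustration only.
LIMITS = {"TDS": 1000.0, "EC": 1500.0}  # mg/L and uS/cm respectively

samples = [
    {"well": "W1", "TDS": 850.0,  "EC": 1200.0},
    {"well": "W2", "TDS": 1900.0, "EC": 2600.0},
    {"well": "W3", "TDS": 990.0,  "EC": 1550.0},
]

def exceeds(sample):
    """Return the indices for which this sample exceeds its limit."""
    return [k for k in LIMITS if sample[k] > LIMITS[k]]

flagged = {s["well"]: exceeds(s) for s in samples if exceeds(s)}
```

In a GIS workflow the flagged wells would then be mapped to show the spatial pattern of exceedances.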

  16. Quantitative Analysis of Technological Innovation in Knee Arthroplasty: Using Patent and Publication Metrics to Identify Developments and Trends.

    Science.gov (United States)

    Dalton, David M; Burke, Thomas P; Kelly, Enda G; Curtin, Paul D

    2016-06-01

    Surgery is in a constant continuum of innovation, with refinement of technique and instrumentation. Arthroplasty potentially represents a highly innovative area. This study highlights key areas of innovation in knee arthroplasty over the past 35 years using patent and publication metrics. Growth rates and patterns are analyzed, and patents are correlated to publications as a measure of scientific support. Electronic patent and publication databases were searched over the interval 1980-2014 for "knee arthroplasty" OR "knee replacement." The resulting patent codes were allocated into technology clusters. Citation analysis was performed to identify any important developments missed on initial analysis. The technology clusters identified were further analyzed, individual repeat searches performed, and growth curves plotted. The initial search revealed 3574 patents and 16,552 publications. The largest technology clusters identified were Unicompartmental, Patient-Specific Instrumentation (PSI), Navigation, and Robotic knee arthroplasties. The growth in patent activity correlated strongly with publication activity (Pearson correlation value 0.892, P technology in the last 5 years, is currently in a period of exponential growth that began a decade ago. Established technologies in the study have double s-shaped patent curves. Identifying trends in emerging technologies is possible using patent metrics and is useful information for training and regulatory bodies. The decline in the ratio of publications to patents and the uninterrupted growth of PSI are developments that may warrant further investigation. Copyright © 2015 Elsevier Inc. All rights reserved.
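The patent-to-publication comparison rests on the Pearson correlation coefficient, which can be computed directly; the yearly counts below are invented, not the study's data:

```python
def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical yearly patent and publication counts for one technology cluster.
patents      = [12, 19, 31, 55, 80, 122]
publications = [40, 66, 98, 170, 260, 390]
r = pearson(patents, publications)  # strong co-growth gives r close to 1
```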

  17. Identifying Subgroups of Tinnitus Using Novel Resting State fMRI Biomarkers and Cluster Analysis

    Science.gov (United States)

    2017-10-13

    ... applied to the resting-state data to identify tinnitus subgroups within the patient population and pair them with specific behavioral ... and behavioral data. Specific Aim 2: Determine tinnitus subgroups using automated cluster analysis of resting-state data and associate the subgroups ... data analysis and clustering method previously developed to apply to the current tinnitus data set. Percentage of completion at end of Year 2 (24 months ...

  18. Analysis of promoter regions of co-expressed genes identified by microarray analysis

    Directory of Open Access Journals (Sweden)

    Höglund Mattias

    2006-08-01

    Abstract Background The use of global gene expression profiling to identify sets of genes with similar expression patterns is rapidly becoming a widespread approach for understanding biological processes. A logical and systematic approach to studying co-expressed genes is to analyze their promoter sequences to identify transcription factors that may be involved in establishing specific profiles and that may be experimentally investigated. Results We introduce promoter clustering, i.e. grouping of promoters with respect to their high-scoring motif content, and show that this approach greatly enhances the identification of common and significant transcription factor binding sites (TFBS) in co-expressed genes. We apply this method to two different datasets, one consisting of microarray data from 108 acute myeloid leukemias (AMLs) and a second from a time-series experiment, and show that biologically relevant promoter patterns may be obtained using phylogenetic footprinting methodology. In addition, we found that 15% of the analyzed promoter regions contained transcription start sites for additional genes transcribed in the opposite direction. Conclusion Promoter clustering based on global promoter features greatly improves the identification of shared TFBS in co-expressed genes. We believe that the outlined approach may be a useful first step to identify transcription factors that contribute to specific features of gene expression profiles.
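Promoter clustering by high-scoring motif content can be sketched as grouping promoters whose motif sets exceed a similarity threshold. The motif names, the Jaccard measure, and the 0.5 threshold are illustrative assumptions, not the authors' exact procedure:

```python
def jaccard(a, b):
    """Set similarity: size of intersection over size of union."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical high-scoring motif content per promoter.
promoters = {
    "p1": {"GATA", "ETS"},
    "p2": {"GATA", "ETS", "CREB"},
    "p3": {"HOX"},
    "p4": {"HOX", "SOX"},
}

def cluster(promoters, threshold=0.5):
    """Greedy clustering: join the first cluster whose seed is similar enough."""
    clusters = []  # list of (seed motif set, member names)
    for name, motifs in promoters.items():
        for seed, members in clusters:
            if jaccard(seed, motifs) >= threshold:
                members.append(name)
                break
        else:
            clusters.append((motifs, [name]))
    return [members for _, members in clusters]

groups = cluster(promoters)
```

Shared TFBS would then be sought within each cluster rather than across the whole co-expressed set.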

  19. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    Science.gov (United States)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply and lengthen the time required for the crew to abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan, and of how that demand changes as a result of changes to the system architecture, enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly more complex systems by gradually expanding the system boundary.
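The spares-versus-risk relationship can be illustrated by comparing an analytic (Poisson) estimate against a Monte Carlo one for a single assembly: loss of function occurs when failures exceed the spares carried. The failure rate and spares count are assumed values, and this sketch is far simpler than EMAT's semi-Markov machinery:

```python
import math
import random

def poisson_draw(lam):
    """Knuth's method for sampling a Poisson-distributed failure count."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        p *= random.random()
        k += 1
    return k - 1

lam = 0.8   # expected failures of one assembly over the mission (assumed)
spares = 2  # spares manifested (assumed)

# Analytic: P(loss of function) = P(failures > spares) under a Poisson model.
analytic = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                   for k in range(spares + 1))

random.seed(0)
trials = 100_000
monte_carlo = sum(poisson_draw(lam) > spares for _ in range(trials)) / trials
```

Sweeping `spares` (and hence spares mass) against `analytic` reproduces the kind of mass-versus-risk trade curve the paper examines.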

  20. Computed Tomography Fractional Flow Reserve Can Identify Culprit Lesions in Aortoiliac Occlusive Disease Using Minimally Invasive Techniques.

    Science.gov (United States)

    Ward, Erin P; Shiavazzi, Daniele; Sood, Divya; Marsden, Allison; Lane, John; Owens, Erik; Barleben, Andrew

    2017-01-01

    Currently, the gold standard diagnostic examination for significant aortoiliac lesions is angiography. Fractional flow reserve (FFR) has a growing body of literature in coronary artery disease as a minimally invasive diagnostic procedure. Improvements in numerical hemodynamics have allowed for an accurate and minimally invasive approach to estimating FFR, utilizing cross-sectional imaging. We aim to demonstrate a similar approach to aortoiliac occlusive disease (AIOD). A retrospective review evaluated 7 patients with claudication and cross-sectional imaging showing AIOD. FFR was subsequently measured during conventional angiogram with pull-back pressures in a retrograde fashion. To estimate computed tomography (CT) FFR, CT angiography (CTA) image data were analyzed using the SimVascular software suite to create a computational fluid dynamics model of the aortoiliac system. Inlet flow conditions were derived based on cardiac output, while 3-element Windkessel outlet boundary conditions were optimized to match the expected systolic and diastolic pressures, with outlet resistance distributed based on Murray's law. The data were evaluated with a Student's t-test and receiver operating characteristic curve. All patients had evidence of AIOD on CT and FFR was successfully measured during angiography. The modeled data were found to have high sensitivity and specificity between the measured and CT FFR (P = 0.986, area under the curve = 1). The average difference between the measured and calculated FFRs was 0.136, with a range from 0.03 to 0.30. CT FFR successfully identified aortoiliac lesions with significant pressure drops that were identified with angiographically measured FFR. CT FFR has the potential to provide a minimally invasive approach to identify flow-limiting stenosis for AIOD. Copyright © 2016 Elsevier Inc. All rights reserved.
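FFR itself is simply the ratio of mean distal pressure to mean aortic pressure. A minimal sketch, assuming the 0.80 significance cutoff carried over from coronary practice and invented pressure values:

```python
# FFR = mean pressure distal to the lesion / mean aortic (proximal) pressure.
def ffr(p_distal_mmHg, p_aortic_mmHg):
    return p_distal_mmHg / p_aortic_mmHg

# Hypothetical pull-back measurements (distal, aortic) in mmHg.
lesions = {"left_iliac": (68.0, 95.0), "right_iliac": (88.0, 96.0)}

# The 0.80 cutoff is an assumption borrowed from coronary FFR convention.
significant = sorted(name for name, (pd_, pa) in lesions.items()
                     if ffr(pd_, pa) <= 0.80)
```

In the CT-based workflow the distal pressure would come from the computational fluid dynamics model rather than a pressure wire.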

  1. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse conditions of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries.

  2. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of…

  3. Sparse canonical correlation analysis for identifying, connecting and completing gene-expression networks

    Directory of Open Access Journals (Sweden)

    Zwinderman Aeilko H

    2009-09-01

    Abstract Background We generalized penalized canonical correlation analysis for analyzing microarray gene-expression measurements, for checking the completeness of known metabolic pathways and identifying candidate genes for incorporation in the pathway. We used Wold's method for calculation of the canonical variates, and we applied ridge penalization to the regression of pathway genes on canonical variates of the non-pathway genes, and the elastic net to the regression of non-pathway genes on the canonical variates of the pathway genes. Results We performed a small simulation to illustrate the model's capability to identify new candidate genes to incorporate in the pathway: in our simulations a gene was correctly identified if its correlation with the pathway genes was 0.3 or more. We applied the methods to a gene-expression microarray data set of 12,209 genes measured in 45 patients with glioblastoma, and considered genes to incorporate in the glioma pathway: we identified more than 25 genes that correlated > 0.9 with canonical variates of the pathway genes. Conclusion We concluded that penalized canonical correlation analysis is a powerful tool to identify candidate genes in pathway analysis.
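The candidate-gene rule from the simulation (correlation of 0.3 or more with a canonical variate of the pathway genes) can be sketched directly; the canonical variate and expression values below are invented, not derived from an actual penalized CCA fit:

```python
def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical canonical variate of the pathway genes, one value per sample.
variate = [0.1, 0.5, 0.9, 1.4, 2.0]

# Hypothetical expression of two non-pathway genes across the same samples.
expression = {
    "geneA": [0.2, 0.6, 1.1, 1.5, 2.2],  # tracks the variate closely
    "geneB": [1.0, 0.2, 0.9, 0.1, 0.8],  # unrelated to the variate
}

# Screening rule from the abstract: |r| >= 0.3 flags a candidate gene.
candidates = [g for g, e in expression.items()
              if abs(pearson(variate, e)) > 0.3]
```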

  4. Systematic In Vivo RNAi Analysis Identifies IAPs as NEDD8-E3 Ligases

    DEFF Research Database (Denmark)

    Broemer, Meike; Tenev, Tencho; Rigbolt, Kristoffer T G

    2010-01-01

    -like proteins (UBLs), and deconjugating enzymes that remove the Ub or UBL adduct. Systematic in vivo RNAi analysis identified three NEDD8-specific isopeptidases that, when knocked down, suppress apoptosis. Consistent with the notion that attachment of NEDD8 prevents cell death, genetic ablation of deneddylase 1...

  5. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    NARCIS (Netherlands)

    Voight, Benjamin F.; Scott, Laura J.; Steinthorsdottir, Valgerdur; Morris, Andrew P.; Dina, Christian; Welch, Ryan P.; Zeggini, Eleftheria; Huth, Cornelia; Aulchenko, Yurii S.; Thorleifsson, Gudmar; McCulloch, Laura J.; Ferreira, Teresa; Grallert, Harald; Amin, Najaf; Wu, Guanming; Willer, Cristen J.; Raychaudhuri, Soumya; McCarroll, Steve A.; Langenberg, Claudia; Hofmann, Oliver M.; Dupuis, Josee; Qi, Lu; Segre, Ayellet V.; van Hoek, Mandy; Navarro, Pau; Ardlie, Kristin; Balkau, Beverley; Benediktsson, Rafn; Bennett, Amanda J.; Blagieva, Roza; Boerwinkle, Eric; Bonnycastle, Lori L.; Bostrom, Kristina Bengtsson; Bravenboer, Bert; Bumpstead, Suzannah; Burtt, Noisel P.; Charpentier, Guillaume; Chines, Peter S.; Cornelis, Marilyn; Couper, David J.; Crawford, Gabe; Doney, Alex S. F.; Elliott, Katherine S.; Elliott, Amanda L.; Erdos, Michael R.; Fox, Caroline S.; Franklin, Christopher S.; Ganser, Martha; Gieger, Christian; Grarup, Niels; Green, Todd; Griffin, Simon; Groves, Christopher J.; Guiducci, Candace; Hadjadj, Samy; Hassanali, Neelam; Herder, Christian; Isomaa, Bo; Jackson, Anne U.; Johnson, Paul R. V.; Jorgensen, Torben; Kao, Wen H. L.; Klopp, Norman; Kong, Augustine; Kraft, Peter; Kuusisto, Johanna; Lauritzen, Torsten; Li, Man; Lieverse, Aloysius; Lindgren, Cecilia M.; Lyssenko, Valeriya; Marre, Michel; Meitinger, Thomas; Midthjell, Kristian; Morken, Mario A.; Narisu, Narisu; Nilsson, Peter; Owen, Katharine R.; Payne, Felicity; Perry, John R. B.; Petersen, Ann-Kristin; Platou, Carl; Proenca, Christine; Prokopenko, Inga; Rathmann, Wolfgang; Rayner, N. William; Robertson, Neil R.; Rocheleau, Ghislain; Roden, Michael; Sampson, Michael J.; Saxena, Richa; Shields, Beverley M.; Shrader, Peter; Sigurdsson, Gunnar; Sparso, Thomas; Strassburger, Klaus; Stringham, Heather M.; Sun, Qi; Swift, Amy J.; Thorand, Barbara; Tichet, Jean; Tuomi, Tiinamaija; van Dam, Rob M.; van Haeften, Timon W.; van Herpt, Thijs; van Vliet-Ostaptchouk, Jana V.; Walters, G. 
Bragi; Weedon, Michael N.; Wijmenga, Cisca; Witteman, Jacqueline; Bergman, Richard N.; Cauchi, Stephane; Collins, Francis S.; Gloyn, Anna L.; Gyllensten, Ulf; Hansen, Torben; Hide, Winston A.; Hitman, Graham A.; Hofman, Albert; Hunter, David J.; Hveem, Kristian; Laakso, Markku; Mohlke, Karen L.; Morris, Andrew D.; Palmer, Colin N. A.; Pramstaller, Peter P.; Rudan, Igor; Sijbrands, Eric; Stein, Lincoln D.; Tuomilehto, Jaakko; Uitterlinden, Andre; Walker, Mark; Wareham, Nicholas J.; Watanabe, Richard M.; Abecasis, Goncalo R.; Boehm, Bernhard O.; Campbell, Harry; Daly, Mark J.; Hattersley, Andrew T.; Hu, Frank B.; Meigs, James B.; Pankow, James S.; Pedersen, Oluf; Wichmann, H-Erich; Barroso, Ines; Florez, Jose C.; Frayling, Timothy M.; Groop, Leif; Sladek, Rob; Thorsteinsdottir, Unnur; Wilson, James F.; Illig, Thomas; Froguel, Philippe; van Duijn, Cornelia M.; Stefansson, Kari; Altshuler, David; Boehnke, Michael; McCarthy, Mark I.

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals with combined
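Two-stage association results like these are typically combined by fixed-effect inverse-variance meta-analysis; a minimal sketch with invented per-stage effect estimates for one SNP:

```python
import math

def inverse_variance_meta(betas, ses):
    """Fixed-effect inverse-variance meta-analysis: weight each stage's
    effect estimate by 1/SE^2 and pool."""
    weights = [1.0 / se**2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return beta, se

# Hypothetical discovery-stage and follow-up-stage estimates (log-odds, SE).
beta, se = inverse_variance_meta([0.10, 0.08], [0.02, 0.03])
z = beta / se  # pooled z-score; genome-wide claims need a very small p-value
```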

  6. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    NARCIS (Netherlands)

    B.F. Voight (Benjamin); L.J. Scott (Laura); V. Steinthorsdottir (Valgerdur); A.D. Morris (Andrew); C. Dina (Christian); R.P. Welch (Ryan); E. Zeggini (Eleftheria); C. Huth (Cornelia); Y.S. Aulchenko (Yurii); G. Thorleifsson (Gudmar); L.J. McCulloch (Laura); T. Ferreira (Teresa); H. Grallert (Harald); N. Amin (Najaf); G. Wu (Guanming); C.J. Willer (Cristen); S. Raychaudhuri (Soumya); S.A. McCarroll (Steven); C. Langenberg (Claudia); O.M. Hofmann (Oliver); J. Dupuis (Josée); L. Qi (Lu); A.V. Segrè (Ayellet); M. van Hoek (Mandy); P. Navarro (Pau); K.G. Ardlie (Kristin); B. Balkau (Beverley); R. Benediktsson (Rafn); A.J. Bennett (Amanda); R. Blagieva (Roza); E.A. Boerwinkle (Eric); L.L. Bonnycastle (Lori); K.B. Boström (Kristina Bengtsson); B. Bravenboer (Bert); S. Bumpstead (Suzannah); N.P. Burtt (Noël); G. Charpentier (Guillaume); P.S. Chines (Peter); M. Cornelis (Marilyn); D.J. Couper (David); G. Crawford (Gabe); A.S.F. Doney (Alex); K.S. Elliott (Katherine); M.R. Erdos (Michael); C.S. Fox (Caroline); C.S. Franklin (Christopher); M. Ganser (Martha); C. Gieger (Christian); N. Grarup (Niels); T. Green (Todd); S. Griffin (Simon); C.J. Groves (Christopher); C. Guiducci (Candace); S. Hadjadj (Samy); N. Hassanali (Neelam); C. Herder (Christian); B. Isomaa (Bo); A.U. Jackson (Anne); P.R.V. Johnson (Paul); T. Jørgensen (Torben); W.H.L. Kao (Wen); N. Klopp (Norman); A. Kong (Augustine); P. Kraft (Peter); J. Kuusisto (Johanna); T. Lauritzen (Torsten); M. Li (Man); A. Lieverse (Aloysius); C.M. Lindgren (Cecilia); V. Lyssenko (Valeriya); M. Marre (Michel); T. Meitinger (Thomas); K. Midthjell (Kristian); M.A. Morken (Mario); N. Narisu (Narisu); P. Nilsson (Peter); K.R. Owen (Katharine); F. Payne (Felicity); J.R.B. Perry (John); A.K. Petersen; C. Platou (Carl); C. Proença (Christine); I. Prokopenko (Inga); W. Rathmann (Wolfgang); N.W. Rayner (Nigel William); N.R. Robertson (Neil); G. Rocheleau (Ghislain); M. Roden (Michael); M.J. Sampson (Michael); R. Saxena (Richa); B.M. 
Shields (Beverley); P. Shrader (Peter); G. Sigurdsson (Gunnar); T. Sparsø (Thomas); K. Strassburger (Klaus); H.M. Stringham (Heather); Q. Sun (Qi); A.J. Swift (Amy); B. Thorand (Barbara); J. Tichet (Jean); T. Tuomi (Tiinamaija); R.M. van Dam (Rob); T.W. van Haeften (Timon); T.W. van Herpt (Thijs); J.V. van Vliet-Ostaptchouk (Jana); G.B. Walters (Bragi); M.N. Weedon (Michael); C. Wijmenga (Cisca); J.C.M. Witteman (Jacqueline); R.N. Bergman (Richard); S. Cauchi (Stephane); F.S. Collins (Francis); A.L. Gloyn (Anna); U. Gyllensten (Ulf); T. Hansen (Torben); W.A. Hide (Winston); G.A. Hitman (Graham); A. Hofman (Albert); D. Hunter (David); K. Hveem (Kristian); M. Laakso (Markku); K.L. Mohlke (Karen); C.N.A. Palmer (Colin); P.P. Pramstaller (Peter Paul); I. Rudan (Igor); E.J.G. Sijbrands (Eric); L.D. Stein (Lincoln); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); M. Walker (Mark); N.J. Wareham (Nick); G.R. Abecasis (Gonçalo); B.O. Boehm (Bernhard); H. Campbell (Harry); M.J. Daly (Mark); A.T. Hattersley (Andrew); F.B. Hu (Frank); J.B. Meigs (James); J.S. Pankow (James); O. Pedersen (Oluf); H.E. Wichmann (Erich); I. Barroso (Inês); J.C. Florez (Jose); T.M. Frayling (Timothy); L. Groop (Leif); R. Sladek (Rob); U. Thorsteinsdottir (Unnur); J.F. Wilson (James); T. Illig (Thomas); P. Froguel (Philippe); P. Tikka-Kleemola (Päivi); J-A. Zwart (John-Anker); D. Altshuler (David); M. Boehnke (Michael); M.I. McCarthy (Mark); R.M. Watanabe (Richard)

    2010-01-01

    textabstractBy combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals

  7. Identifying Skill Requirements for GIS Positions: A Content Analysis of Job Advertisements

    Science.gov (United States)

    Hong, Jung Eun

    2016-01-01

    This study identifies the skill requirements for geographic information system (GIS) positions, including GIS analysts, programmers/developers/engineers, specialists, and technicians, through a content analysis of 946 GIS job advertisements from 2007-2014. The results indicated that GIS job applicants need to possess high levels of GIS analysis…

  8. Exome-wide rare variant analysis identifies TUBA4A mutations associated with familial ALS

    NARCIS (Netherlands)

    Smith, Bradley N.; Ticozzi, Nicola; Fallini, Claudia; Gkazi, Athina Soragia; Topp, Simon; Kenna, Kevin P.; Scotter, Emma L.; Kost, Jason; Keagle, Pamela; Miller, Jack W.; Calini, Daniela; Vance, Caroline; Danielson, Eric W.; Troakes, Claire; Tiloca, Cinzia; Al-Sarraj, Safa; Lewis, Elizabeth A.; King, Andrew; Colombrita, Claudia; Pensato, Viviana; Castellotti, Barbara; de Belleroche, Jacqueline; Baas, Frank; ten Asbroek, Anneloor L. M. A.; Sapp, Peter C.; McKenna-Yasek, Diane; McLaughlin, Russell L.; Polak, Meraida; Asress, Seneshaw; Esteban-Pérez, Jesús; Muñoz-Blanco, José Luis; Simpson, Michael; van Rheenen, Wouter; Diekstra, Frank P.; Lauria, Giuseppe; Duga, Stefano; Corti, Stefania; Cereda, Cristina; Corrado, Lucia; Sorarù, Gianni; Morrison, Karen E.; Williams, Kelly L.; Nicholson, Garth A.; Blair, Ian P.; Dion, Patrick A.; Leblond, Claire S.; Rouleau, Guy A.; Hardiman, Orla; Veldink, Jan H.; van den Berg, Leonard H.; Al-Chalabi, Ammar; Pall, Hardev; Shaw, Pamela J.; Turner, Martin R.; Talbot, Kevin; Taroni, Franco; García-Redondo, Alberto; Wu, Zheyang; Glass, Jonathan D.; Gellera, Cinzia; Ratti, Antonia; Brown, Robert H.; Silani, Vincenzo; Shaw, Christopher E.; Landers, John E.; D'alfonso, Sandra; Mazzini, Letizia; Comi, Giacomo P.; del Bo, Roberto; Ceroni, Mauro; Gagliardi, Stella; Querin, Giorgia; Bertolin, Cinzia

    2014-01-01

    Exome sequencing is an effective strategy for identifying human disease genes. However, this methodology is difficult in late-onset diseases where limited availability of DNA from informative family members prohibits comprehensive segregation analysis. To overcome this limitation, we performed an
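When segregation analysis is impossible, a common rare-variant strategy is a gene-level burden test comparing carrier counts in cases versus controls, for instance with a one-sided Fisher exact test; the carrier counts below are invented:

```python
from math import comb

def fisher_right_tail(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    P(X >= a) under the hypergeometric null."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)
    return sum(comb(row1, x) * comb(n - row1, col1 - x)
               for x in range(a, min(row1, col1) + 1)) / denom

# Hypothetical counts: 5/100 cases vs 1/100 controls carry a qualifying
# rare variant in the gene being tested.
p = fisher_right_tail(5, 95, 1, 99)  # not significant at this sample size
```

Exome-wide, such per-gene p-values would be corrected for the number of genes tested.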

  9. Social Network Analysis: A Simple but Powerful Tool for Identifying Teacher Leaders

    Science.gov (United States)

    Smith, P. Sean; Trygstad, Peggy J.; Hayes, Meredith L.

    2018-01-01

    Instructional teacher leadership is central to a vision of distributed leadership. However, identifying instructional teacher leaders can be a daunting task, particularly for administrators who find themselves either newly appointed or faced with high staff turnover. This article describes the use of social network analysis (SNA), a simple but…
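The simplest SNA measure for spotting candidate leaders is degree centrality on the advice-seeking network; the names and ties below are invented:

```python
from collections import defaultdict

# Hypothetical ties from a survey item such as
# "Whom do you go to for instructional advice?"
edges = [("Ada", "Ben"), ("Ada", "Cy"), ("Ada", "Dee"),
         ("Ben", "Cy"), ("Eve", "Dee")]

degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

# The most-connected teacher is a candidate instructional leader.
leader = max(degree, key=degree.get)
```

Real applications would usually treat advice ties as directed and also examine betweenness or in-degree, but the ranking idea is the same.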

  10. Genome-based exome sequencing analysis identifies GYG1, DIS3L ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 96; Issue 6. Genome-based exome sequencing analysis identifies GYG1, DIS3L and DDRGK1 are associated with myocardial infarction in Koreans. JI-YOUNG LEE SANGHOON MOON YUN KYOUNG KIM SANG-HAK LEE BOK-SOO LEE MIN-YOUNG PARK JEONG EUY PARK ...

  11. Are Young Dual Language Learners Homogeneous? Identifying Subgroups Using Latent Class Analysis

    Science.gov (United States)

    Kim, Do-Hong; Lambert, Richard G.; Burts, Diane C.

    2018-01-01

    Although dual language learners (DLLs) are linguistically, culturally, and socially diverse, researchers usually study them in aggregate and compare them to non-DLLs. The authors' purpose was to identify subgroups of preschool DLLs using latent class analysis. There were 7,361 DLLs and 69,457 non-DLLs. Results revealed three distinct classes.…

  12. Using Latent Class Analysis to Identify Academic and Behavioral Risk Status in Elementary Students

    Science.gov (United States)

    King, Kathleen R.; Lembke, Erica S.; Reinke, Wendy M.

    2016-01-01

    Identifying classes of children on the basis of academic and behavior risk may have important implications for the allocation of intervention resources within Response to Intervention (RTI) and Multi-Tiered System of Support (MTSS) models. Latent class analysis (LCA) was conducted with a sample of 517 third grade students. Fall screening scores in…
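Once a latent class model is fitted, each student is assigned by posterior class membership; a minimal E-step sketch with invented class weights and item probabilities for binary screening indicators:

```python
def class_posteriors(responses, priors, item_probs):
    """Posterior class membership for one student given a fitted latent
    class model: priors[c] is the class weight and item_probs[c][i] is
    P(indicator i flagged | class c). All parameter values are invented."""
    likelihoods = []
    for prior, probs in zip(priors, item_probs):
        like = prior
        for r, p in zip(responses, probs):
            like *= p if r else (1.0 - p)
        likelihoods.append(like)
    total = sum(likelihoods)
    return [l / total for l in likelihoods]

priors = [0.5, 0.5]             # "at risk" vs "not at risk" class weights
item_probs = [[0.9, 0.9, 0.8],  # P(screening flag | at risk)
              [0.2, 0.3, 0.1]]  # P(screening flag | not at risk)

# A student flagged on all three fall screeners:
post = class_posteriors([1, 1, 1], priors, item_probs)
```

In full LCA these parameters would themselves be estimated by EM; here they are fixed to show the classification step only.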

  13. Genome-wide association scan meta-analysis identifies three loci influencing adiposity and fat distribution

    NARCIS (Netherlands)

    C.M. Lindgren (Cecilia); I.M. Heid (Iris); J.C. Randall (Joshua); C. Lamina (Claudia); V. Steinthorsdottir (Valgerdur); L. Qi (Lu); E.K. Speliotes (Elizabeth); G. Thorleifsson (Gudmar); C.J. Willer (Cristen); B.M. Herrera (Blanca); A.U. Jackson (Anne); N. Lim (Noha); P. Scheet (Paul); N. Soranzo (Nicole); N. Amin (Najaf); Y.S. Aulchenko (Yurii); J.C. Chambers (John); A. Drong (Alexander); J. Luan; H.N. Lyon (Helen); F. Rivadeneira Ramirez (Fernando); S. Sanna (Serena); N.J. Timpson (Nicholas); M.C. Zillikens (Carola); H.Z. Jing; P. Almgren (Peter); S. Bandinelli (Stefania); A.J. Bennett (Amanda); R.N. Bergman (Richard); L.L. Bonnycastle (Lori); S. Bumpstead (Suzannah); S.J. Chanock (Stephen); L. Cherkas (Lynn); P.S. Chines (Peter); L. Coin (Lachlan); C. Cooper (Charles); G. Crawford (Gabe); A. Doering (Angela); A. Dominiczak (Anna); A.S.F. Doney (Alex); S. Ebrahim (Shanil); P. Elliott (Paul); M.R. Erdos (Michael); K. Estrada Gil (Karol); L. Ferrucci (Luigi); G. Fischer (Guido); N.G. Forouhi (Nita); C. Gieger (Christian); H. Grallert (Harald); C.J. Groves (Christopher); S.M. Grundy (Scott); C. Guiducci (Candace); D. Hadley (David); A. Hamsten (Anders); A.S. Havulinna (Aki); A. Hofman (Albert); R. Holle (Rolf); J.W. Holloway (John); T. Illig (Thomas); B. Isomaa (Bo); L.C. Jacobs (Leonie); K. Jameson (Karen); P. Jousilahti (Pekka); F. Karpe (Fredrik); J. Kuusisto (Johanna); J. Laitinen (Jaana); G.M. Lathrop (Mark); D.A. Lawlor (Debbie); M. Mangino (Massimo); W.L. McArdle (Wendy); T. Meitinger (Thomas); M.A. Morken (Mario); A.P. Morris (Andrew); P. Munroe (Patricia); N. Narisu (Narisu); A. Nordström (Anna); B.A. Oostra (Ben); C.N.A. Palmer (Colin); F. Payne (Felicity); J. Peden (John); I. Prokopenko (Inga); F. Renström (Frida); A. Ruokonen (Aimo); V. Salomaa (Veikko); M.S. Sandhu (Manjinder); L.J. Scott (Laura); A. Scuteri (Angelo); K. Silander (Kaisa); K. Song (Kijoung); X. Yuan (Xin); H.M. Stringham (Heather); A.J. Swift (Amy); T. Tuomi (Tiinamaija); M. 
Uda (Manuela); P. Vollenweider (Peter); G. Waeber (Gérard); C. Wallace (Chris); G.B. Walters (Bragi); M.N. Weedon (Michael); J.C.M. Witteman (Jacqueline); C. Zhang (Cuilin); M. Caulfield (Mark); F.S. Collins (Francis); G.D. Smith; I.N.M. Day (Ian); P.W. Franks (Paul); A.T. Hattersley (Andrew); F.B. Hu (Frank); M.-R. Jarvelin (Marjo-Riitta); A. Kong (Augustine); J.S. Kooner (Jaspal); M. Laakso (Markku); E. Lakatta (Edward); V. Mooser (Vincent); L. Peltonen (Leena Johanna); N.J. Samani (Nilesh); T.D. Spector (Timothy); D.P. Strachan (David); T. Tanaka (Toshiko); J. Tuomilehto (Jaakko); A.G. Uitterlinden (André); P. Tikka-Kleemola (Päivi); N.J. Wareham (Nick); H. Watkins (Hugh); D. Waterworth (Dawn); M. Boehnke (Michael); P. Deloukas (Panagiotis); L. Groop (Leif); D.J. Hunter (David); U. Thorsteinsdottir (Unnur); D. Schlessinger (David); H.E. Wichmann (Erich); T.M. Frayling (Timothy); G.R. Abecasis (Gonçalo); J.N. Hirschhorn (Joel); R.J.F. Loos (Ruth); J-A. Zwart (John-Anker); K.L. Mohlke (Karen); I. Barroso (Inês); M.I. McCarthy (Mark)

    2009-01-01

    To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the

  14. AcuI identifies water buffalo CSN3 genotypes by RFLP analysis

    Indian Academy of Sciences (India)

    AcuI identifies water buffalo CSN3 genotypes by RFLP analysis. Soheir M. El Nahas, Ahlam A. Abou Mossallam. Journal of Genetics, Volume 93, Online resources, 2014, pp. e94-e96.

  15. Bioinformatics analysis identifies several intrinsically disordered human E3 ubiquitin-protein ligases

    DEFF Research Database (Denmark)

    Boomsma, Wouter Krogh; Nielsen, Sofie Vincents; Lindorff-Larsen, Kresten

    2016-01-01

    conduct a bioinformatics analysis to examine >600 human and S. cerevisiae E3 ligases to identify enzymes that are similar to San1 in terms of function and/or mechanism of substrate recognition. An initial sequence-based database search was found to detect candidates primarily based on the homology...

  16. Genetic analysis to identify good combiners for ToLCV resistance ...

    Indian Academy of Sciences (India)

    2014-11-10

    Genetic analysis to identify good combiners for ToLCV resistance and yield components in tomato using interspecific hybridization. Ramesh K. Singh, N. Rai, Major Singh, S. N. Singh and K. Srivastava. Crop Improvement Division, Indian Institute of Vegetable ...

  17. Identifying Subgroups of Tinnitus Using Novel Resting State fMRI Biomarkers and Cluster Analysis

    Science.gov (United States)

    2016-10-01

    project activities, for the purpose of enhancing public understanding and increasing interest in learning and careers in science, technology, and the... Unsupervised hierarchical clustering of resting state functional connectivity data to identify patients with mild tinnitus. Poster session presented...including drafting of IRB behavioral and scanning protocols, advising on recruiting and initial data collection. She also supervised analysis of data and

  18. Large-scale association analysis identifies new risk loci for coronary artery disease

    NARCIS (Netherlands)

    Deloukas, Panos; Kanoni, Stavroula; Willenborg, Christina; Farrall, Martin; Assimes, Themistocles L.; Thompson, John R.; Ingelsson, Erik; Saleheen, Danish; Erdmann, Jeanette; Goldstein, Benjamin A.; Stirrups, Kathleen; König, Inke R.; Cazier, Jean-Baptiste; Johansson, Asa; Hall, Alistair S.; Lee, Jong-Young; Willer, Cristen J.; Chambers, John C.; Esko, Tõnu; Folkersen, Lasse; Goel, Anuj; Grundberg, Elin; Havulinna, Aki S.; Ho, Weang K.; Hopewell, Jemma C.; Eriksson, Niclas; Kleber, Marcus E.; Kristiansson, Kati; Lundmark, Per; Lyytikäinen, Leo-Pekka; Rafelt, Suzanne; Shungin, Dmitry; Strawbridge, Rona J.; Thorleifsson, Gudmar; Tikkanen, Emmi; van Zuydam, Natalie; Voight, Benjamin F.; Waite, Lindsay L.; Zhang, Weihua; Ziegler, Andreas; Absher, Devin; Altshuler, David; Balmforth, Anthony J.; Barroso, Inês; Braund, Peter S.; Burgdorf, Christof; Claudi-Boehm, Simone; Cox, David; Dimitriou, Maria; Do, Ron; Doney, Alex S. F.; El Mokhtari, NourEddine; Eriksson, Per; Fischer, Krista; Fontanillas, Pierre; Franco-Cereceda, Anders; Gigante, Bruna; Groop, Leif; Gustafsson, Stefan; Hager, Jörg; Hallmans, Göran; Han, Bok-Ghee; Hunt, Sarah E.; Kang, Hyun M.; Illig, Thomas; Kessler, Thorsten; Knowles, Joshua W.; Kolovou, Genovefa; Kuusisto, Johanna; Langenberg, Claudia; Langford, Cordelia; Leander, Karin; Lokki, Marja-Liisa; Lundmark, Anders; McCarthy, Mark I.; Meisinger, Christa; Melander, Olle; Mihailov, Evelin; Maouche, Seraya; Morris, Andrew D.; Müller-Nurasyid, Martina; Nikus, Kjell; Peden, John F.; Rayner, N. William; Rasheed, Asif; Rosinger, Silke; Rubin, Diana; Rumpf, Moritz P.; Schäfer, Arne; Sivananthan, Mohan; Song, Ci; Stewart, Alexandre F. R.; Tan, Sian-Tsung; Thorgeirsson, Gudmundur; van der Schoot, C. 
Ellen; Wagner, Peter J.; Wells, George A.; Wild, Philipp S.; Yang, Tsun-Po; Amouyel, Philippe; Arveiler, Dominique; Basart, Hanneke; Boehnke, Michael; Boerwinkle, Eric; Brambilla, Paolo; Cambien, Francois; Cupples, Adrienne L.; de Faire, Ulf; Dehghan, Abbas; Diemert, Patrick; Epstein, Stephen E.; Evans, Alun; Ferrario, Marco M.; Ferrières, Jean; Gauguier, Dominique; Go, Alan S.; Goodall, Alison H.; Gudnason, Villi; Hazen, Stanley L.; Holm, Hilma; Iribarren, Carlos; Jang, Yangsoo; Kähönen, Mika; Kee, Frank; Kim, Hyo-Soo; Klopp, Norman; Koenig, Wolfgang; Kratzer, Wolfgang; Kuulasmaa, Kari; Laakso, Markku; Laaksonen, Reijo; Lee, Ji-Young; Lind, Lars; Ouwehand, Willem H.; Parish, Sarah; Park, Jeong E.; Pedersen, Nancy L.; Peters, Annette; Quertermous, Thomas; Rader, Daniel J.; Salomaa, Veikko; Schadt, Eric; Shah, Svati H.; Sinisalo, Juha; Stark, Klaus; Stefansson, Kari; Trégouët, David-Alexandre; Virtamo, Jarmo; Wallentin, Lars; Wareham, Nicholas; Zimmermann, Martina E.; Nieminen, Markku S.; Hengstenberg, Christian; Sandhu, Manjinder S.; Pastinen, Tomi; Syvänen, Ann-Christine; Hovingh, G. Kees; Dedoussis, George; Franks, Paul W.; Lehtimäki, Terho; Metspalu, Andres; Zalloua, Pierre A.; Siegbahn, Agneta; Schreiber, Stefan; Ripatti, Samuli; Blankenberg, Stefan S.; Perola, Markus; Clarke, Robert; Boehm, Bernhard O.; O'Donnell, Christopher; Reilly, Muredach P.; März, Winfried; Collins, Rory; Kathiresan, Sekar; Hamsten, Anders; Kooner, Jaspal S.; Thorsteinsdottir, Unnur; Danesh, John; Palmer, Colin N. A.; Roberts, Robert; Watkins, Hugh; Schunkert, Heribert; Samani, Nilesh J.

    2013-01-01

    Coronary artery disease (CAD) is the commonest cause of death. Here, we report an association analysis in 63,746 CAD cases and 130,681 controls identifying 15 loci reaching genome-wide significance, taking the number of susceptibility loci for CAD to 46, and a further 104 independent variants (r(2)

  19. Identifying sustainability issues using participatory SWOT analysis - A case study of egg production in the Netherlands

    NARCIS (Netherlands)

    Mollenhorst, H.; Boer, de I.J.M.

    2004-01-01

    The aim of this paper was to demonstrate how participatory strengths, weaknesses, opportunities and threats (SWOT) analysis can be used to identify relevant economic, ecological and societal (EES) issues for the assessment of sustainable development. This is illustrated by the case of egg production

  20. Identifying and prioritizing the tools/techniques of knowledge management based on the Asian Productivity Organization Model (APO) to use in hospitals.

    Science.gov (United States)

    Khajouei, Hamid; Khajouei, Reza

    2017-12-01

    Appropriate knowledge, correct information, and relevant data are vital in medical diagnosis and treatment systems. Knowledge Management (KM) through its tools/techniques provides a pertinent framework for decision-making in healthcare systems. The objective of this study was to identify and prioritize the KM tools/techniques that apply to the hospital setting. This is a descriptive-survey study. Data were collected using a researcher-made questionnaire that was developed based on experts' opinions to select the appropriate tools/techniques from the 26 tools/techniques of the Asian Productivity Organization (APO) model. Questions were categorized into the five steps of KM (identifying, creating, storing, sharing, and applying knowledge) according to this model. The study population consisted of middle and senior managers of hospitals and managing directors of the Vice-Chancellor for Curative Affairs at Kerman University of Medical Sciences in Kerman, Iran. The data were analyzed in SPSS v.19 using the one-sample t-test. Twelve of the 26 tools/techniques of the APO model were identified as applicable in hospitals. "Knowledge café" and "APO knowledge management assessment tool", with respective means of 4.23 and 3.7, were the most and the least applicable tools in the knowledge identification step. "Mentor-mentee scheme" and "voice and Voice over Internet Protocol (VOIP)", with respective means of 4.20 and 3.52, were the most and the least applicable tools/techniques in the knowledge creation step. "Knowledge café" and "voice and VOIP", with respective means of 3.85 and 3.42, were the most and the least applicable tools/techniques in the knowledge storage step. "Peer assist" and "voice and VOIP", with respective means of 4.14 and 3.38, were the most and the least applicable tools/techniques in the knowledge sharing step.
    Finally, "knowledge worker competency plan" and "knowledge portal", with respective means of 4.38 and 3.85, were the most and the least applicable tools/techniques
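
    The one-sample t-test used in the study to rank tool applicability can be sketched in a few lines. The ratings below are hypothetical, not the study's data, and the statistic is compared against a tabulated critical value rather than SPSS output:

```python
from statistics import mean, stdev
from math import sqrt

def one_sample_t(scores, mu0=3.0):
    """t statistic testing whether the mean rating differs from the
    scale midpoint mu0 (here 3 on a 1-5 Likert scale)."""
    n = len(scores)
    return (mean(scores) - mu0) / (stdev(scores) / sqrt(n))

# Hypothetical applicability ratings for one tool from 10 managers
ratings = [5, 4, 4, 5, 3, 4, 5, 4, 4, 4]
t = one_sample_t(ratings)
print(round(t, 3))  # compare against t-critical (e.g. 2.262 for df=9, alpha=.05)
```

    A tool would be judged applicable when its mean rating is significantly above the midpoint, i.e. the t statistic exceeds the critical value for the chosen alpha.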

  1. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Aim: The properness of random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance, using cross-validation with a v-fold of 10, were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, sustain a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
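
    The assessment described in this record, clustering the compounds and then checking that every cluster mixes training and test members, can be sketched as follows. This is a minimal pure-Python K-means on synthetic descriptor values, not the Molecular Descriptors Family data from the study:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means (Lloyd's algorithm) with Euclidean distance."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute each center as the mean of its members (keep old center if empty)
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return [min(range(k), key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            for p in points]

# Toy stand-in for the descriptor data: (descriptor1, descriptor2, activity)
random.seed(1)
compounds = [(random.gauss(m, 1), random.gauss(m, 1), random.gauss(m, 1))
             for m in (0, 0, 5, 5, 10, 10) for _ in range(10)]
sets = ["train", "test"] * (len(compounds) // 2)  # alternating random-like split

labels = kmeans(compounds, k=3)
# A "proper" split puts both training and test compounds in every cluster.
for c in sorted(set(labels)):
    members = {sets[i] for i, lab in enumerate(labels) if lab == c}
    print(c, sorted(members))
```

    If some cluster contained only training (or only test) compounds, the split would be suspect, which is the check the study formalizes.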

  2. Potential ligand-binding residues in rat olfactory receptors identified by correlated mutation analysis

    Science.gov (United States)

    Singer, M. S.; Oliveira, L.; Vriend, G.; Shepherd, G. M.

    1995-01-01

    A family of G-protein-coupled receptors is believed to mediate the recognition of odor molecules. In order to identify potential ligand-binding residues, we have applied correlated mutation analysis to receptor sequences from the rat. This method identifies pairs of sequence positions where residues remain conserved or mutate in tandem, thereby suggesting structural or functional importance. The analysis supported molecular modeling studies in suggesting several residues in positions that were consistent with ligand-binding function. Two of these positions, dominated by histidine residues, may play important roles in ligand binding and could confer broad specificity to mammalian odor receptors. The presence of positive (overdominant) selection at some of the identified positions provides additional evidence for roles in ligand binding. Higher-order groups of correlated residues were also observed. Each group may interact with an individual ligand determinant, and combinations of these groups may provide a multi-dimensional mechanism for receptor diversity.
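
    Correlated mutation analysis of the kind described above can be approximated by scoring co-variation between alignment columns, for example with mutual information. A minimal sketch on a toy alignment; the sequences are invented, not the rat receptor data:

```python
from itertools import combinations
from math import log2
from collections import Counter

def mutual_information(col_a, col_b):
    """MI (bits) between two alignment columns: high when residues co-vary."""
    n = len(col_a)
    pa, pb = Counter(col_a), Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    return sum((c / n) * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

# Toy alignment: positions 1 and 3 mutate in tandem (H<->R with D<->E),
# position 0 is conserved, position 2 varies independently.
alignment = ["GHAD", "GHCD", "GRAE", "GRCE", "GHGD", "GRTE"]
cols = list(zip(*alignment))

scores = {(i, j): mutual_information(cols[i], cols[j])
          for i, j in combinations(range(len(cols)), 2)}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))  # → (1, 3) 1.0 : the pair that co-varies
```

    Positions that score highly across many sequence pairs are the candidates for structural or functional coupling, such as the ligand-binding histidine positions reported here.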

  3. Monitoring early hydration of reinforced concrete structures using structural parameters identified by piezo sensors via electromechanical impedance technique

    Science.gov (United States)

    Talakokula, Visalakshi; Bhalla, Suresh; Gupta, Ashok

    2018-01-01

    Concrete is the most widely used material in civil engineering construction. Its life begins when the hydration process is activated after mixing the cement granulates with water. In this paper, a non-dimensional hydration parameter, obtained from piezoelectric ceramic (PZT) patches bonded to rebars embedded inside concrete, is employed to monitor the early age hydration of concrete. The non-dimensional hydration parameter is derived from the equivalent stiffness determined from the piezo-impedance transducers using the electro-mechanical impedance (EMI) technique. The focus of the study is to monitor the hydration process of cementitious materials commencing from the early hours and continue till 28 days using single non-dimensional parameter. The experimental results show that the proposed piezo-based non-dimensional hydration parameter is very effective in monitoring the early age hydration, as it has been derived from the refined structural impedance parameters, obtained by eliminating the PZT contribution, and using both the real and imaginary components of the admittance signature.

  4. Comparison of neurotoxicity of root canal sealers on spontaneous bioelectrical activity in identified Helix neurones using an intracellular recording technique.

    Science.gov (United States)

    Asgari, S; Janahmadi, M; Khalilkhani, H

    2003-12-01

    To evaluate the neurotoxic effects of two endodontic sealers, AH-26 and Roth 801, on firing excitability and action potential configuration of F1 neural cells in the suboesophageal ganglia of Helix aspersa. A conventional intracellular current clamp technique was used to study the blocking effects of AH-26 and Roth 801 on ionic currents underlying the action potential of F1 nerve cells. The sealers were prepared according to the manufacturers' directions and were applied to the bathing media in two ways: invasive (0.05 mL of total mixture of each sealer was applied at a distance of 3 mm from the cell), or gradual (0.05 mL of the extract of each dissolved mixture of sealers in normal Ringers solution was perfused). When applied in an invasive mode, both sealers reduced the duration, the amplitude of action potentials and the amplitude of after-hyperpolarization potentials significantly and led to dramatic changes in action potential configuration. In the gradual mode of application, AH-26 showed a biphasic action; it first increased the excitability and then decreased the action potential parameters, while Roth 801 exhibited solely blocking effects. Both sealers had significant inhibitory effects on excitability of F1 neuronal cells.

  5. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict which products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that demonstrate its accuracy, proficiency and preference.

  6. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting proper specific judo exercises for a target motor ability, it is necessary first to study the structure of the specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a standpoint for selecting the particular complex of specific exercises that produces the highest effects. In addition to developing particular muscle groups, the means of specific preparation affect the development of those motor abilities judged indispensable for the qualities characteristic of judo. This paper analyses the relationship between judo techniques and specific motor abilities.

  7. Techniques for the Statistical Analysis of Observer Data

    National Research Council Canada - National Science Library

    Bennett, John G

    2001-01-01

    The two techniques are as follows: (1) fitting logistic curves to the vehicle data, and (2) using the Fisher Exact Test to compare the probability of detection of the two vehicles at each range...
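
    The second technique, the Fisher Exact Test, works directly from a 2x2 table of detections and misses for the two vehicles at one range. A minimal sketch with hypothetical counts:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p(x):  # P(first cell = x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)

# Hypothetical detection counts at one range:
# vehicle A detected on 9/10 presentations, vehicle B on 3/10.
p_value = fisher_exact_two_sided(9, 1, 3, 7)
print(round(p_value, 4))  # → 0.0198
```

    A small p-value indicates the two vehicles' detection probabilities differ at that range; repeating the test per range gives the comparison described above.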

  8. IDENTIFYING THE ROLE OF NATIONAL DIGITAL CADASTRAL DATABASE (NDCDB) IN MALAYSIA AND FOR LAND-BASED ANALYSIS

    Directory of Open Access Journals (Sweden)

    N. Z. A. Halim

    2017-10-01

    This paper explains the process carried out in identifying the significant role of the NDCDB in Malaysia, specifically in land-based analysis. The research was initially part of a larger research exercise to identify the significance of the NDCDB from the legal, technical, role, and land-based analysis perspectives. The research methodology, applying the Delphi technique, is discussed at length in this paper. A heterogeneous panel of 14 experts was created to determine the importance of the NDCDB from the role standpoint. Seven statements concerning the significant role of the NDCDB in Malaysia and in land-based analysis were established after three rounds of consensus building. The agreed statements provide a clear definition of the important role of the NDCDB in Malaysia and for land-based analysis, which had previously been studied only to a limited extent, leading to an unclear perception among the general public and even the geospatial community. The connection of the statements with disaster management is discussed concisely at the end of the paper.

  9. Analysis of neutron-reflectometry data by Monte Carlo technique

    CERN Document Server

    Singh, S

    2002-01-01

    Neutron-reflectometry data is collected in momentum space. The real-space information is extracted by fitting a model for the structure of a thin-film sample. We have attempted a Monte Carlo technique to extract the structure of the thin film. In this technique we change the structural parameters of the thin film by simulated annealing based on the Metropolis algorithm. (orig.)
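
    The simulated-annealing fit based on the Metropolis algorithm can be sketched as follows. The "reflectivity" model and its parameter names are toy stand-ins for illustration, not a physical thin-film model:

```python
import math, random

def chi2(params, data, model):
    """Sum of squared residuals between data and model."""
    return sum((y - model(x, params)) ** 2 for x, y in data)

def anneal(data, model, init, step=0.1, t0=1.0, cooling=0.995, iters=4000, seed=0):
    """Metropolis simulated annealing: accept a worse trial with
    probability exp(-delta/T) and lower T geometrically each step."""
    rng = random.Random(seed)
    params = list(init)
    cost = chi2(params, data, model)
    best, best_cost = list(params), cost
    t = t0
    for _ in range(iters):
        trial = [p + rng.gauss(0, step) for p in params]
        trial_cost = chi2(trial, data, model)
        delta = trial_cost - cost
        if delta < 0 or rng.random() < math.exp(-delta / t):
            params, cost = trial, trial_cost
            if cost < best_cost:
                best, best_cost = list(params), cost
        t *= cooling
    return best, best_cost

# Toy stand-in for a reflectivity curve: decay in q with a thickness-like oscillation
def model(q, p):
    thickness, roughness = p
    return math.exp(-roughness * q) * (1 + 0.5 * math.cos(thickness * q))

true_params = (8.0, 2.0)
data = [(q / 10, model(q / 10, true_params)) for q in range(1, 60)]
fit, fit_cost = anneal(data, model, init=[6.0, 1.0])
print([round(p, 2) for p in fit], round(fit_cost, 6))
```

    Accepting occasional uphill moves at high temperature lets the fit escape local minima of the chi-squared surface before the cooling schedule freezes it into a solution.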

  10. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Credit scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Amidst the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficiency, accuracy, and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed, as they are able to perform computation...

  11. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    Science.gov (United States)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities are often limited only to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
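
    Quantifying unsteady flow frequency content from high-speed imagery reduces, in the simplest case, to spectral analysis of a pixel-intensity time series. A minimal sketch with a plain DFT; the 4 kHz frame rate and 500 Hz shock oscillation are hypothetical:

```python
import cmath, math

def dominant_frequency(samples, rate):
    """Return the frequency (Hz) of the largest non-DC DFT component."""
    n = len(samples)
    spectrum = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) for k in range(n // 2)]
    k_peak = max(range(1, n // 2), key=spectrum.__getitem__)
    return k_peak * rate / n

# Hypothetical pixel-intensity trace at one schlieren image location:
# a shock oscillating at 500 Hz, sampled by a 4 kHz high-speed camera
rate, n = 4000, 64
trace = [128 + 40 * math.sin(2 * math.pi * 500 * t / rate) for t in range(n)]
print(dominant_frequency(trace, rate))  # → 500.0
```

    In practice an FFT over many pixels (or a region average) would be used, but the idea is the same: the spectral peak locates the oscillation frequency of the visualized flow feature.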

  12. Protein purification and analysis: next generation Western blotting techniques.

    Science.gov (United States)

    Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V

    2017-11-01

    Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and its clinical relevance. Expert commentary: Over the last decade significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluid western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blot.

  13. Flash fluorescence with indocyanine green videoangiography to identify the recipient artery for bypass with distal middle cerebral artery aneurysms: operative technique.

    Science.gov (United States)

    Rodríguez-Hernández, Ana; Lawton, Michael T

    2012-06-01

    Distal middle cerebral artery (MCA) aneurysms frequently have nonsaccular morphology that necessitates trapping and bypass. Bypasses can be difficult because efferent arteries lie deep in the opercular cleft and may not be easily identifiable. We introduce the "flash fluorescence" technique, which uses videoangiography with indocyanine green (ICG) dye to identify an appropriate recipient artery on the cortical surface for the bypass, enabling a more superficial and easier anastomosis. Flash fluorescence requires 3 steps: (1) temporary clip occlusion of the involved afferent artery; (2) videoangiography demonstrating fluorescence in uninvolved arteries on the cortical surface; and (3) removal of the temporary clip with flash fluorescence in the involved efferent arteries on the cortical surface, thereby identifying a recipient. Alternatively, temporary clips can occlude uninvolved arteries, and videoangiography will demonstrate initial fluorescence in efferent arteries during temporary occlusion and flash fluorescence in uninvolved arteries during reperfusion. From a consecutive series of 604 MCA aneurysms treated microsurgically, 22 (3.6%) were distal aneurysms and 11 required a bypass. The flash fluorescence technique was used in 3 patients to select the recipient artery for 2 superficial temporal artery-to-MCA bypasses and 1 MCA-MCA bypass. The correct recipient was selected in all cases. The flash fluorescence technique provides quick, reliable localization of an appropriate recipient artery for bypass when revascularization is needed for a distal MCA aneurysm. This technique eliminates the need for extensive dissection of the efferent artery and enables a superficial recipient site that makes the anastomosis safer, faster, and less demanding.

  14. Consensus of Leaders in Plastic Surgery: Identifying Procedural Competencies for Canadian Plastic Surgery Residency Training Using a Modified Delphi Technique.

    Science.gov (United States)

    Knox, Aaron D C; Shih, Jessica G; Warren, Richard J; Gilardino, Mirko S; Anastakis, Dimitri J

    2018-03-01

    Transitioning to competency-based surgical training will require consensus regarding the scope of plastic surgery and expectations of operative ability for graduating residents. Identifying surgical procedures experts deemed most important in preparing graduates for independent practice (i.e., "core" procedures), and those that are less important or deemed more appropriate for fellowship training (i.e., "noncore" procedures), will focus instructional and assessment efforts. Canadian plastic surgery program directors, the Canadian Society of Plastic Surgeons Executive Committee, and peer-nominated experts participated in an online, multiround, modified Delphi consensus exercise. Over three rounds, panelists were asked to sort 288 procedural competencies into five predetermined categories within core and noncore procedures, reflecting increasing expectations of ability. Eighty percent agreement was chosen to indicate consensus. Two hundred eighty-eight procedures spanning 13 domains were identified. Invitations were sent to 49 experts; 37 responded (75.5 percent), and 31 participated (83.8 percent of respondents). Procedures reaching 80 percent consensus increased from 101 (35 percent) during round 1, to 159 (55 percent) in round 2, and to 199 (69 percent) in round 3. The domain "burns" had the highest rate of agreement, whereas "lower extremity" had the lowest agreement. Final consensus categories included 154 core, essential; 23 core, nonessential; three noncore, experience; and 19 noncore, fellowship. This study provides clarity regarding which procedures plastic surgery experts deem most important for preparing graduates for independent practice. The list represents a snapshot of expert opinion regarding the current training environment. As our specialty grows and changes, this information will need to be periodically revisited.
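
    The 80 percent agreement rule applied across the Delphi rounds can be sketched as a simple tally per procedure; the procedures and votes below are hypothetical:

```python
from collections import Counter

CONSENSUS = 0.80  # fraction of panelists that must agree on one category

def consensus_category(votes):
    """Return the winning category if >= 80% of panelists chose it, else None."""
    (category, count), = Counter(votes).most_common(1)
    return category if count / len(votes) >= CONSENSUS else None

# Hypothetical round: a 10-member panel sorting two procedures
round_votes = {
    "escharotomy": ["core-essential"] * 9 + ["core-nonessential"],
    "free fibula flap": ["noncore-fellowship"] * 6 + ["core-nonessential"] * 4,
}
for procedure, votes in round_votes.items():
    print(procedure, consensus_category(votes))
```

    Procedures returning None would be re-circulated in the next round, which is how the consensus count grew from 101 to 199 over the three rounds described above.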

  15. Gene expression meta-analysis identifies metastatic pathways and transcription factors in breast cancer

    International Nuclear Information System (INIS)

    Thomassen, Mads; Tan, Qihua; Kruse, Torben A

    2008-01-01

    Metastasis is believed to progress in several steps involving different pathways, but the determination and understanding of these mechanisms is still fragmentary. Microarray analysis of gene expression patterns in breast tumors has been used to predict outcome in recent studies. Besides classification of outcome, these global expression patterns may reflect biological mechanisms involved in metastasis of breast cancer. Our purpose has been to investigate pathways and transcription factors involved in metastasis by use of gene expression data sets. We have analyzed 8 publicly available gene expression data sets. A global approach, 'gene set enrichment analysis', as well as an approach focusing on a subset of significantly differentially regulated genes, GenMAPP, has been applied to rank pathway gene sets according to differential regulation in metastasizing tumors compared to non-metastasizing tumors. Meta-analysis has been used to determine the overrepresentation of pathways and transcription factor targets, concordantly deregulated in metastasizing breast tumors, in several data sets. The major findings are up-regulation of cell cycle pathways and a metabolic shift towards glucose metabolism, reflected in several pathways, in metastasizing tumors. Growth factor pathways seem to play dual roles; EGF and PDGF pathways are decreased, while VEGF and sex-hormone pathways are increased in tumors that metastasize. Furthermore, migration, proteasome, immune system, angiogenesis, DNA repair and several signal transduction pathways are associated with metastasis. Finally, several transcription factors, e.g. E2F, NFY, and YY1, are identified as being involved in metastasis. By pathway meta-analysis many biological mechanisms beyond major characteristics such as proliferation are identified. Transcription factor analysis identifies a number of key factors that support central pathways. Several previously proposed treatment targets are identified and several new pathways that may
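
    Overrepresentation of a pathway gene set among deregulated genes, the kind of test underlying subset approaches such as GenMAPP, is commonly scored with a hypergeometric tail probability. A minimal sketch with hypothetical counts:

```python
from math import comb

def overrepresentation_p(n_genes, n_deregulated, set_size, set_hits):
    """Hypergeometric tail: P(>= set_hits deregulated genes in the set)."""
    return sum(comb(set_size, k) * comb(n_genes - set_size, n_deregulated - k)
               for k in range(set_hits, min(set_size, n_deregulated) + 1)
               ) / comb(n_genes, n_deregulated)

# Hypothetical: 30 of 200 measured genes are deregulated in metastasizing
# tumors, and a 20-gene cell-cycle set contains 8 of them.
p = overrepresentation_p(n_genes=200, n_deregulated=30, set_size=20, set_hits=8)
print(p < 0.05)  # → True: the set is enriched beyond the ~3 hits expected
```

    Ranking pathway gene sets by such p-values across the 8 data sets, and then asking which sets are concordantly enriched, is the essence of the meta-analysis described here.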

  16. Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.

    Science.gov (United States)

    Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q

    2018-04-01

    Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.

  17. Identifying and Characterizing Discrepancies Between Test and Analysis Results of Compression-Loaded Panels

    Science.gov (United States)

    Thornburgh, Robert P.; Hilburger, Mark W.

    2005-01-01

    Results from a study to identify and characterize discrepancies between validation tests and high-fidelity analyses of compression-loaded panels are presented. First, potential sources of the discrepancies in both the experimental method and corresponding high-fidelity analysis models were identified. Then, a series of laboratory tests and numerical simulations were conducted to quantify the discrepancies and develop test and analysis methods to account for the discrepancies. The results indicate that the discrepancies between the validation tests and high-fidelity analyses can be attributed to imperfections in the test fixture and specimen geometry; test-fixture-induced changes in specimen geometry; and test-fixture-induced friction on the loaded edges of the test specimen. The results also show that accurate predictions of the panel response can be obtained when these specimen imperfections and edge conditions are accounted for in the analysis. The errors in the tests and analyses, and the methods used to characterize these errors are presented.

  18. Techniques for the Analysis of Protein-Protein Interactions in Vivo

    Science.gov (United States)

    Xing, Shuping; Wallmeroth, Niklas; Berendzen, Kenneth W.

    2016-01-01

    Identifying key players and their interactions is fundamental for understanding biochemical mechanisms at the molecular level. The ever-increasing number of alternative ways to detect protein-protein interactions (PPIs) speaks volumes about the creativity of scientists in hunting for the optimal technique. PPIs derived from single experiments or high-throughput screens enable the decoding of binary interactions, the building of large-scale interaction maps of single organisms, and the establishment of cross-species networks. This review provides a historical view of the development of PPI technology over the past three decades, particularly focusing on in vivo PPI techniques that are inexpensive to perform and/or easy to implement in a state-of-the-art molecular biology laboratory. Special emphasis is given to their feasibility and application for plant biology as well as recent improvements or additions to these established techniques. The biology behind each method and its advantages and disadvantages are discussed in detail, as are the design, execution, and evaluation of PPI analysis. We also aim to raise awareness about the technological considerations and the inherent flaws of these methods, which may have an impact on the biological interpretation of PPIs. Ultimately, we hope this review serves as a useful reference when choosing the most suitable PPI technique. PMID:27208310

  19. A cross-species genetic analysis identifies candidate genes for mouse anxiety and human bipolar disorder

    Directory of Open Access Journals (Sweden)

    David G Ashbrook

    2015-07-01

    Bipolar disorder (BD) is a significant neuropsychiatric disorder with a lifetime prevalence of ~1%. To identify genetic variants underlying BD, genome-wide association studies (GWAS) have been carried out. While many variants of small effect associated with BD have been identified, few have yet been confirmed, partly because of the low power of GWAS due to the multiple comparisons being made. Complementary mapping studies using murine models have identified genetic variants for behavioral traits linked to BD, often with high power, but these identified regions often contain too many genes for clear identification of candidate genes. In the current study we have aligned human BD GWAS results and mouse linkage studies to help define and evaluate candidate genes linked to BD, seeking to use the power of the mouse mapping with the precision of GWAS. We use quantitative trait mapping for open field test and elevated zero maze data in the largest mammalian model system, the BXD recombinant inbred mouse population, to identify genomic regions associated with these BD-like phenotypes. We then investigate these regions in whole genome data from the Psychiatric Genomics Consortium's bipolar disorder GWAS to identify candidate genes associated with BD. Finally, we establish the biological relevance and pathways of these genes in a comprehensive systems genetics analysis. We identify four genes associated with both mouse anxiety and human BD. While TNR is a novel candidate for BD, we can confirm previously suggested associations with CMYA5, MCTP1 and RXRG. A cross-species, systems genetics analysis shows that MCTP1, RXRG and TNR coexpress with genes linked to psychiatric disorders and identifies the striatum as a potential site of action. CMYA5, MCTP1, RXRG and TNR are associated with mouse anxiety and human BD. We hypothesize that MCTP1, RXRG and TNR influence intercellular signaling in the striatum.

  20. Wavelength resolved neutron transmission analysis to identify single crystal particles in historical metallurgy

    Science.gov (United States)

    Barzagli, E.; Grazzi, F.; Salvemini, F.; Scherillo, A.; Sato, H.; Shinohara, T.; Kamiyama, T.; Kiyanagi, Y.; Tremsin, A.; Zoppi, Marco

    2014-07-01

    The phase composition and microstructure of four ferrous Japanese arrows of the Edo period (17th-19th century) have been determined through two complementary neutron techniques: position-sensitive wavelength-resolved neutron transmission analysis (PS-WRNTA) and time-of-flight neutron diffraction (ToF-ND). The standard ToF-ND technique was applied using the INES diffractometer at the ISIS pulsed neutron source in the UK, while the innovative PS-WRNTA was performed at the J-PARC neutron source on the BL-10 NOBORU beam line using a neutron imaging detector with high spatial and time resolution. With ToF-ND we were able to obtain information on the quantitative distribution of the metal and non-metal phases, the texture level, the strain level and the domain size of each of the samples, which are important parameters for assessing the technological level of Japanese weaponry. Starting from this base of data, the more complex PS-WRNTA was applied to the same samples. This experimental technique exploits the presence of so-called Bragg edges in the time-of-flight spectrum of neutrons transmitted through crystalline materials to map the microstructural properties of samples. The two techniques are non-invasive and can be easily applied in archaeometry for accurate microstructure mapping of metal and ceramic artifacts.
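
The Bragg edges exploited by PS-WRNTA occur at wavelengths where a family of lattice planes stops diffracting: for the (hkl) planes of a cubic phase the edge sits at lambda = 2*d_hkl. A minimal sketch for bcc alpha-iron, the dominant phase in historical steels (the lattice constant here is a nominal textbook value, not a fitted one from the study):

```python
from math import sqrt

def bragg_edge(a, h, k, l):
    """Edge wavelength (angstrom) for the (hkl) reflection of a cubic phase."""
    d = a / sqrt(h * h + k * k + l * l)   # interplanar spacing d_hkl
    return 2.0 * d                        # no diffraction possible beyond 2*d

a_fe = 2.866  # nominal bcc alpha-iron lattice constant, angstrom
# only h+k+l even is allowed for a bcc lattice, hence (110), (200), (211)
edges = {hkl: bragg_edge(a_fe, *hkl) for hkl in [(1, 1, 0), (2, 0, 0), (2, 1, 1)]}
# (110) edge near 4.05 angstrom, (200) near 2.87, (211) near 2.34
```

Matching the positions and heights of such edges in a measured transmission spectrum is what allows phase and strain to be mapped pixel by pixel.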

  1. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described (including the reactor as a neutron source, sample activation in the reactor, methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, sampling and sample preparation). Sources of environmental contamination with trace elements, sampling and sample analysis by neutron activation are described. The analysis is described of soils, waters and biological materials. Methods are shown of evaluating neutron activation analysis results and of their interpretation for purposes of environmental control. (J.B.)
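
The quantitative core of reactor neutron activation analysis is the activation equation: the induced activity after irradiation is A = N * phi * sigma * (1 - exp(-lambda * t_irr)). A small sketch with illustrative values (these are not nuclear data for any specific isotope):

```python
from math import exp, log

def induced_activity(n_atoms, flux, sigma_cm2, half_life_s, t_irr_s):
    """Activity in Bq at the end of irradiation (saturation growth)."""
    lam = log(2) / half_life_s            # decay constant
    return n_atoms * flux * sigma_cm2 * (1.0 - exp(-lam * t_irr_s))

A = induced_activity(
    n_atoms=1e18,       # atoms of the target isotope in the sample
    flux=1e13,          # thermal neutron flux, n / (cm^2 s)
    sigma_cm2=5e-24,    # capture cross-section, 5 barns expressed in cm^2
    half_life_s=3600.0, # 1 h half-life of the activation product
    t_irr_s=3600.0,     # irradiate for exactly one half-life
)
# irradiating for one half-life reaches 50% of the saturation activity
```

In practice the element concentration is then obtained by comparing the measured gamma activity against a co-irradiated standard, which cancels the flux and cross-section terms.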

  2. Differentially expressed genes in pancreatic ductal adenocarcinomas identified through serial analysis of gene expression

    DEFF Research Database (Denmark)

    Hustinx, Steven R; Cao, Dengfeng; Maitra, Anirban

    2004-01-01

    genome and better biocomputational techniques have substantially improved the assignment of differentially expressed SAGE "tags" to human genes. These improvements have provided us with an opportunity to re-evaluate global gene expression in pancreatic cancer using existing SAGE libraries. SAGE libraries generated from six pancreatic cancers were compared to SAGE libraries generated from 11 non-neoplastic tissues. Compared to normal tissue libraries, we identified 453 SAGE tags as differentially expressed in pancreatic cancer, including 395 that mapped to known genes and 58 "uncharacterized" tags. Of the 395 SAGE tags assigned to known genes, 223 were overexpressed in pancreatic cancer, and 172 were underexpressed. In order to map the 58 uncharacterized differentially expressed SAGE tags to genes, we used a newly developed resource called TAGmapper (http://tagmapper.ibioinformatics.org), to identify

  3. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    OpenAIRE

    Akshay Amolik; Niketan Jivane; Mahavir Bhandari; Dr.M.Venkatesan

    2015-01-01

    Sentiment analysis is basically concerned with the analysis of emotions and opinions in text; it is also referred to as opinion mining. Sentiment analysis finds and justifies the sentiment of a person with respect to a given source of content. Social media contains a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this largely generated data is very useful for expressing the opinion of the mass. Twitter sentiment a...
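
A minimal sketch of one machine learning technique commonly applied to tweet sentiment, a Naive Bayes classifier with add-one smoothing; the tiny training set and whitespace tokenizer are purely illustrative, not the paper's data or pipeline.

```python
from collections import Counter
from math import log

def train(docs):
    """Count word occurrences per sentiment label."""
    counts = {"pos": Counter(), "neg": Counter()}
    for label, text in docs:
        counts[label].update(text.lower().split())
    vocab = set(counts["pos"]) | set(counts["neg"])
    return counts, vocab

def classify(text, counts, vocab):
    """Pick the label with the highest smoothed log-likelihood."""
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = sum(
            log((c[w] + 1) / (total + len(vocab)))   # Laplace smoothing
            for w in text.lower().split()
        )
    return max(scores, key=scores.get)

docs = [("pos", "great movie loved it"), ("pos", "brilliant acting great plot"),
        ("neg", "terrible movie hated it"), ("neg", "boring plot awful acting")]
counts, vocab = train(docs)
label = classify("great plot loved the acting", counts, vocab)   # -> "pos"
```

Real tweet pipelines add steps this sketch omits: handling of hashtags, mentions, emoticons, negation, and class priors learned from a large labeled corpus.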

  4. Trace elemental analysis of Indian natural moonstone gems by PIXE and XRD techniques

    International Nuclear Information System (INIS)

    Venkateswara Rao, R.; Venkateswarulu, P.; Kasipathi, C.; SivaJyothi, S.

    2013-01-01

    A selected number of Indian Eastern Ghats natural moonstone gems were studied with the powerful, non-destructive nuclear analytical technique of Proton Induced X-ray Emission (PIXE). Thirteen elements, including V, Co, Ni, Zn, Ga, Ba and Pb, were identified in these moonstones and may be useful in interpreting the various geochemical conditions and the probable cause of their inception in the moonstone gemstone matrix. Furthermore, preliminary XRD studies of different moonstone patterns were performed. The PIXE technique is a powerful method for quickly determining the elemental concentration of a substance. A 3 MeV proton beam was employed to excite the samples. The chemical constituents of moonstones from parts of the Eastern Ghats geological formations of Andhra Pradesh, India were determined, and gemological studies were performed on those gems. The crystal structure and lattice parameters of the moonstones were estimated using X-ray diffraction, trace and minor elements were determined using the PIXE technique, and major compositional elements were confirmed by XRD. In the present work, the usefulness and versatility of the PIXE technique for research in geo-scientific methodology is established. - Highlights: • For the first time, the PIXE technique was employed to analyze East Indian natural moonstone gems. • The trace and minor elements are estimated using the PIXE technique whereas major compositional elements are confirmed by XRD. • The adularia variety of moonstone is found to be abundant in the present study. • The PIXE analysis concludes that the Eastern Ghats of India are rich not only in gemstones but also in trace elements

  5. Analysis of Nature of Science Included in Recent Popular Writing Using Text Mining Techniques

    Science.gov (United States)

    Jiang, Feng; McComas, William F.

    2014-09-01

    This study examined the inclusion of nature of science (NOS) in popular science writing to determine whether it could serve as a supplementary resource for teaching NOS and to evaluate the accuracy of text mining and classification as a viable research tool in science education research. Four groups of documents published from 2001 to 2010 were analyzed: Scientific American, Discover magazine, winners of the Royal Society Winton Prize for Science Books, and books from NSTA's list of Outstanding Science Trade Books. Computer analysis categorized passages in the selected documents based on their inclusion of NOS. Human analysis assessed the frequency, context, coverage, and accuracy of the inclusion of NOS within the computer-identified NOS passages. NOS was rarely addressed in the selected document sets but somewhat more frequently addressed in the letters sections of the two magazines. This result suggests that readers seem interested in the discussion of NOS-related themes. In the popular science books analyzed, NOS presentations were more likely to be concentrated at the beginning and the end of the book rather than scattered throughout. The most commonly addressed NOS elements in the analyzed documents are science and society and empiricism in science. Only one inaccurate presentation of NOS was identified in all the analyzed documents. The text mining technique performed impressively, which invites more applications of the technique to analyze other aspects of science textbooks, popular science writing, or other materials involved in science teaching and learning.

  6. Analysis of the archaeological pieces with the PIXE technique; Analisis de piezas arqueologicas con la tecnica PIXE

    Energy Technology Data Exchange (ETDEWEB)

    Tenorio C, D. [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    2005-07-01

    Physical-nuclear techniques were applied to the analysis of archaeological pieces for the first time in 1956 by Oppenheimer and Dodsom (in the United States), who wrote about the wide utility of neutron activation analysis (NAA). This technique requires a nuclear reactor, which can be regarded as a factory of the thermal neutrons needed to carry out the analysis. The first experiments in which NAA was applied were on ceramics from the Mediterranean, in which the ratios of the elements sodium and manganese were determined in order to establish their elemental composition and to identify the origin of their manufacture. Later, other techniques were applied to the study of archaeological materials, such as X-ray fluorescence, Moessbauer spectroscopy, scanning electron microscopy and PIXE analysis (particle induced X-ray emission), among others; this last technique is the subject of the present work. (Author)

  7. Identifying constituents in commercial gasoline using Fourier transform-infrared spectroscopy and independent component analysis.

    Science.gov (United States)

    Pasadakis, Nikos; Kardamakis, Andreas A

    2006-09-25

    A new method is proposed that enables the identification of five refinery fractions present in commercial gasoline mixtures using infrared spectroscopic analysis. The data analysis and interpretation were carried out using independent component analysis (ICA) and spectral similarity techniques. The FT-IR spectra of the gasoline constituents were determined using the ICA method exclusively from the spectra of their mixtures, as a blind separation procedure, i.e. assuming the spectra of the constituents to be unknown. The identity of the constituents was subsequently determined using similarity measures commonly employed in spectral library searches against the spectra of the constituent components. The high correlation scores obtained in the identification of the constituents indicate that the developed method can be employed as a rapid and effective tool in quality control, fingerprinting or forensic applications, where gasoline constituents are suspected.
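
The identification step, matching a blindly recovered spectrum against library entries by a similarity measure, can be sketched with cosine similarity. The four-point "spectra" and fraction names below are illustrative stand-ins for FT-IR absorbance vectors, not the paper's data.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two spectra sampled on the same grid."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# hypothetical library of refinery-fraction reference spectra
library = {
    "reformate": [0.9, 0.1, 0.4, 0.2],
    "isomerate": [0.1, 0.8, 0.2, 0.5],
    "alkylate":  [0.3, 0.3, 0.9, 0.1],
}
recovered = [0.85, 0.15, 0.42, 0.18]   # one ICA-recovered component
best = max(library, key=lambda name: cosine(recovered, library[name]))
# -> "reformate": the library entry most similar to the recovered spectrum
```

Cosine similarity is insensitive to overall intensity scaling, which is convenient because ICA recovers source spectra only up to an arbitrary scale factor.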

  8. Performance analysis of two-way DF relay selection techniques

    Directory of Open Access Journals (Sweden)

    Samer Alabed

    2016-09-01

    This work proposes novel bi-directional dual-relay selection techniques based on Alamouti space-time block coding (STBC) using the decode-and-forward (DF) protocol and analyzes their performance. In the proposed techniques, two- and three-phase relaying schemes are used to perform bi-directional communication between the communicating terminals via two selected single-antenna relays that employ the Alamouti STBC in a distributed fashion to achieve diversity and orthogonalization of the channels, and hence improve the reliability of the system and enable the use of a symbol-wise detector. Furthermore, the network coding strategy applied at all relays wastes no power broadcasting data already known at any terminal, resulting in improved overall performance at the terminals. Our simulations confirm the analytical results and show a substantially improved bit error rate (BER) performance for our proposed techniques compared with the current state of the art.
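
The channel orthogonalization that enables symbol-wise detection can be shown with the basic 2x1 Alamouti code the relays apply in distributed fashion: relay 1 sends s1 then -conj(s2), relay 2 sends s2 then conj(s1), and linear combining at the receiver cancels the cross terms. A noise-free sketch with illustrative channel values (not the paper's simulation setup):

```python
def alamouti_combine(r1, r2, h1, h2):
    """Linear combining for the 2x1 Alamouti code (noise-free sketch)."""
    s1_hat = h1.conjugate() * r1 + h2 * r2.conjugate()
    s2_hat = h2.conjugate() * r1 - h1 * r2.conjugate()
    gain = abs(h1) ** 2 + abs(h2) ** 2      # diversity combining gain
    return s1_hat / gain, s2_hat / gain

s1, s2 = (1 + 1j), (1 - 1j)                 # QPSK-like symbols
h1, h2 = (0.8 - 0.3j), (0.4 + 0.9j)         # flat relay-to-terminal channels
r1 = h1 * s1 + h2 * s2                      # received in slot 1
r2 = -h1 * s2.conjugate() + h2 * s1.conjugate()  # received in slot 2
s1_hat, s2_hat = alamouti_combine(r1, r2, h1, h2)
# s1_hat == s1 and s2_hat == s2 up to numerical precision: the cross
# terms h1*.h2 cancel, so each symbol can be detected independently
```

Expanding the combiner algebraically gives s1_hat = (|h1|^2 + |h2|^2) * s1 before normalization, which is exactly why a symbol-wise detector suffices.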

  9. Radon remedial techniques in buildings - analysis of French actual cases

    International Nuclear Information System (INIS)

    Dupuis, M.

    2004-01-01

    The IRSN has compiled a collection of solutions from data provided by the various decentralised government services in 31 French departments. Contributors were asked to provide a description of the building, as well as details of measured radon levels, the type of reduction technique adopted and the cost. Illustrative layouts, technical drawings and photographs were also requested, when available. Of the cases recorded, 85% are establishments open to the public (schools (70%), city halls (4%) and combined city halls and school houses (26%)), 11% are houses and 4% industrial buildings. IRSN obtained 27 real cases of remedial techniques used. The data were presented in the form of fact sheets. The primary aim of this exercise was to illustrate each of the radon reduction techniques that can be used in the different building types (with basement, ground bearing slab, crawl space). This investigation not only enabled us to show that combining passive and active techniques reduces the operating cost of the installation, but above all that it considerably improves the efficiency. The passive technique reduces the amount of radon in the building and thus reduces the necessary ventilation rate, which directly affects the cost of operating the installation. For the 27 cases recorded, we noted: (a) the application of 7 passive techniques: sealing of floors and semi-buried walls, together with improved aeration by installing ventilation openings or ventilation strips in the windows; radon concentrations were reduced on average by a factor of 4.7, and no measurement in excess of 400 Bq.m-3 (the limit recommended by the French public authorities) was obtained following completion of the works; (b) the application of 15 active techniques: depressurization of the underlying ground, crawl space or basement and/or pressurization of the building; radon concentrations were reduced on average by a factor of 13.8, and concentrations of over 400 Bq.m-3 were measured in only 4 cases

  10. Using Latent Semantic Analysis to Identify Research Trends in OpenStreetMap

    Directory of Open Access Journals (Sweden)

    Sukhjit Singh Sehra

    2017-07-01

    OpenStreetMap (OSM), based on collaborative mapping, has become a subject of great interest to the academic community, resulting in a considerable body of literature produced by many researchers. In this paper, we use Latent Semantic Analysis (LSA) to help identify the emerging research trends in OSM. An extensive corpus of 485 academic abstracts of papers published during the period 2007–2016 was used. Five core research areas and fifty research trends were identified in this study. In addition, potential future research directions are provided to aid geospatial information scientists, technologists and researchers in undertaking future OSM research.
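
LSA in miniature: a truncated singular value decomposition of a term-document matrix projects documents into a low-rank latent space where co-occurring terms collapse into shared "topics". The toy matrix, term labels, and chosen rank below are illustrative; a real corpus such as the 485 abstracts would first be tf-idf weighted.

```python
import numpy as np

# rows: terms ["map", "quality", "volunteer", "routing"]; columns: 5 abstracts
X = np.array([[2, 3, 0, 0, 1],
              [1, 2, 0, 1, 0],
              [0, 0, 3, 2, 0],
              [0, 1, 2, 3, 0]], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                    # keep two latent dimensions
doc_coords = np.diag(s[:k]) @ Vt[:k]     # documents in the latent "trend" space
term_coords = U[:, :k]                   # terms in the same space
# clustering doc_coords (e.g. k-means) then yields candidate research trends
```

Dropping the small singular values is what lets LSA group abstracts that share a theme even when they use non-overlapping vocabulary.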

  11. Genetic programming system for building block analysis to enhance data analysis and data mining techniques

    Science.gov (United States)

    Eick, Christoph F.; Sanz, Walter D.; Zhang, Ruijian

    1999-02-01

    Recently, many computerized data mining tools and environments have been proposed for finding interesting patterns in large data collections. These tools employ techniques that originate from research in various areas, such as machine learning, statistical data analysis, and visualization. Each of these techniques makes assumptions concerning the composition of the data collection to be analyzed. If the particular data collection does not meet these assumptions well, the technique usually performs poorly. For example, decision tree tools, such as C4.5, rely on rectangular approximations, which do not perform well if the boundaries between different classes have other shapes, such as a 45 degree line or elliptical shapes. However, if we could find a transformation f of the original attribute space in which class boundaries are more rectangular, better approximations could be obtained. In this paper, we address the problem of finding such transformations f. We describe the features of the tool, WOLS, whose goal is the discovery of ingredients for such transformation functions f, which we call building blocks. The tool employs genetic programming and symbolic regression for this purpose. We also present and discuss the results of case studies, using the building block analysis tool, in the areas of decision tree learning and regression analysis.
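
The 45-degree-boundary example can be made concrete: a diagonal class boundary y > x defeats any single axis-parallel split, but the derived feature x - y (the kind of building block a symbolic-regression search might discover) separates the classes with one threshold. The data points are synthetic, and the decision stump below is a stand-in for a full decision tree learner.

```python
# six synthetic points: class A above the diagonal y = x, class B below it
points = [(0.2, 0.8), (0.1, 0.5), (0.3, 0.9),   # class A: y > x
          (0.8, 0.2), (0.9, 0.4), (0.6, 0.1)]   # class B: y < x
labels = ["A", "A", "A", "B", "B", "B"]

def stump(value, threshold):
    """One axis-parallel split: the building block of decision trees."""
    return "A" if value < threshold else "B"

# In the transformed space f(x, y) = x - y, a stump at 0 is perfect:
pred = [stump(x - y, 0.0) for x, y in points]
accuracy = sum(p == t for p, t in zip(pred, labels)) / len(labels)
# accuracy == 1.0, whereas no threshold on x alone or y alone is perfect
```

A genetic-programming search over arithmetic expressions of the raw attributes can discover transformations like x - y automatically, which is the role of the building blocks described above.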

  12. Geotechnical Analysis of Paleoseismic Shaking Using Liquefaction Features: Part I. Major Updating of Analysis Techniques

    Science.gov (United States)

    Olson, Scott M.; Green, Russell A.; Obermeier, Stephen F.

    2003-01-01

    A new methodology is proposed for the geotechnical analysis of strength of paleoseismic shaking using liquefaction effects. The proposed method provides recommendations for selection of both individual and regionally located test sites, techniques for validation of field data for use in back-analysis, and use of a recently developed energy-based solution to back-calculate paleoearthquake magnitude and strength of shaking. The proposed method allows investigators to assess the influence of post-earthquake density change and aging. The proposed method also describes how the back-calculations from individual sites should be integrated into a regional assessment of paleoseismic parameters.

  13. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    1988-06-01

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987 in order to assess the present technical status of nuclear borehole logging techniques, to find out the well established applications and the development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  14. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting in 1989, a new technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for the analysis of signals. A review of the use of this technique across different fields of elemental analysis is presented
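
The simplest instance of the wavelet analysis discussed above is one level of the Haar transform: pairwise averages give a smoothed approximation of the spectrum, pairwise half-differences give the local detail, and the two halves reconstruct the signal exactly. The spectrum-like signal is illustrative.

```python
def haar_step(signal):
    """One Haar level: split an even-length signal into approximation + detail."""
    approx = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    """Perfect reconstruction from the two half-length sequences."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

spectrum = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
approx, detail = haar_step(spectrum)   # smooth trend + local fluctuations
assert haar_inverse(approx, detail) == spectrum
```

Thresholding the detail coefficients before reconstruction is the basis of wavelet denoising of spectra: unlike the Fourier transform, the suppression stays localized in channel position.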

  15. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…

  16. Research citation analysis of nursing academics in Canada: identifying success indicators.

    Science.gov (United States)

    Hack, Thomas F; Crooks, Dauna; Plohman, James; Kepron, Emma

    2010-11-01

    This article is a report of a citation analysis of research publications by Canadian nursing academics. Citation analysis can yield objective criteria for assessing the value of published research and is becoming increasingly popular as an academic evaluation tool in universities around the world. Citation analysis is useful for examining the research performance of academic researchers and identifying leaders among them. The journal publication records of 737 nursing academics at 33 Canadian universities and schools of nursing were subject to citation analysis using the Scopus database. Three primary types of analysis were performed for each individual: number of citations for each journal publication, summative citation count of all published papers and the Scopus h-index. Preliminary citation analysis was conducted from June to July 2009, with the final analysis performed on 2 October 2009 following e-mail verification of publication lists. The top 20 nursing academics for each of five citation categories are presented: the number of career citations for all publications, number of career citations for first-authored publications, most highly cited first-authored publications, the Scopus h-index for all publications and the Scopus h-index for first-authored publications. Citation analysis metrics are useful for evaluating the research performance of academic researchers in nursing. Institutions are encouraged to protect the research time of successful and promising nursing academics, and to dedicate funds to enhance the research programmes of underperforming academic nursing groups. © 2010 The Authors. Journal of Advanced Nursing © 2010 Blackwell Publishing Ltd.
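
The Scopus h-index used as a metric in this study is simple to compute: h is the largest rank r such that the r-th most cited paper has at least r citations. The citation counts below are illustrative, not data from the analysis.

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank          # this paper still "covers" its rank
        else:
            break             # ranked list is sorted, so we can stop
    return h

h = h_index([10, 8, 5, 4, 3])   # -> 4: four papers with >= 4 citations each
```

Because it caps both by paper count and by citation count, the h-index rewards sustained output over a single highly cited paper, one reason it is paired here with raw career citation counts.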

  17. Word frequency and content analysis approach to identify demand patterns in a virtual community of carriers of hepatitis C.

    Science.gov (United States)

    Vasconcellos-Silva, Paulo Roberto; Carvalho, Darlinton; Lucena, Carlos

    2013-07-04

    Orkut, a Brazilian virtual social network, has been responsible for the popularization of the Internet among people of low income and educational level. The rapid growth of virtual communities has been made possible by low-cost Internet access in community LAN houses. Orkut is thus an important social resource for Brazilian patients with chronic conditions such as hepatitis C virus (HCV) infection, who face several obstacles in adapting to everyday difficulties. Our aim was to identify Patterns of Recurring Demands (PRD) expressed in messages posted by members of virtual communities dedicated to HCV carriers. Pre-selection: we identified terms commonly associated with HCV in generic Internet searches (primary keywords, Kps); the Kps were used to identify the most representative HCV communities on a virtual community site (Orkut); all messages published over 8 years on all topics of the community were collected and tabulated; and the word frequency was used to construct a "word cloud" (a graphic representation of word frequency) to which a content analysis technique was applied. The most cited terms expressed: searches for information about medications (prescribed and "forbidden"); an emphasis on counting time, which was interpreted as survival expectations; and frequent mention of God, doctors, and "husbands" (68% of carriers were female). These elements provided material for further research; they will be useful in the construction of categories in discourse analysis. The present work is a disclosure of preliminary findings considered original and promising. The word frequency/content analysis approach expressed needs for social support and material assistance that may provide subsidies for further qualitative approaches and public health policies aimed at HCV carriers. The study of PRD by word frequency may be useful in identifying demands underestimated by other means.
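
The word-frequency step behind the "word cloud" amounts to counting terms across the collected messages after dropping stopwords. The messages and the stopword list below are illustrative, not data from the study.

```python
from collections import Counter
import re

# a tiny illustrative stopword list; real studies use a full language-specific one
STOPWORDS = {"the", "a", "to", "and", "is", "my", "i", "of", "for"}

def term_frequencies(messages):
    """Count non-stopword terms across all messages."""
    counts = Counter()
    for msg in messages:
        words = re.findall(r"[a-z']+", msg.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts

messages = [
    "My doctor changed the interferon dose",
    "Waiting for the new treatment, doctor says 6 months",
    "God willing the treatment works",
]
freqs = term_frequencies(messages)
top = freqs.most_common(3)   # "doctor" and "treatment" surface as recurring terms
```

The resulting counts feed directly into a word cloud (font size proportional to frequency) and give the content analyst a ranked vocabulary to group into demand categories.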

  18. Experimental Analysis of Temperature Differences During Implant Site Preparation: Continuous Drilling Technique Versus Intermittent Drilling Technique.

    Science.gov (United States)

    Di Fiore, Adolfo; Sivolella, Stefano; Stocco, Elena; Favero, Vittorio; Stellini, Edoardo

    2018-02-01

    Implant site preparation through drilling procedures may cause bone thermonecrosis. The aim of this in vitro study was to evaluate, using a thermal probe, overheating at implant sites during osteotomies through 2 different drilling methods (continuous drilling technique versus intermittent drilling technique) using irrigation at different temperatures. Five implant sites 13 mm in length were performed on 16 blocks (fresh bovine ribs), for a total of 80 implant sites. The PT-100 thermal probe was positioned 5 mm from each site. Two physiological refrigerant solutions were used: one at 23.7°C and one at 6.0°C. Four experimental groups were considered: group A (continuous drilling with physiological solution at 23.7°C), group B (intermittent drilling with physiological solution at 23.7°C), group C (continuous drilling with physiological solution at 6.0°C), and group D (intermittent drilling with physiological solution at 6.0°C). The Wilcoxon rank-sum test (2-tailed) was used to compare groups. While there was no difference between group A and group B (W = 86; P = .45), statistically significant differences were observed between experimental groups A and C (W = 0; P =.0001), B and D (W = 45; P =.0005), and C and D (W = 41; P = .003). Implant site preparation did not affect the overheating of the bone. Statistically significant differences were found with the refrigerant solutions. Using both irrigating solutions, bone temperature did not exceed 47°C.
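
The Wilcoxon rank-sum statistic W reported above is computed by pooling the two samples, ranking them (averaging ranks over ties), and summing the ranks of one group. A minimal sketch; the temperature values are illustrative, not the study's measurements.

```python
def rank_sum(group_a, group_b):
    """Sum of the ranks of group_a in the pooled, ranked sample."""
    pooled = sorted((v, i) for i, v in enumerate(group_a + group_b))
    ranks = {}
    j = 0
    while j < len(pooled):
        k = j
        # extend over a run of tied values
        while k + 1 < len(pooled) and pooled[k + 1][0] == pooled[j][0]:
            k += 1
        avg = (j + k) / 2 + 1          # average rank for the tied run
        for idx in range(j, k + 1):
            ranks[pooled[idx][1]] = avg
        j = k + 1
    return sum(ranks[i] for i in range(len(group_a)))

cooled = [31.2, 30.8, 31.5, 30.9]      # hypothetical: irrigation at 6.0 C
room = [36.4, 35.9, 36.8, 36.1]        # hypothetical: irrigation at 23.7 C
w = rank_sum(cooled, room)             # -> 10.0, the minimum possible sum of 4 ranks
```

When every value in one group is smaller than every value in the other, as here, W takes its extreme value, which is the situation a very small two-tailed P value (such as the study's P = .0001) reflects.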

  19. Critical analysis of procurement techniques in construction management sectors

    Science.gov (United States)

    Tiwari, Suman Tiwari Suresh; Chan, Shiau Wei; Faraz Mubarak, Muhammad

    2018-04-01

    Over the last three decades, numerous procurement techniques have been among the highlights of Construction Management (CM), spanning projects, management contracting, project management, and design and construct. Owing to the development and utilization of these techniques, various researchers have explored the criteria for their choice and their execution in terms of time, cost and quality. Nevertheless, there is a lack of accounts of the relationship between procurement techniques and more advanced related issues, for example supply chain, sustainability, innovation and technology development, lean construction, constructability, value management, Building Information Modelling (BIM) and e-procurement. Through papers chosen from reputable CM-related academic journals, the specified scopes of these issues are methodically assessed with the objective of exploring the status and trends in procurement-related research. The result of this paper contributes theoretically as well as practically, helping researchers and industrialists to be aware of and appreciate the development of procurement techniques.

  20. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique and the IR Contrast feature imaging application, which are based on the models provided in this paper.
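
A width estimate in the spirit of the half-max technique described above: find where a spatial contrast profile crosses half of its peak value and take the distance between the two crossings, interpolating linearly between samples. The contrast profile below is synthetic, not NASA inspection data.

```python
def half_max_width(xs, ys):
    """Distance between the half-maximum crossings of a single-peak profile."""
    half = max(ys) / 2.0
    crossings = []
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if (y0 - half) * (y1 - half) < 0:            # sign change: a crossing
            # linear interpolation to the exact crossing position
            crossings.append(x0 + (half - y0) * (x1 - x0) / (y1 - y0))
    return crossings[-1] - crossings[0]

xs = [0, 1, 2, 3, 4, 5, 6]                            # pixel position
ys = [0.0, 0.1, 0.8, 1.0, 0.8, 0.1, 0.0]              # contrast across the anomaly
width = half_max_width(xs, ys)                        # crossings near x ~ 1.57 and 4.43
```

The resulting width in pixels, scaled by the camera's spatial resolution, gives the indication size that is compared against the EFBH/EUG diameter estimate.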

  1. An analysis of batting backlift techniques among coached and ...

    African Journals Online (AJOL)

    One of the first principles of cricket batsmanship for batting coaches is to teach junior cricketers to play using a straight bat. This requires the bat to be lifted directly towards the stumps with the bat face facing downwards. No study has yet examined whether there are differences in the batting backlift techniques (BBT) of ...

  2. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)

    the amplitude of H decreases progressively with increasing latitudes at the Indian chain of observatories (Rastogi et al 1997). The aim of this study is to apply the method of multidimensional scaling technique to examine the accuracy of results in comparison with the conventional method of correlation coefficients in the ...
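The multidimensional scaling method mentioned in this record can be sketched in its classical (Torgerson) form: double-center the squared dissimilarity matrix and take the top eigenvectors as coordinates. This is a generic textbook sketch, not the procedure of the cited study.

```python
import numpy as np


def classical_mds(D, k=2):
    """Classical (Torgerson) multidimensional scaling.

    D : (n, n) symmetric matrix of pairwise dissimilarities.
    Returns an (n, k) configuration whose inter-point Euclidean
    distances approximate the entries of D.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]       # keep the largest eigenvalues
    scale = np.sqrt(np.clip(vals[order], 0.0, None))
    return vecs[:, order] * scale
```

For dissimilarities that are exactly Euclidean (for example, three points on a line), the one-dimensional embedding reproduces the input distances exactly; for noisy geomagnetic correlation data the low-dimensional configuration is only an approximation.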

  3. Protease analysis by zymography: a review on techniques and patents.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2009-01-01

    Zymography, the detection of enzymatic activity on gel electrophoresis, has been a technique described in the literature for at least in the past 50 years. Although a diverse amount of enzymes, especially proteases, have been detected, advances and improvements have been slower in comparison with other molecular biology, biotechnology and chromatography techniques. Most of the reviews and patents published focus on the technique as an element for enzymatic testing, but detailed analytical studies are scarce. Patents referring to zymography per se are few and the technique itself is hardly an important issue in titles or keywords in many scientific publications. This review covers a small condensation of the works published so far dealing with the identification of proteolytic enzymes in electrophoretic gel supports and its variations like 2-D zymography, real-time zymography, and in-situ zymography. Moreover, a scope will be given to visualize the new tendencies of this method, regarding substrates used and activity visualization. What to expect from zymography in the near future is also approached.

  4. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10^8 Pa using particle diameters of 1.7 μm. This increases the efficiency, the resolution and the speed of the separation. Four aque...

  5. Sixth Australian conference on nuclear techniques of analysis: proceedings

    International Nuclear Information System (INIS)

    1989-01-01

    These proceedings contain the abstracts of 77 lectures. The topics focus on instrumentation, nuclear techniques and their applications for material science, surfaces, archaeometry, art, geological, environmental and biomedical studies. An outline of the Australian facilities available for research purposes is also provided. Separate abstracts were prepared for the individual papers in this volume

  6. Tape Stripping Technique for Stratum Corneum Protein Analysis

    DEFF Research Database (Denmark)

    Clausen, Maja-Lisa; Slotved, H.-C.; Krogfelt, Karen Angeliki

    2016-01-01

    The aim of this study was to investigate the amount of protein in stratum corneum in atopic dermatitis (AD) patients and healthy controls, using tape stripping technique. Furthermore, to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy ...

  7. Alternative Colposcopy Techniques: A Systematic Review and Meta-analysis

    NARCIS (Netherlands)

    Hermens, M.; Ebisch, R.M.F.; Galaal, K.; Bekkers, R.L.M.

    2016-01-01

    OBJECTIVE: To assess the diagnostic value of alternative (digital) colposcopy techniques for detection of cervical intraepithelial neoplasia (CIN) 2 or worse in a colposcopy population. DATA SOURCES: MEDLINE, EMBASE, ClinicalTrials.gov, and the Cochrane Library were searched from inception up to

  8. Comparative analysis of methods for identifying multimorbidity patterns: a study of 'real-world' data.

    Science.gov (United States)

    Roso-Llorach, Albert; Violán, Concepción; Foguet-Boreu, Quintí; Rodriguez-Blanco, Teresa; Pons-Vigués, Mariona; Pujol-Ribera, Enriqueta; Valderas, Jose Maria

    2018-03-22

    The aim was to compare multimorbidity patterns identified with the two most commonly used methods: hierarchical cluster analysis (HCA) and exploratory factor analysis (EFA) in a large primary care database. Specific objectives were: (1) to determine whether choice of method affects the composition of these patterns and (2) to consider the potential application of each method in the clinical setting. Cross-sectional study. Diagnoses were based on the 263 corresponding blocks of the International Classification of Diseases version 10. Multimorbidity patterns were identified using HCA and EFA. Analysis was stratified by sex, and results compared for each method. Electronic health records for 408 994 patients with multimorbidity aged 45-64 years in 274 primary health care teams from 2010 in Catalonia, Spain. HCA identified 53 clusters for women, with just 12 clusters including at least 2 diagnoses, and 15 clusters for men, all of them including at least two diagnoses. EFA showed 9 factors for women and 10 factors for men. We observed differences by sex and method of analysis, although some patterns were consistent. Three combinations of diseases were observed consistently across sex groups and across both methods: hypertension and obesity, spondylopathies and deforming dorsopathies, and dermatitis eczema and mycosis. This study showed that multimorbidity patterns vary depending on the method of analysis used (HCA vs EFA) and provided new evidence about the known limitations of attempts to compare multimorbidity patterns in real-world data studies. We found that EFA was useful in describing comorbidity relationships and HCA could be useful for in-depth study of multimorbidity. Our results suggest possible applications for each of these methods in clinical and research settings, and add information about some aspects that must be considered in standardisation of future studies: spectrum of diseases, data usage and methods of analysis.
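The hierarchical cluster analysis side of this comparison can be sketched on toy data: patients as binary diagnosis vectors, Jaccard distance between them, and single-linkage agglomeration down to a target number of clusters. This is a deliberately simplified illustration, not the study's actual pipeline (which worked on 263 ICD-10 blocks and hundreds of thousands of records); the linkage choice and the O(n^3) loop are fine only for tiny examples.

```python
def jaccard_distance(a, b):
    """Distance between two patients' binary diagnosis vectors:
    1 - |intersection| / |union| of their diagnoses."""
    union = sum(1 for x, y in zip(a, b) if x or y)
    inter = sum(1 for x, y in zip(a, b) if x and y)
    return 1.0 - inter / union if union else 0.0


def agglomerative(patients, k):
    """Single-linkage agglomerative clustering of binary diagnosis
    vectors, merging the closest pair of clusters until k remain."""
    clusters = [[i] for i in range(len(patients))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(jaccard_distance(patients[a], patients[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]   # merge the closest pair
        del clusters[j]
    return clusters
```

Running this on four patients, two sharing one diagnosis pair and two sharing another, recovers the two expected multimorbidity groups.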

  9. VLBI FOR GRAVITY PROBE B. IV. A NEW ASTROMETRIC ANALYSIS TECHNIQUE AND A COMPARISON WITH RESULTS FROM OTHER TECHNIQUES

    International Nuclear Information System (INIS)

    Lebach, D. E.; Ratner, M. I.; Shapiro, I. I.; Bartel, N.; Bietenholz, M. F.; Lederman, J. I.; Ransom, R. R.; Campbell, R. M.; Gordon, D.; Lestrade, J.-F.

    2012-01-01

    When very long baseline interferometry (VLBI) observations are used to determine the position or motion of a radio source relative to reference sources nearby on the sky, the astrometric information is usually obtained via (1) phase-referenced maps or (2) parametric model fits to measured fringe phases or multiband delays. In this paper, we describe a 'merged' analysis technique which combines some of the most important advantages of these other two approaches. In particular, our merged technique combines the superior model-correction capabilities of parametric model fits with the ability of phase-referenced maps to yield astrometric measurements of sources that are too weak to be used in parametric model fits. We compare the results from this merged technique with the results from phase-referenced maps and from parametric model fits in the analysis of astrometric VLBI observations of the radio-bright star IM Pegasi (HR 8703) and the radio source B2252+172 nearby on the sky. In these studies we use central-core components of radio sources 3C 454.3 and B2250+194 as our positional references. We obtain astrometric results for IM Peg with our merged technique even when the source is too weak to be used in parametric model fits, and we find that our merged technique yields astrometric results superior to the phase-referenced mapping technique. We used our merged technique to estimate the proper motion and other astrometric parameters of IM Peg in support of the NASA/Stanford Gravity Probe B mission.

  10. COMBINED GEOPHYSICAL INVESTIGATION TECHNIQUES TO IDENTIFY BURIED WASTE IN AN UNCONTROLLED LANDFILL AT THE PADUCAH GASEOUS DIFFUSION PLANT, KENTUCKY

    International Nuclear Information System (INIS)

    Miller, Peter T.; Starmer, R. John

    2003-01-01

    survey used a 200 megahertz (MHz) antenna to provide the maximum depth penetration and subsurface detail yielding usable signals to a depth of about 6 to 10 feet in this environment and allowed discrimination of objects that were deeper, particularly useful in the southern area of the site where shallow depth metallic debris (primarily roof flashing) complicated interpretation of the EM and magnetic data. Several geophysical anomalies were defined on the contour plots that indicated the presence of buried metal. During the first phase of the project, nine anomalies or anomalous areas were detected. The sizes, shapes, and magnitudes of the anomalies varied considerably, but given the anticipated size of the primary target of the investigation, only the most prominent anomalies were considered as potential caches of 30 to 60 buried drums. After completion of a second phase investigation, only two of the anomalies were of sufficient magnitude, not identifiable with existing known metallic objects such as monitoring wells, and in positions that corresponded to the location of alleged dumping activities and were recommended for further, intrusive investigation. Other important findings, based on the variable frequency EM method and its combination with total field magnetic and GPR data, included the confirmation of the position of the old NSDD, the ability to differentiate between ferrous and non-ferrous anomalies, and the detection of what may be plumes emanating from the landfill cell

  11. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in operating room, emergency department, cardiac catheterization laboratory. PubMed and Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10)]. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.
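The odds ratios with 95% confidence intervals reported in this abstract follow a standard 2x2-table computation, which can be sketched as below. This is the generic Woolf (log-normal) interval, not the exact pooled computation of the cited meta-analysis; the example counts are made up.

```python
import math


def odds_ratio_ci(success_t, fail_t, success_c, fail_c, z=1.96):
    """Odds ratio with a Woolf (log-normal) 95% CI from a 2x2 table.

    success_t / fail_t : counts in the treatment arm (e.g. ultrasound)
    success_c / fail_c : counts in the control arm (e.g. palpation)
    """
    or_ = (success_t * fail_c) / (fail_t * success_c)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts.
    se = math.sqrt(1 / success_t + 1 / fail_t + 1 / success_c + 1 / fail_c)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

A meta-analysis would compute one such log-OR and standard error per trial and then pool them with inverse-variance weights; this sketch shows only the per-trial step.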

  12. Patent Network Analysis and Quadratic Assignment Procedures to Identify the Convergence of Robot Technologies.

    Directory of Open Access Journals (Sweden)

    Woo Jin Lee

    Because of the remarkable developments in robotics in recent years, technological convergence has been active in this area. We focused on finding patterns of convergence within robot technology using network analysis of patents in both the USPTO and KIPO. To identify the variables that affect convergence, we used quadratic assignment procedures (QAP). From our analysis, we observed the patent network ecology related to convergence and found technologies that have great potential to converge with other robotics technologies. The results of our study are expected to contribute to setting up convergence-based R&D policies for robotics, which can lead to new innovation.
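The quadratic assignment procedure mentioned here is, at its core, a permutation test on matrices: correlate two network matrices, then repeatedly permute the rows and columns of one of them identically and see how often the permuted correlation reaches the observed one. The sketch below is a minimal illustration on small adjacency matrices, not the authors' patent-network analysis; the p-value convention (excluding the observed statistic) is a simplification.

```python
import random


def matrix_correlation(A, B):
    """Pearson correlation over the off-diagonal cells of two
    equally-sized square matrices."""
    n = len(A)
    xs = [A[i][j] for i in range(n) for j in range(n) if i != j]
    ys = [B[i][j] for i in range(n) for j in range(n) if i != j]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


def qap_pvalue(A, B, n_perm=500, seed=1):
    """QAP test: permute rows and columns of B with the same random
    permutation, preserving its structure, and compare correlations."""
    rng = random.Random(seed)
    obs = matrix_correlation(A, B)
    n = len(B)
    count = 0
    for _ in range(n_perm):
        p = list(range(n))
        rng.shuffle(p)
        Bp = [[B[p[i]][p[j]] for j in range(n)] for i in range(n)]
        if matrix_correlation(A, Bp) >= obs:
            count += 1
    return obs, count / n_perm
```

Permuting rows and columns together (rather than shuffling cells independently) is what makes QAP appropriate for network data: it preserves the dependence structure within each row and column.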

  13. Parameter estimation for multistage clonal expansion models from cancer incidence data: A practical identifiability analysis.

    Science.gov (United States)

    Brouwer, Andrew F; Meza, Rafael; Eisenberg, Marisa C

    2017-03-01

    Many cancers are understood to be the product of multiple somatic mutations or other rate-limiting events. Multistage clonal expansion (MSCE) models are a class of continuous-time Markov chain models that capture the multi-hit initiation-promotion-malignant-conversion hypothesis of carcinogenesis. These models have been used broadly to investigate the epidemiology of many cancers, assess the impact of carcinogen exposures on cancer risk, and evaluate the potential impact of cancer prevention and control strategies on cancer rates. Structural identifiability (the analysis of the maximum parametric information available for a model given perfectly measured data) of certain MSCE models has been previously investigated. However, structural identifiability is a theoretical property and does not address the limitations of real data. In this study, we use pancreatic cancer as a case study to examine the practical identifiability of the two-, three-, and four-stage clonal expansion models given age-specific cancer incidence data using a numerical profile-likelihood approach. We demonstrate that, in the case of the three- and four-stage models, several parameters that are theoretically structurally identifiable, are, in practice, unidentifiable. This result means that key parameters such as the intermediate cell mutation rates are not individually identifiable from the data and that estimation of those parameters, even if structurally identifiable, will not be stable. We also show that products of these practically unidentifiable parameters are practically identifiable, and, based on this, we propose new reparameterizations of the model hazards that resolve the parameter estimation problems. Our results highlight the importance of identifiability to the interpretation of model parameter estimates.
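The practical-identifiability phenomenon described above, where only products of parameters are identifiable, can be seen in a toy model much simpler than a multistage clonal expansion model. In the sketch below, assumed for illustration only, the model is y = a*b*x: the profile likelihood of a (minimizing over the nuisance parameter b) is perfectly flat, which is the signature of a non-identifiable parameter whose product a*b is identifiable.

```python
def neg_log_lik(a, b, xs, ys, sigma=1.0):
    """Gaussian negative log-likelihood (up to a constant) for the toy
    model y = a*b*x, in which only the product a*b is identifiable."""
    return sum((y - a * b * x) ** 2 for x, y in zip(xs, ys)) / (2 * sigma ** 2)


def profile_a(a_grid, xs, ys):
    """Profile likelihood of a: for each fixed a, minimize over the
    nuisance parameter b. For y = a*b*x the least-squares product
    a*b equals the regression slope, so the optimal b is slope / a."""
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = sxy / sxx
    return [neg_log_lik(a, slope / a, xs, ys) for a in a_grid]
```

In a numerical profile-likelihood analysis like the one in the paper, the same idea is applied with an optimizer in place of the closed-form minimization: a flat profile over a wide parameter range flags practical non-identifiability, and reparameterizing in terms of the identifiable product restores a well-shaped profile.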

  14. Twelve type 2 diabetes susceptibility loci identified through large-scale association analysis

    DEFF Research Database (Denmark)

    Voight, Benjamin F; Scott, Laura J; Steinthorsdottir, Valgerdur

    2010-01-01

    By combining genome-wide association data from 8,130 individuals with type 2 diabetes (T2D) and 38,987 controls of European descent and following up previously unidentified meta-analysis signals in a further 34,412 cases and 59,925 controls, we identified 12 new T2D association signals...

  15. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    National Research Council Canada - National Science Library

    Hill, Raymond

    2001-01-01

    ... Laboratory, Logistics Research Division, Logistics Readiness Branch to propose a research agenda entitled, "Models, Web-based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance...

  16. A method for identifying compromised clients based on DNS traffic analysis

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup; D’Alconzo, Alessandro

    2017-01-01

    based on DNS traffic analysis. The proposed method identifies suspicious agile DNS mappings, i.e., mappings characterized by fast changing domain names or/and IP addresses, often used by malicious services. The approach discovers clients that have queried domains contained within identified suspicious domain-to-IP mappings, thus assisting in pinpointing potentially compromised clients within the network. The proposed approach targets compromised clients in large-scale operational networks. We have evaluated the proposed approach using an extensive set of DNS traffic traces from different operational ISP networks. The evaluation illustrates a great potential of accurately identifying suspicious domain-to-IP mappings and potentially compromised clients. Furthermore, the achieved performance indicates that the novel detection approach is promising in view of adoption in operational ISP networks...
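The core idea, flag domains whose IP mappings change quickly within an observation window, then report the clients that queried them, can be sketched in a few lines. This is a toy illustration under assumptions of my own (a flat log of (client, domain, ip) tuples and a simple distinct-IP-count threshold), not the detection method of the cited paper.

```python
from collections import defaultdict


def find_agile_mappings(dns_log, ip_threshold=3):
    """Flag 'agile' domain-to-IP mappings and the clients that queried them.

    dns_log: iterable of (client, domain, resolved_ip) tuples observed in
    one time window. A domain resolving to many distinct IPs within the
    window is treated as suspicious; clients that queried any suspicious
    domain are reported as potentially compromised.
    """
    domain_ips = defaultdict(set)
    domain_clients = defaultdict(set)
    for client, domain, ip in dns_log:
        domain_ips[domain].add(ip)
        domain_clients[domain].add(client)
    suspicious = {d for d, ips in domain_ips.items()
                  if len(ips) >= ip_threshold}
    flagged = (set().union(*(domain_clients[d] for d in suspicious))
               if suspicious else set())
    return suspicious, flagged
```

A production system would of course need sliding windows, whitelisting of CDNs (which legitimately map one domain to many IPs), and tuned thresholds; the sketch only shows the grouping logic.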

  17. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

    The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration … and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring … ) identifiability problems. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort.

  18. An analysis of endothelial microparticles as a function of cell surface antibodies and centrifugation techniques.

    Science.gov (United States)

    Venable, Adam S; Williams, Randall R; Haviland, David L; McFarlin, Brian K

    2014-04-01

    Chronic vascular disease is partially characterized by the presence of lesions along the vascular endothelial wall. Current FDA-approved clinical techniques lack the ability to measure very early changes in endothelial cell health. When endothelial cells are damaged, they release endothelial microparticles (EMPs) into circulation. Thus, blood EMP concentration may represent a useful cardiovascular disease biomarker. Despite the potential value of EMPs, current flow cytometry techniques may not consistently distinguish EMPs from other small cell particles. The purpose of this study was to use imaging flow cytometry to modify existing methods of identifying EMPs based on cell-surface receptor expression and visual morphology. Platelet poor plasma (PPP) was isolated using four different techniques, each utilizing a two-step serial centrifugation process. The cell-surface markers used in this study were selected based on those that are commonly reported in the literature. PPP (100 μL) was labeled with CD31, CD42a, CD45, CD51, CD66b, and CD144 for 30 min in the dark on ice. Based on replicated experiments, EMPs were best identified by cell-surface CD144 expression relative to other commonly reported EMP markers (CD31 & CD51). It is important to note that contaminating LMPs, GMPs, and PMPs were thought to be removed in the preparation of PPP. However, upon analysis of prepared samples, staining CD31 against CD51 revealed a double-positive population that was less than 1% EMPs. In contrast, when using CD144 to identify EMPs, ~87% of observed particles were free of contaminating microparticles. Using a counterstain of CD42a, this purity can be improved to over 99%. More research is needed to understand how our improved EMP measurement method can be used in experimental models measuring acute vascular responses or chronic vascular diseases. Copyright © 2014 Elsevier B.V. All rights reserved.
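The marker-based gating logic described here (CD144-positive events counterstained CD42a-negative to exclude platelet microparticles) reduces, in software terms, to boolean masks over per-event fluorescence intensities. The sketch below is purely illustrative: the threshold values and the dict-of-intensities event format are assumptions of mine, not calibrated values or data structures from the study.

```python
def gate_emps(events, cd144_pos=1000.0, cd42a_neg=500.0):
    """Select EMP-like events by marker expression: CD144+ / CD42a-.

    events: list of dicts mapping marker name to fluorescence intensity.
    cd144_pos / cd42a_neg: illustrative gating thresholds (assumed, not
    taken from the paper).
    """
    return [e for e in events
            if e["CD144"] >= cd144_pos and e["CD42a"] < cd42a_neg]
```

Real cytometry gating is done on compensated, log-transformed intensities with gates drawn against controls; the sketch only shows the final boolean selection step.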

  19. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis.

    Science.gov (United States)

    van Genugten, Lenneke; Dusseldorp, Elise; Massey, Emma K; van Empelen, Pepijn

    2017-03-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known on the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary prevention interventions on mental wellbeing in adolescents. Forty interventions were included in the analyses. Techniques were coded into nine categories of SRTs. Meta-analyses were conducted to identify the effectiveness of SRTs, examining three different outcomes: internalising behaviour, externalising behaviour, and self-esteem. Primary interventions had a small-to-medium effect ([Formula: see text] = 0.16-0.29) on self-esteem and internalising behaviour. Secondary interventions had a medium-to-large short-term effect (average [Formula: see text] = 0.56) on internalising behaviour and self-esteem. In secondary interventions, interventions including asking for social support ([Formula: see text]; 95% confidence interval, CI = 1.11-1.98) had a great effect on internalising behaviour. Interventions including monitoring and evaluation had a greater effect on self-esteem ([Formula: see text]; 95% CI = 0.21-0.57). For primary interventions, there was not a single SRT that was associated with a greater intervention effect on internalising behaviour or self-esteem. No effects were found for externalising behaviours. Self-regulation interventions are moderately effective at improving mental wellbeing among adolescents. Secondary interventions promoting 'asking for social support' and promoting 'monitoring and evaluation' were associated with improved outcomes. More research is needed to identify other SRTs or combinations of SRTs that could improve understanding or optimise mental wellbeing interventions.
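Pooling per-intervention effect sizes like the ones reported above is commonly done with a random-effects model. The sketch below implements the standard DerSimonian-Laird estimator on plain lists of effects and variances; it is a generic textbook computation, not the specific procedure of the cited meta-analysis.

```python
import math


def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes (e.g. standardized mean differences)
    with the DerSimonian-Laird random-effects estimator.

    Returns (pooled effect, its standard error, between-study variance tau^2).
    """
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2
```

When the studies are homogeneous, Q falls below its degrees of freedom, tau^2 is truncated to zero, and the estimate collapses to the fixed-effect inverse-variance average, which is the behavior the test below exercises.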

  20. The Class Audit: A Technique for Instructional Analysis

    Science.gov (United States)

    King, William L.

    1978-01-01

    A media specialist observes a class for a semester to identify areas where audiovisual materials might enhance the presentation, and meets with the instructor to discuss potential applications of such materials and plan for the production of specialized items. (STS)