WorldWideScience

Sample records for analysis techniques applied

  1. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  2. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
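
    The ITA technique referred to above has a compact recipe: split the series into two equal halves, sort each, and compare them quantile by quantile; points where the second half exceeds the first indicate an increasing trend at that part of the distribution. A minimal sketch in Python (the synthetic SPI-like series is an illustrative assumption, not the paper's data):

```python
import numpy as np

def innovative_trend_analysis(series):
    """Şen's Innovative Trend Analysis: compare sorted halves of a series.

    Returns the sorted first and second halves; values where the second
    half exceeds the first indicate an increasing trend for that part of
    the distribution (low, medium, or high values).
    """
    n = len(series) // 2
    first = np.sort(series[:n])
    second = np.sort(series[n:2 * n])
    return first, second

# Illustrative use on a synthetic SPI-like series with a drying trend.
rng = np.random.default_rng(0)
spi = rng.normal(0, 1, 600) - 0.002 * np.arange(600)  # hypothetical data
first, second = innovative_trend_analysis(spi)
trend = second - first   # > 0: increase, < 0: decrease at each quantile
print("low values trend:", trend[:200].mean())
print("high values trend:", trend[-200:].mean())
```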

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
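
    As a concrete illustration of such an interval, a common procedure (assumed here for illustration; the presentation's exact worked examples are not reproduced) combines the random and systematic standard uncertainties in quadrature and expands the result by a coverage factor:

```python
import numpy as np

def expanded_uncertainty(readings, u_systematic, k=2.0):
    """Combine random and systematic components in quadrature (RSS).

    readings     : repeated measurements of, e.g., module power (W)
    u_systematic : combined systematic standard uncertainty (W)
    k            : coverage factor (k = 2 is roughly 95% for normal errors)
    """
    mean = np.mean(readings)
    u_random = np.std(readings, ddof=1) / np.sqrt(len(readings))
    u_combined = np.hypot(u_random, u_systematic)
    return mean, k * u_combined

# Hypothetical PV module power measurements (W).
power = [52.1, 51.8, 52.4, 52.0, 51.9]
mean, U = expanded_uncertainty(power, u_systematic=0.3)
print(f"P = {mean:.2f} W +/- {U:.2f} W (k=2)")
```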

  4. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of…

  5. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.
    * Assumes no previous training in statistics
    * Explains how and why modern statistical methods provide more accurate results than conventional methods
    * Covers the latest developments on multiple comparisons
    * Includes recent advances…
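
    A representative robust method from this literature is the trimmed mean with its Tukey-McLaughlin standard error based on the Winsorized variance; a minimal sketch (the data and trim proportion are illustrative assumptions):

```python
import numpy as np
from scipy import stats

def trimmed_mean_se(x, prop=0.2):
    """20% trimmed mean and its Tukey-McLaughlin standard error,
    a standard robust alternative to the sample mean."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    g = int(np.floor(prop * n))
    tmean = stats.trim_mean(x, prop)
    xw = np.clip(x, x[g], x[n - g - 1])          # Winsorize the tails
    sw = np.std(xw, ddof=1)
    se = sw / ((1 - 2 * prop) * np.sqrt(n))
    return tmean, se

data = [2.1, 2.3, 2.2, 2.4, 2.0, 2.2, 9.7]       # one gross outlier
print(trimmed_mean_se(data))                      # robust to the outlier
```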

  6. Applied ALARA techniques

    International Nuclear Information System (INIS)

    Waggoner, L.O.

    1998-01-01

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission to put the facilities in a safe condition, decontaminate, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that, in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  7. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
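
    The report's combination of a principal components analysis with a convergent k-means analysis can be sketched compactly with modern tools; the synthetic three-variable data and cluster count below mirror the abstract but are illustrative assumptions, not the NURE implementation:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in for ~7000 observations of three radiometric variables.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.3, size=(1700, 3))
               for loc in ([1, 1, 1], [2, 1, 2], [1, 2, 1], [4, 4, 4])])

Xs = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Xs)   # decorrelate first
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

# Small clusters with extreme centroids flag outliers
# (e.g., anomalously high Bi-214 in the Copper Mountain data).
for k in range(4):
    print(k, (labels == k).sum(), scores[labels == k].mean(axis=0).round(2))
```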

  8. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of time series analysis techniques to nuclear material accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time, by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data form a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach, which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart control chart, the cumulative summation of inventory differences statistic (CUSUM), and the Kalman filter and linear smoother.
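
    Of the techniques listed, the CUSUM is the most compact to illustrate: it accumulates deviations of the standardized inventory differences from their expected (zero-loss) value and raises an alarm when the sum crosses a decision limit. A minimal sketch (the reference value and limit are conventional textbook choices, not from the report):

```python
import numpy as np

def cusum(inv_diffs, k=0.5, h=5.0):
    """Two-sided tabular CUSUM for standardized inventory differences.

    k : reference value (slack), in standard deviations
    h : decision interval; an alarm is raised when a sum exceeds h
    """
    hi = lo = 0.0
    alarms = []
    for t, z in enumerate(inv_diffs):
        hi = max(0.0, hi + z - k)   # accumulates positive shifts (loss)
        lo = max(0.0, lo - z - k)   # accumulates negative shifts
        if hi > h or lo > h:
            alarms.append(t)
            hi = lo = 0.0           # restart after an alarm
    return alarms

rng = np.random.default_rng(2)
z = rng.normal(0, 1, 60)
z[30:] += 1.0                        # hypothetical protracted 1-sigma loss
print(cusum(z))                      # alarm index(es) after the shift
```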

  9. Spatial analysis techniques applied to uranium prospecting in Chihuahua State, Mexico

    Science.gov (United States)

    Hinojosa de la Garza, Octavio R.; Montero Cabrera, María Elena; Sanín, Luz H.; Reyes Cortés, Manuel; Martínez Meyer, Enrique

    2014-07-01

    To estimate the distribution of uranium minerals in Chihuahua, the advanced statistical model "Maximum Entropy Method" (MaxEnt) was applied. A distinguishing feature of this method is that it can fit more complex models in the case of small datasets, as is the case for the locations of uranium ores in the State of Chihuahua. For georeferencing uranium ores, a database from the United States Geological Survey and a workgroup of experts in Mexico was used. The main contribution of this paper is the proposal of maximum entropy techniques to obtain the mineral's potential distribution. The model used 24 environmental layers, such as topography, gravimetry, climate (WorldClim) and soil properties, to project the uranium distribution across the study area. For the validation of the places predicted by the model, comparisons were made with other research of the Mexican Geological Survey, with direct exploration of specific areas, and through interviews with former exploration workers of the enterprise "Uranio de Mexico". Results: new uranium areas predicted by the model were validated, finding some relationship between the model predictions and geological faults. Conclusions: modeling by spatial analysis provides additional information to the energy and mineral resources sectors.
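
    Presence-only MaxEnt modeling is often approximated by a regularized logistic regression of known sites against random background points; the sketch below uses that stand-in (the environmental layers, site data, and regularization are invented for illustration and are not the paper's setup):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Presence-only modeling in the MaxEnt spirit, approximated here by
# regularized logistic regression of known ore sites against random
# background points (a common stand-in for the Maxent software).
rng = np.random.default_rng(3)

n_bg = 1000                               # random background locations
bg = rng.uniform(0, 1, size=(n_bg, 3))    # 3 hypothetical env. layers
pres = rng.normal([0.7, 0.3, 0.8], 0.1, size=(40, 3))  # 40 known sites

X = np.vstack([pres, bg])
y = np.r_[np.ones(len(pres)), np.zeros(n_bg)]

model = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)
suitability = model.predict_proba(bg)[:, 1]   # relative suitability map
print("most suitable background cells:", np.argsort(suitability)[-5:])
```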

  10. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    1956-01-01

    Basic text for graduate and advanced undergraduate students; deals with the search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics are devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, and more.

  11. Advanced gamma spectrum processing technique applied to the analysis of scattering spectra for determining material thickness

    International Nuclear Information System (INIS)

    Hoang Duc Tam; VNUHCM-University of Science, Ho Chi Minh City; Huynh Dinh Chuong; Tran Thien Thanh; Vo Hoang Nguyen; Hoang Thi Kieu Trang; Chau Van Tao

    2015-01-01

    In this work, an advanced gamma spectrum processing technique is applied to analyze experimental scattering spectra for determining the thickness of C45 heat-resistant steel plates. The single-scattering peak of the scattering spectra is exploited to measure the intensity of singly scattered photons. Based on these results, the thickness of the steel plates is determined with a maximum deviation between real and measured thickness of about 4 %. Monte Carlo simulation using the MCNP5 code is also performed to cross check the results, which yields a maximum deviation of 2 %. These results strongly confirm the capability of this technique in analyzing gamma scattering spectra, which is a simple, effective and convenient method for determining material thickness. (author)
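
    The central spectrum-processing step, isolating the single scattering peak and measuring its intensity, can be sketched as a Gaussian-plus-linear-background fit; the synthetic spectrum and the saturation relation used to invert intensity to thickness are illustrative assumptions, not the authors' calibration:

```python
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, amp, mu, sigma, a, b):
    """Gaussian single-scatter peak on a linear background."""
    return amp * np.exp(-0.5 * ((ch - mu) / sigma) ** 2) + a * ch + b

# Hypothetical scattering spectrum around the single-scatter peak.
ch = np.arange(200, 300, dtype=float)
true = peak_model(ch, 5000, 250, 6, -2, 900)
counts = np.random.default_rng(4).poisson(true).astype(float)

p0 = [counts.max(), ch[np.argmax(counts)], 5, 0, counts.min()]
popt, _ = curve_fit(peak_model, ch, counts, p0=p0)
area = popt[0] * popt[2] * np.sqrt(2 * np.pi)     # net peak intensity

# Illustrative inversion: scattered intensity saturates with thickness t as
# I(t) = I_sat * (1 - exp(-c * t)); solve for t given a calibration I_sat, c.
I_sat, c = 2.0e5, 0.12                             # assumed calibration
t = -np.log(1 - area / I_sat) / c
print(f"net intensity = {area:.0f}, estimated thickness = {t:.2f} mm")
```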

  12. Sensitivity analysis techniques applied to a system of hyperbolic conservation laws

    International Nuclear Information System (INIS)

    Weirs, V. Gregory; Kamm, James R.; Swiler, Laura P.; Tarantola, Stefano; Ratto, Marco; Adams, Brian M.; Rider, William J.; Eldred, Michael S.

    2012-01-01

    Sensitivity analysis comprises techniques to quantify the effects of the input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique by which to understand the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol' method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations are available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing sample sizes and under an increasing number of meta-model evaluations. Highlights: sensitivity analysis techniques for a model shock physics problem are compared; the model problem and the sensitivity analysis problem have exact solutions; subtle details of the method for computing sensitivity indices can affect the results.
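
    First-order variance-based sensitivity indices of the kind evaluated here can be estimated with Saltelli-style sampling; a self-contained sketch on the standard Ishigami test function (the function and sample size are illustrative; the paper's model is the 1-D Riemann problem):

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Standard three-input test function with known Sobol' indices."""
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

rng = np.random.default_rng(5)
n, d = 50_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.r_[fA, fB])

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # replace column i with B's column
    # Saltelli-style estimator of the first-order index S_i
    S_i = np.mean(fB * (ishigami(ABi) - fA)) / var
    print(f"S_{i + 1} ~ {S_i:.3f}")     # analytic: 0.314, 0.442, 0.0
```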

  13. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Featuring newly developed topics and applications of the analysis of longitudinal data, Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of longitudinal data.

  14. Applied Focused Ion Beam Techniques for Sample Preparation of Astromaterials for Integrated Nano-Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Graham, G A; Teslich, N E; Kearsley, A T; Stadermann, F J; Stroud, R M; Dai, Z R; Ishii, H A; Hutcheon, I D; Bajt, S; Snead, C J; Weber, P K; Bradley, J P

    2007-02-20

    Sample preparation is always a critical step in the study of micrometer-sized astromaterials available for study in the laboratory, whether their subsequent analysis is by electron microscopy or secondary ion mass spectrometry. A focused beam of gallium ions has been used to prepare electron-transparent sections from an interplanetary dust particle, as part of an integrated analysis protocol to maximize the mineralogical, elemental, isotopic and spectroscopic information extracted from one individual particle. In addition, focused ion beam techniques have been employed to extract cometary residue preserved on the rims and walls of micro-craters in 1100 series aluminum foils that were wrapped around the sample tray assembly on the Stardust cometary sample collector. Non-ideal surface geometries and inconveniently located regions of interest required creative solutions. These include support pillar construction and relocation of a significant portion of sample to access a region of interest. Serial sectioning, in a manner similar to ultramicrotomy, is a significant development and further demonstrates the unique capabilities of focused ion beam microscopy for sample preparation of astromaterials.

  15. Applying spectral data analysis techniques to aquifer monitoring data in Belvoir Ranch, Wyoming

    Science.gov (United States)

    Gao, F.; He, S.; Zhang, Y.

    2017-12-01

    This study uses spectral data analysis techniques to estimate hydraulic parameters from water level fluctuations due to tidal and barometric effects. All water level data used in this study were collected at Belvoir Ranch, Wyoming. The tide effect can be observed not only in coastal areas but also in inland confined aquifers. The forces caused by the changing positions of the sun and moon affect not only the ocean but also the solid earth. The tide effect applies an oscillatory pumping or injection sequence to the aquifer and can be observed with dense water level monitoring. Belvoir Ranch data are collected once per hour and are thus dense enough to capture the tide effect. First, the de-trended data are transformed from the temporal domain to the frequency domain with the Fourier transform. Then the storage coefficient is estimated using the Bredehoeft-Jacob model. After this, the gain function, which expresses the amplification and attenuation of the output signal, is analyzed to derive the barometric efficiency. Next, the effective porosity is found from the storage coefficient and barometric efficiency with Jacob's model. Finally, aquifer transmissivity and hydraulic conductivity are estimated using Paul Hsieh's method. The estimated hydraulic parameters are compared with those from traditional pumping-test estimation. This study shows that hydraulic parameters can be estimated by analyzing water level data in the frequency domain alone. The approach has the advantages of low cost and environmental friendliness, and should be considered for future hydraulic parameter estimation.
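
    The first two steps, de-trending and transforming to the frequency domain to read off the amplitude at a known tidal constituent, look roughly as follows (hourly synthetic data; the M2 frequency is standard, everything else is assumed):

```python
import numpy as np

# Hourly water levels: synthetic trend + semidiurnal tide (M2) + noise.
rng = np.random.default_rng(6)
hours = np.arange(24 * 180)                    # ~6 months, hourly
f_m2 = 1.9323 / 24.0                           # M2 tide, cycles per hour
level = (0.001 * hours
         + 0.05 * np.cos(2 * np.pi * f_m2 * hours)
         + rng.normal(0, 0.01, hours.size))

detrended = level - np.polyval(np.polyfit(hours, level, 1), hours)
amp = np.abs(np.fft.rfft(detrended)) * 2 / hours.size
freq = np.fft.rfftfreq(hours.size, d=1.0)      # cycles per hour

i = np.argmin(np.abs(freq - f_m2))             # bin nearest to M2
print(f"M2 amplitude ~ {amp[i]:.3f} m")        # feeds the Bredehoeft-Jacob step
```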

  16. Error analysis of the phase-shifting technique when applied to shadow moiré

    International Nuclear Information System (INIS)

    Han, Changwoon; Han, Bongtae

    2006-01-01

    An exact solution for the intensity distribution of shadow moiré fringes produced by a broad-spectrum light is presented. A mathematical study quantifies errors in fractional fringe orders determined by the phase-shifting technique, and its validity is corroborated experimentally. The errors vary cyclically as the distance between the reference grating and the specimen increases. The amplitude of the maximum error is approximately 0.017 fringe, which defines the theoretical limit of resolution enhancement offered by the phase-shifting technique.
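
    The phase-shifting technique under analysis recovers the fractional fringe order from intensity maps taken at known phase steps; the classic four-step version reduces to a single arctangent (a sketch with synthetic fringes, not the authors' data):

```python
import numpy as np

def four_step_phase(I0, I1, I2, I3):
    """Four-step phase shifting: frames at 0, 90, 180, 270 degrees.
    Returns the wrapped phase, i.e. fractional fringe order times 2*pi."""
    return np.arctan2(I3 - I1, I0 - I2)

# Synthetic shadow-moire-like fringe patterns for a tilted surface.
x = np.linspace(0, 4 * np.pi, 512)
phase_true = x[None, :] * np.ones((64, 1))
frames = [100 + 50 * np.cos(phase_true + k * np.pi / 2) for k in range(4)]

wrapped = four_step_phase(*frames)
err = np.angle(np.exp(1j * (wrapped - phase_true)))  # wrap-aware error
print("max |phase error| (rad):", np.abs(err).max())
```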

  17. Nuclear and conventional techniques applied to the analysis of Purhepecha metals of the Pareyon collection

    International Nuclear Information System (INIS)

    Mendez, U.; Tenorio C, D.; Ruvalcaba, J.L.; Lopez, J.A.

    2005-01-01

    The main objective of this investigation was to determine, by means of the nuclear techniques PIXE and RBS as well as conventional ones, the composition and microstructure of 13 metallic artifacts elaborated from copper and gold. The artifacts were part of the offering of a Tarascan personage located in the 'Matamoros' porch in Uruapan, Michoacan, Mexico. (Author)

  18. Complementary analysis techniques applied on optimizing suspensions of yttria stabilized zirconia

    DEFF Research Database (Denmark)

    Della Negra, Michela; Foghmoes, Søren Preben Vagn; Klemensø, Trine

    2016-01-01

    Three different polymers with different functional groups and similar molecular weight were tested as dispersing agents for suspensions of yttria stabilized zirconia in ethanol: polyvinyl pyrrolidone, polyethylene imine, and polyvinyl butyral/acetal. The stability of the system was assessed considering, in detail, all the processing steps, including suspension de-agglomeration, slurry manipulation, quality of the sintered tape microstructure, and final layer leak tightness. Different analytical techniques were used to monitor ceramic de-agglomeration and stability as a function of time for the different types of dispersant, revealing the excellent performance of polyvinyl pyrrolidone and polyethylene imine as dispersing agents. The stability and dispersing power were finally utilized for preparing concentrated suspensions for tape casting, and subsequently for sintering the tapes into dense ceramic pieces.

  1. Objective frontal analysis techniques applied to extreme/non-extreme precipitation events

    Czech Academy of Sciences Publication Activity Database

    Kašpar, Marek

    2003-01-01

    Roč. 47, - (2003), s. 605-631 ISSN 0039-3169 R&D Projects: GA ČR GA205/00/1451 Institutional research plan: CEZ:AV0Z3042911 Keywords: NWP model * synoptic scale * objective analysis * atmospheric front * frontogenesis Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 0.426, year: 2003

  2. Analysis of Data Sets Using Hybrid Techniques of Applied Artificial Intelligence in the Integrated Design of Production Systems

    Directory of Open Access Journals (Sweden)

    Daniel-Petru GHENCEA

    2017-06-01

    The paper proposes a model for predicting spindle behavior from the point of view of thermal deformation and vibration level, by extracting and processing the characteristic equations. Behavior analysis for spindles with similar electro-mechanical characteristics can be achieved using a hybrid analysis based on artificial intelligence (genetic algorithms, artificial neural networks, fuzzy logic). The paper presents a prediction approach that obtains valid ranges of values for spindles with similar characteristics, based on data sets measured on a few test spindles, without additional measurements being required. Extracting polynomial functions from the graphs resulting from simultaneous measurements, and predicting the dynamics of the two characteristics with a multi-objective criterion, are the main advantages of this method.

  3. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual

    International Nuclear Information System (INIS)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.

  4. Pre-analysis techniques applied to area-based correlation aiming at Digital Terrain Model generation

    Directory of Open Access Journals (Sweden)

    Maurício Galo

    2005-12-01

    Area-based matching is a useful procedure in some photogrammetric processes, and its results are of crucial importance in applications such as relative orientation, phototriangulation and Digital Terrain Model generation. The successful determination of correspondence depends on radiometric and geometric factors. Considering these aspects, the use of procedures that previously estimate the quality of the parameters to be computed is a relevant issue. This paper describes these procedures and shows that the quality prediction can be computed before performing matching by correlation, through the analysis of the reference window. This procedure can be incorporated in the correspondence process for Digital Terrain Model generation and phototriangulation. The proposed approach comprises the estimation of the variance matrix of the translations from the gray levels in the reference window, and the reduction of the search space using knowledge of the epipolar geometry. As a consequence, the correlation process becomes more reliable, avoiding the application of matching procedures in doubtful areas. Some experiments with simulated and real data are presented, evidencing the efficiency of the studied strategy.
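
    Area-based matching of the kind discussed scores candidate positions of a reference window inside a search image with normalized cross-correlation; a minimal sketch (synthetic image; the epipolar reduction of the search space is omitted):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match(reference, search):
    """Exhaustive area-based matching: best NCC offset of reference in search."""
    rh, rw = reference.shape
    sh, sw = search.shape
    best, best_off = -2.0, (0, 0)
    for i in range(sh - rh + 1):
        for j in range(sw - rw + 1):
            score = ncc(reference, search[i:i + rh, j:j + rw])
            if score > best:
                best, best_off = score, (i, j)
    return best_off, best

rng = np.random.default_rng(7)
img = rng.normal(size=(80, 80))
ref = img[30:45, 50:65]              # 15x15 reference window
print(match(ref, img))               # expect offset (30, 50), score ~1.0
```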

  5. Analysis and Design of International Emission Trading Markets Applying System Dynamics Techniques

    Science.gov (United States)

    Hu, Bo; Pickl, Stefan

    2010-11-01

    The design and analysis of international emission trading markets is an important current challenge. Time-discrete models are needed to understand and optimize these procedures. We give an introduction to this scientific area and present current modeling approaches. Furthermore, we develop a model which is embedded in a holistic problem solution. Measures for energy efficiency are characterized. The economic time-discrete "cap-and-trade" mechanism is influenced by various underlying anticipatory effects. With a systematic dynamic approach the effects can be examined. First numerical results show that fair international emissions trading can only be conducted with the use of protective export duties. Furthermore, according to our model, a comparatively high price which evokes emission reduction inevitably has an inhibiting effect on economic growth. As has always been expected, it is not without difficulty to find a balance between economic growth and emission reduction. Using our System Dynamics model simulation, it can be anticipated that substantial changes must take place before international emissions trading markets can contribute to global GHG emissions mitigation.
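
    A time-discrete cap-and-trade loop of the sort modeled in System Dynamics can be sketched as a few stock-and-flow update equations; all coefficients below are invented for illustration and are not the paper's calibration:

```python
# Toy discrete-time cap-and-trade dynamics (System Dynamics style):
# emissions respond to the permit price, the price responds to scarcity.
cap = 100.0          # permits issued per period
emissions = 120.0    # current emission level
price = 5.0          # permit price
growth = 0.02        # autonomous emission growth per period (assumed)
elasticity = 0.004   # emission reduction per unit price (assumed)
adjust = 0.05        # price adjustment speed (assumed)

for t in range(20):
    shortage = emissions - cap                   # permit demand vs. the cap
    price = max(0.0, price + adjust * shortage)  # scarcity drives the price
    emissions *= (1 + growth) * (1 - elasticity * price)
    print(f"t={t:2d}  price={price:6.2f}  emissions={emissions:7.2f}")
```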

  6. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms, able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility, new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm were compared for the reconstruction of a biological sample of Caenorhabditis elegans, a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.
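
    The MLEM update at the heart of such reconstructions is compact enough to show directly; a dense-matrix sketch on a toy system (the real software operates on PIXE-T sinograms and is GPU-accelerated, both omitted here):

```python
import numpy as np

def mlem(A, b, n_iter=50):
    """Maximum Likelihood Expectation Maximisation for b ~ A @ x, x >= 0.

    A : (m, n) system matrix (projection geometry / detector response)
    b : (m,) measured counts (e.g., X-ray yields per projection)
    """
    x = np.ones(A.shape[1])
    norm = A.sum(axis=0)                     # sensitivity of each voxel
    for _ in range(n_iter):
        proj = A @ x
        ratio = np.divide(b, proj, out=np.zeros_like(b), where=proj > 0)
        x *= (A.T @ ratio) / np.maximum(norm, 1e-12)
    return x

# Toy 1-D "tomography": blurred observations of a two-spike object.
rng = np.random.default_rng(8)
A = np.abs(rng.normal(size=(40, 10))) * 0.1 + np.eye(40, 10)  # assumed geometry
x_true = np.zeros(10)
x_true[[2, 7]] = [5.0, 3.0]
b = rng.poisson(A @ x_true).astype(float)
print(np.round(mlem(A, b), 2))               # spikes recovered at 2 and 7
```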

  7. Analysis of Arbitrary Reflector Antennas Applying the Geometrical Theory of Diffraction Together with the Master Points Technique

    Directory of Open Access Journals (Sweden)

    María Jesús Algar

    2013-01-01

    An efficient approach for the analysis of arbitrarily fed, surface-conformed reflector antennas is presented. The near field at a large number of sampling points in the aperture of the reflector is obtained by applying the Geometrical Theory of Diffraction (GTD). A new technique named Master Points has been developed to reduce the complexity of the ray-tracing computations. The combination of GTD and Master Points reduces the time requirements of this kind of analysis. To validate the new approach, several reflectors, and the effects on the radiation pattern caused by shifting the feed and introducing different obstacles, have been considered, for both simple and complex geometries. The results of these analyses have been compared with Method of Moments (MoM) results.

  8. Results of Applying Cultural Domain Analysis Techniques and Implications for the Design of Complementary Feeding Interventions in Northern Senegal.

    Science.gov (United States)

    Zobrist, Stephanie; Kalra, Nikhila; Pelto, Gretel; Wittenbrink, Brittney; Milani, Peiman; Diallo, Abdoulaye Moussa; Ndoye, Tidiane; Wone, Issa; Parker, Megan

    2017-12-01

    Designing effective nutrition interventions for infants and young children requires knowledge about the population to which the intervention is directed, including insights into the cognitive systems and values that inform caregiver feeding practices. To apply cultural domain analysis techniques in the context of implementation research for the purpose of understanding caregivers' knowledge frameworks in Northern Senegal with respect to infant and young child (IYC) feeding. This study was intended to inform decisions for interventions to improve infant and young child nutrition. Modules from the Focused Ethnographic Study for Infant and Young Child Feeding Manual were employed in interviews with a sample of 126 key informants and caregivers from rural and peri-urban sites in the Saint-Louis region of northern Senegal. Descriptive statistics, cluster analysis, and qualitative thematic analysis were used to analyze the data. Cluster analysis showed that caregivers identified 6 food clusters: heavy foods, light foods, snack foods, foraged foods, packaged foods, and foods that are good for the body. The study also revealed similarities and differences between the 2 study sites in caregivers' knowledge frameworks. The demonstration of differences between biomedical concepts of nutrition and the knowledge frameworks of northern Senegalese women with regard to IYC feeding highlights the value of knowledge about emic perspectives of local communities to help guide decisions about interventions to improve nutrition.

  9. Applying CFD in the Analysis of Heavy-Oil Transportation in Curved Pipes Using Core-Flow Technique

    Directory of Open Access Journals (Sweden)

    S Conceição

    2017-06-01

    Multiphase flow of oil, gas and water occurs in the petroleum industry from the reservoir to the processing units. The occurrence of heavy oils in the world is increasing significantly and points to the need for greater investment in reservoir exploitation and, consequently, for the development of new technologies for the production and transport of this oil. Therefore, it is of interest to improve techniques to ensure an increase in energy efficiency in the transport of this oil. The core-flow technique is one of the most advantageous methods of lifting and transporting oil. It does not alter the oil viscosity but changes the flow pattern, reducing friction during heavy oil transportation. This flow pattern is characterized by a thin water film that forms close to the inner wall of the pipe, acting as a lubricant for the oil flowing in the core of the pipe. In this sense, the objective of this paper is to study the isothermal flow of heavy oil in curved pipelines employing the core-flow technique. A three-dimensional, transient and isothermal mathematical model that considers the mixture and k-ε turbulence models to address the gas-water-heavy-oil three-phase flow in the pipe was applied for the analysis. Simulations with different flow patterns of the involved phases (oil-gas-water) have been done in order to optimize the transport of heavy oils. Results of pressure and volumetric fraction distributions of the involved phases are presented and analyzed. It was verified that the oil core, lubricated by a thin water layer flowing in the pipe, considerably decreases the pressure drop.

  10. Processing techniques applying laser technology

    International Nuclear Information System (INIS)

    Yamada, Yuji; Makino, Yoshinobu

    2000-01-01

    The requirements for the processing of nuclear energy equipment include high precision, low distortion, and low heat input. Toshiba has developed laser processing techniques for cutting, welding, and surface heat treatment of nuclear energy equipment because the zone affected by distortion and heat in laser processing is very small. Laser processing contributes to the manufacturing of high-quality and high-reliability equipment and reduces the manufacturing period. (author)

  11. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    Science.gov (United States)

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half-pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring (gross body path, splash area, and board tip motion), to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than with the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization.

  12. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the…

  13. Applied Behavior Analysis

    Science.gov (United States)

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  14. Applied nonstandard analysis

    CERN Document Server

    Davis, Martin

    2005-01-01

    Geared toward upper-level undergraduates and graduate students, this text explores the applications of nonstandard analysis without assuming any knowledge of mathematical logic. It develops the key techniques of nonstandard analysis at the outset from a single, powerful construction; then, beginning with a nonstandard construction of the real number system, it leads students through a nonstandard treatment of the basic topics of elementary real analysis, topological spaces, and Hilbert space. Important topics include nonstandard treatments of equicontinuity, nonmeasurable sets, and the existence…

  15. Cointegration versus traditional econometric techniques in applied economics

    OpenAIRE

    Joachim Zietz

    2000-01-01

    The paper illustrates some of the well-known problems with cointegration analysis in order to provide some perspective on the usefulness of cointegration techniques in applied economics. A number of numerical examples are employed to compare econometric estimation on the basis of both traditional autoregressive distributed lag models and currently popular cointegration techniques. The results suggest that, first, cointegration techniques need to be applied with great care and that, second, th...
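
    The canonical cointegration check discussed in this literature is the Engle-Granger two-step test: regress one series on the other and test the residuals for a unit root. A sketch on synthetic series using statsmodels, whose coint function implements this test:

```python
import numpy as np
from statsmodels.tsa.stattools import coint

# Two I(1) series sharing a common stochastic trend (cointegrated by design).
rng = np.random.default_rng(9)
trend = np.cumsum(rng.normal(size=500))          # random walk
x = trend + rng.normal(scale=0.5, size=500)
y = 2.0 * trend + 1.0 + rng.normal(scale=0.5, size=500)

t_stat, p_value, _ = coint(y, x)                 # Engle-Granger test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")    # small p -> cointegrated
```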

  16. New technique of insitu soil moisture sampling for environmental isotope analysis applied at 'Pilat-dune' near Bordeaux

    International Nuclear Information System (INIS)

    Thoma, G.; Esser, N.; Sonntag, C.; Weiss, W.; Rudolph, J.; Leveque, P.

    1978-01-01

    A new soil-air suction method with soil water vapor adsorption by a 4 Å molecular sieve provides soil moisture samples from various depths for environmental isotope analysis and yields soil temperature profiles. A field tritium tracer experiment shows that this in situ sampling method has an isotope profile resolution of only about 5-10 cm. Application of this method in the Pilat sand dune (Bordeaux, France) yielded deuterium and tritium profiles down to 25 meters depth. Bomb tritium measurements of monthly lysimeter percolate samples available since 1961 show that the tritium response has a mean delay of 5 months in the case of a sand lysimeter and of 2.5 years for a loess loam lysimeter. A simple HETP model simulates the layered downward movement of soil water and the longitudinal dispersion in the lysimeters. Field capacity and evapotranspiration, taken as open parameters, yield tritium concentration values of the lysimeters' percolate which are in close agreement with the experimental results. Based on local meteorological data, the HETP model applied to tritium tracer experiments in the unsaturated zone further yields an individual prediction of the momentary tracer position and of the soil moisture distribution. This prediction can be checked experimentally at selected intervals by coring.
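
    An HETP model of this kind treats the soil column as a cascade of well-mixed cells through which percolating water carries the tracer; a minimal sketch (cell count, water contents, and the tracer pulse are assumed for illustration):

```python
import numpy as np

def hetp_cascade(inflow_conc, n_cells=10, cell_water=50.0, percolation=10.0):
    """Mixing-cell (HETP) model of tracer movement down a soil column.

    inflow_conc : tracer concentration of the infiltrating water per step
    cell_water  : water stored in each cell (mm); sets the plate height
    percolation : water passed from cell to cell each step (mm)
    """
    mass = np.zeros(n_cells)                        # tracer mass per cell
    out = []
    for c_in in inflow_conc:
        conc = mass / cell_water                    # concentration per cell
        inflow = percolation * np.r_[c_in, conc[:-1]]  # mass entering each cell
        outflow = percolation * conc                # mass leaving each cell
        mass += inflow - outflow
        out.append(conc[-1])                        # bottom-cell concentration
    return np.array(out)

# A single tritium pulse in month 3, then tracer-free infiltration.
inflow_conc = np.zeros(120)
inflow_conc[3] = 100.0
breakthrough = hetp_cascade(inflow_conc)
print("peak breakthrough at month", int(np.argmax(breakthrough)))
```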

  17. Functional reasoning, explanation and analysis: Part 1: a survey on theories, techniques and applied systems. Part 2: qualitative function formation technique

    International Nuclear Information System (INIS)

    Far, B.H.

    1992-01-01

    Functional Reasoning (FR) enables people to derive the purpose of objects and explain their functions. JAERI's Human Acts Simulation Program (HASP), started in 1987, has the goal of developing programs of the underlying technologies for intelligent robots by imitating the intelligent behavior of humans. FR is considered a useful reasoning method in HASP and is applied to understand the function of tools and objects in the Toolbox Project. In this report, first, the results of the diverse FR researches within a variety of disciplines are reviewed and the common core and basic problems are identified. Then the qualitative function formation (QFF) technique is introduced. Some novel points are: extending the common qualitative models to include interactions and timing of events by defining temporal and dependency constraints, and binding this with conventional qualitative simulation. Function concepts are defined as interpretations of either a persistence or an order in the sequence of states, using the trace of the qualitative state vector derived by qualitative simulation on the extended qualitative model. This offers a solution to some of the FR problems and leads to a method for generalization and comparison of functions of different objects. (author) 85 refs.

  18. Early counterpulse technique applied to vacuum interrupters

    International Nuclear Information System (INIS)

    Warren, R.W.

    1979-11-01

    Interruption of dc currents using counterpulse techniques is investigated with vacuum interrupters and a novel approach in which the counterpulse is applied before contact separation. Important increases have been achieved in this way in the maximum interruptible current, as well as large reductions in contact erosion. The factors establishing these new limits are presented, and ways to make further improvements are discussed.

  19. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  20. Geomatics techniques applied to time series of aerial images for multitemporal geomorphological analysis of the Miage Glacier (Mont Blanc).

    Science.gov (United States)

    Perotti, Luigi; Carletti, Roberto; Giardino, Marco; Mortara, Giovanni

    2010-05-01

    The Miage glacier is the major glacier on the Italian side of the Mont Blanc Massif, the third by area and the first by longitudinal extent among Italian glaciers. It has been a typical debris-covered glacier since the end of the L.I.A. The debris coverage reduces ablation, allowing a relative stability of the glacier terminus, which is characterized by a wide and articulated moraine apparatus. For its conservative landforms, the Miage Glacier has a great importance for the analysis of the geomorphological response to recent climatic changes. Thanks to an organized existing archive of multitemporal aerial images (1935 to present), a photogrammetric approach has been applied to detect recent geomorphological changes in the Miage glacial basin. The research team provided: a) to digitize all the available images (still in analogue form) through photogrammetric scanners (very low image distortion devices), taking care to define the acquisition resolution correctly with respect to the mapping scale the images are suitable for; b) to import the digitized images into an appropriate digital photogrammetry software environment; c) to manage the images in order, where possible, to carry out the stereo model orientation necessary for 3D navigation and plotting of critical geometric features of the glacier (recognized geometric features, referring to different periods, can be transferred to vector layers and imported into a GIS for further comparisons and investigations); d) to produce multi-temporal Digital Elevation Models for glacier volume changes; e) to perform orthoprojection of such images to obtain multitemporal orthoimages useful for areal and planar terrain evaluation and thematic analysis; f) to evaluate both the planimetric positioning and height determination accuracies reachable through the photogrammetric process. Users have to know the reliability of the measures they can make on such products; this can drive them to define the applicable field of this approach and this can help them to…

  1. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...
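
    The survival-curve estimation the book begins with is the Kaplan-Meier product-limit estimator; although the book works in R, a sketch is given here in Python for consistency with the other examples (toy times and censoring flags are assumptions):

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier product-limit estimator of the survival function.

    time  : follow-up times
    event : 1 if the event occurred, 0 if censored
    Returns the distinct event times and the estimated S(t) at each.
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    times, surv, s = [], [], 1.0
    for t in np.unique(time[event == 1]):
        at_risk = np.sum(time >= t)
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk
        times.append(t)
        surv.append(s)
    return np.array(times), np.array(surv)

# Toy data: event=0 marks censored follow-up.
t = [6, 7, 10, 15, 19, 25]
e = [1, 0, 1, 1, 0, 1]
print(*zip(*kaplan_meier(t, e)))   # (time, S(t)) pairs
```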

  2. Nuclear techniques (PIXE and RBS) applied to analysis of pre hispanic metals of the Templo Mayor at Tenochtitlan

    International Nuclear Information System (INIS)

    Mendez U, I.; Tenorio, D.; Galvan, J.L.

    2000-01-01

    This work has the objective of determining, by means of the nuclear techniques PIXE and RBS, the composition and the alloy type of diverse Aztec ornaments of the Post-classic period, manufactured principally of copper and gold, such as bells, beads and disks, all belonging to 9 oblations of the Templo Mayor of Tenochtitlan. The historical and archaeological antecedents of the objects are briefly presented, as well as the analytical methods, concluding with the results obtained. (Author)

  3. Analysis of the accelerated crucible rotation technique applied to the gradient freeze growth of cadmium zinc telluride

    Science.gov (United States)

    Divecha, Mia S.; Derby, Jeffrey J.

    2017-06-01

    We employ finite-element modeling to assess the effects of the accelerated crucible rotation technique (ACRT) on cadmium zinc telluride (CZT) crystals grown from a gradient freeze system. Via consideration of tellurium segregation and transport, we show, for the first time, that steady growth from a tellurium-rich melt produces persistent undercooling in front of the growth interface, likely leading to morphological instability. The application of ACRT rearranges melt flows and tellurium transport but, in contrast to conventional wisdom, does not altogether eliminate undercooling of the melt. Rather, a much more complicated picture arises, where spatio-temporal realignment of undercooled melt may act to locally suppress instability. A better understanding of these mechanisms and quantification of their overall effects will allow for future growth optimization.

  4. Modern problems in applied analysis

    CERN Document Server

    Rogosin, Sergei

    2018-01-01

    This book features a collection of recent findings in Applied Real and Complex Analysis that were presented at the 3rd International Conference “Boundary Value Problems, Functional Equations and Applications” (BAF-3), held in Rzeszow, Poland on 20-23 April 2016. The contributions presented here develop a technique related to the scope of the workshop and touching on the fields of differential and functional equations, complex and real analysis, with a special emphasis on topics related to boundary value problems. Further, the papers discuss various applications of the technique, mainly in solid mechanics (crack propagation, conductivity of composite materials), biomechanics (viscoelastic behavior of the periodontal ligament, modeling of swarms) and fluid dynamics (Stokes and Brinkman type flows, Hele-Shaw type flows). The book is addressed to all readers who are interested in the development and application of innovative research results that can help solve theoretical and real-world problems.

  5. Applied Behavior Analysis in Education.

    Science.gov (United States)

    Cooper, John O.

    1982-01-01

    Applied behavior analysis in education is expanding rapidly. This article describes the dimensions of applied behavior analysis and the contributions this technology offers teachers in the areas of systematic applications, direct and daily measurement, and experimental methodology. (CJ)

  6. Motion Capture Technique Applied Research in Sports Technique Diagnosis

    Directory of Open Access Journals (Sweden)

    Zhiwu LIU

    2014-09-01

    Full Text Available The paper describes the definition of a motion capture technology system and examines its components. Key parameters are obtained from the motion technique, quantitative analysis is performed on technical movements, and a motion capture method is proposed for sports technique diagnosis. The motion capture procedure includes calibrating the system, attaching landmarks to the tester, capturing the trajectories, and analyzing the collected data.

  7. Applying DEA Technique to Library Evaluation in Academic Research Libraries.

    Science.gov (United States)

    Shim, Wonsik

    2003-01-01

    This study applied an analytical technique called Data Envelopment Analysis (DEA) to calculate the relative technical efficiency of 95 academic research libraries, all members of the Association of Research Libraries. DEA, with the proper model of library inputs and outputs, can reveal best practices in the peer groups, as well as the technical…

  8. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  9. Nuclear analytical techniques applied to forensic chemistry

    International Nuclear Information System (INIS)

    Nicolau, Veronica; Montoro, Silvia; Pratta, Nora; Giandomenico, Angel Di

    1999-01-01

    Gunshot residues produced by firing guns are mainly composed of visible particles. The individual characterization of these particles allows distinguishing those containing heavy metals, originating from gunshot residues, from those having a different origin or history. In this work, the results obtained from the study of gunshot residue particles collected from hands are presented. The aim of the analysis is to establish whether a person has fired a gun or has been in contact with one after the shot was produced. As reference samples, particles collected from the hands of persons engaged in different activities were studied for comparison. The complete study was based on the application of nuclear analytical techniques such as Scanning Electron Microscopy, Energy Dispersive X Ray Electron Probe Microanalysis and Graphite Furnace Atomic Absorption Spectrometry. The assays can be completed within a time compatible with forensic requirements. (author)

  10. Applied multivariate statistical analysis

    National Research Council Canada - National Science Library

    Johnson, Richard Arnold; Wichern, Dean W

    1988-01-01

    … The authors hope that their discussions will meet the needs of experimental scientists, in a wide variety of subject matter areas, as a readable introduction to the statistical analysis of multivariate observations…

  11. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed as a crash course for beginning graduate students majoring in something besides mathematics, introducing the mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces, and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  12. Applying Brainstorming Techniques to EFL Classroom

    OpenAIRE

    Toshiya, Oishi; 湘北短期大学; Part-time Lecturer at Shohoku College

    2015-01-01

    This paper focuses on brainstorming techniques for English language learners. From the author's teaching experiences at Shohoku College during the academic year 2014-2015, the importance of brainstorming techniques was made evident. The author explored three elements of brainstorming techniques for writing using literature reviews: lack of awareness, connecting to prior knowledge, and creativity. The literature reviews showed the advantage of using brainstorming techniques in an English compos...

  13. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    Science.gov (United States)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained by using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modeling bivariate extreme values which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as would be estimated by assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
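
    A minimal Python sketch of the marginal step described above, i.e. fitting a Generalized Pareto distribution to a Partial Duration Series; the data, threshold and sampling rate are synthetic placeholders, not the Elwood observations:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        surge = rng.gumbel(loc=0.3, scale=0.15, size=20_000)    # hypothetical hourly sea-surge record

        # Partial Duration Series: exceedances over a high threshold
        threshold = np.quantile(surge, 0.995)
        excess = surge[surge > threshold] - threshold

        # Fit the Generalized Pareto distribution to the threshold excesses
        shape, _, scale = stats.genpareto.fit(excess, floc=0.0)

        # Return level for a T-year event, given the yearly exceedance rate
        rate = excess.size / (surge.size / (24 * 365))           # exceedances per year
        T = 100.0
        level = threshold + stats.genpareto.ppf(1 - 1 / (T * rate), shape, loc=0.0, scale=scale)
        print(f"xi={shape:.3f}, sigma={scale:.3f}, {T:.0f}-year level={level:.2f} m")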

  14. Nuclear radioactive techniques applied to materials research

    CERN Document Server

    Correia, João Guilherme; Wahl, Ulrich

    2011-01-01

    In this paper we review materials characterization techniques using radioactive isotopes at the ISOLDE/CERN facility. At ISOLDE intense beams of chemically clean radioactive isotopes are provided by selective ion-sources and high-resolution isotope separators, which are coupled on-line with particle accelerators. There, new experiments are performed by an increasing number of materials researchers, which use nuclear spectroscopic techniques such as Mössbauer, Perturbed Angular Correlations (PAC), beta-NMR and Emission Channeling with short-lived isotopes not available elsewhere. Additionally, diffusion studies and traditionally non-radioactive techniques as Deep Level Transient Spectroscopy, Hall effect and Photoluminescence measurements are performed on radioactive doped samples, providing in this way the element signature upon correlation of the time dependence of the signal with the isotope transmutation half-life. Current developments, applications and perspectives of using radioactive ion beams and tech...

  15. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  16. Laser and plasma dental soldering techniques applied to Ti-6Al-4V alloy: ultimate tensile strength and finite element analysis.

    Science.gov (United States)

    Castro, Morgana G; Araújo, Cleudmar A; Menegaz, Gabriela L; Silva, João Paulo L; Nóbilo, Mauro Antônio A; Simamoto Júnior, Paulo Cézar

    2015-05-01

    The literature provides limited information regarding the performance of Ti-6Al-4V laser and plasma joints welded in prefabricated bars in dental applications. The purpose of this study was to evaluate the mechanical strength of different diameters of Ti-6Al-4V alloy welded with laser and plasma techniques. Forty-five dumbbell-shaped rods were created from Ti-6Al-4V and divided into 9 groups (n=5): a control group with intact 3-mm bars; groups PL2.5, PL3, PL4, and PL5 (specimens with 2.5-, 3-, 4-, and 5-mm diameters welded with plasma); and groups L2.5, L3, L4, and L5 (specimens with 2.5-, 3-, 4-, and 5-mm diameters welded with laser). The specimens were tested for ultimate tensile strength (UTS), and elongation percentages (EP) were obtained. Fractured specimens were analyzed by stereomicroscopy, and welded area percentages (WAP) were calculated. Images were made with scanning electron microscopy. In the initial analysis, the data were analyzed with a 2-way ANOVA (2×4) and the Tukey Honestly Significant Difference (HSD) test. In the second analysis, the UTS and EP data were analyzed with a 1-way ANOVA, and the Dunnett test was used to compare the 4 experimental groups with the control group (α=.05). The Pearson and Spearman correlation coefficient tests were applied to correlate the study factors. Finite element models were developed in a workbench environment with boundary conditions simulating those of a tensile test. The 2-way ANOVA showed that the factors welding type and diameter were significant for the UTS and WAP values; however, the interaction between them was not. The 1-way ANOVA showed statistically significant differences among the groups for the UTS, WAP, and EP values. The Dunnett test showed that all the tested groups had lower UTS and EP values than the control group. The 2.5- and 3-mm diameter groups showed higher values for UTS and WAP than the other test groups. A positive correlation was found between welded area percentage and UTS.

  17. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned...

  18. Basic principles of applied nuclear techniques

    International Nuclear Information System (INIS)

    Basson, J.K.

    1976-01-01

    The technological applications of radioactive isotopes and radiation in South Africa have grown steadily since the first consignment of man-made radioisotopes reached this country in 1948. By the end of 1975 there were 412 authorised non-medical organisations (327 industries) using hundreds of sealed sources, as well as their fair share of the thousands of radioisotope consignments either imported or produced locally each year (mainly for medical purposes). Consequently, it is necessary for South African technologists to understand the principles of radioactivity in order to appreciate the industrial applications of nuclear techniques [af

  19. Dosimetry techniques applied to thermoluminescent age estimation

    International Nuclear Information System (INIS)

    Erramli, H.

    1986-12-01

    The reliability and ease of field application of measuring techniques for natural radioactivity dosimetry are studied. The natural radioactivity dose in minerals is composed of the internal dose deposited by alpha and beta radiations issued from the sample itself, and the external dose deposited by gamma and cosmic radiations issued from the surroundings of the sample. Two techniques for external dosimetry are examined in detail: TL dosimetry and field gamma dosimetry. Calibration and experimental conditions are presented. A new integrated dosimetric method for internal and external dose measurement is proposed: the TL dosimeter is placed in the soil in exactly the same conditions as those of the sample, for a time long enough for the total dose evaluation [fr
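
    For orientation, these internal and external dose rates feed the standard thermoluminescence age equation (textbook form, not quoted from the report):

        \mathrm{Age} = \frac{\text{paleodose (Gy)}}{\dot D_\alpha + \dot D_\beta + \dot D_\gamma + \dot D_{\mathrm{cosmic}} \ (\mathrm{Gy/a})}

    where the paleodose is determined from the TL signal of the sample and the denominator is the total annual dose rate from internal alpha and beta radiation plus external gamma and cosmic radiation.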

  20. Applying Failure Modes, Effects, And Criticality Analysis And Human Reliability Analysis Techniques To Improve Safety Design Of Work Process In Singapore Armed Forces

    Science.gov (United States)

    2016-09-01

    This thesis applies Failure Modes, Effects, and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA) techniques to improve the safety design of work processes in the Singapore Armed Forces. It compares the advantages and disadvantages of candidate HRA methods (THERP, HEART, and SPAR-H; adapted from Bell and Holroyd, 2009) and studies the feasibility of adopting HEART as an alternate hazard assessment technique for human-centric work processes.

  1. Technique applied in electrical power distribution for Satellite Launch Vehicle

    Directory of Open Access Journals (Sweden)

    João Maurício Rosário

    2010-09-01

    Full Text Available The Satellite Launch Vehicle electrical network, which is currently being developed in Brazil, is subdivided for analysis into the following parts: Service Electrical Network, Controlling Electrical Network, Safety Electrical Network and Telemetry Electrical Network. During the pre-launching and launching phases, these electrical networks are associated electrically and mechanically with the structure of the vehicle. In order to succeed in the integration of these electrical networks, it is necessary to employ techniques of electrical power distribution that are appropriate to Launch Vehicle systems. This work presents the most important techniques to be considered in the characterization of the electrical power supply applied to Launch Vehicle systems. Such techniques are primarily designed to allow the electrical networks, when submitted to a single-phase fault to ground, to be capable of maintaining the power supply to the loads.

  2. Applying the digital-image-correlation technique to measure the ...

    Indian Academy of Sciences (India)

    The DIC (digital-image-correlation) technique is used to measure the deformation of the retrofitted column. The result shows that the DIC technique can be successfully applied to measure the relative displacement of the column. Additionally, this method ...

  3. Applied analysis and differential equations

    CERN Document Server

    Cârj, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics, including the most exciting recent developments. A broad range of topics of recent interest are treated: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  4. Analytical techniques applied to study cultural heritage objects

    International Nuclear Information System (INIS)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N.

    2015-01-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process in the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Further expanding the possibilities of analysis, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy, which can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects of different University of Sao Paulo museums. Improving the arsenal of cultural heritage analysis, a 3D robotic stage for the precise positioning of samples in the external beam setup was recently constructed.

  5. Analytical techniques applied to study cultural heritage objects

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N., E-mail: rizzutto@if.usp.br [Universidade de Sao Paulo (USP), SP (Brazil). Instituto de Fisica]

    2015-07-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process in the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Further expanding the possibilities of analysis, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy, which can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects of different University of Sao Paulo museums. Improving the arsenal of cultural heritage analysis, a 3D robotic stage for the precise positioning of samples in the external beam setup was recently constructed.

  6. Archaeometry: nuclear and conventional techniques applied to the archaeological research

    International Nuclear Information System (INIS)

    Esparza L, R.; Cardenas G, E.

    2005-01-01

    The book presented here is formed by twelve articles that approach, from different perspectives, topics such as archaeological prospecting, the analysis of pre-Hispanic and colonial ceramics, obsidian and mural painting, besides dating and questions about data ordering. Following the chronological order in which the exploration techniques and laboratory studies are required, the texts about the systematic and detailed study of archaeological sites are presented first, followed by topics related to the application of diverse nuclear techniques such as PIXE, RBS, XRD, NAA, SEM, Moessbauer spectroscopy and other conventional techniques. Multidisciplinarity is an aspect that stands out in this work, owing to the great specialization of the work presented, even in the archaeological studies, including fieldwork in topography, mapping and excavation and, of course, laboratory tests. Most of the articles are the result of several years of investigation, as consigned under the responsibility of each article. The texts gathered here emphasize the technical aspects of each investigation: the modern computing systems applied to prospecting and archaeological mapping, and the chemical and physical analysis of organic materials, of metal artifacts, of diverse rocks used in pre-Hispanic times, and of mural and ceramic paintings, characteristics that justly underline the potential of collective works. (Author)

  7. Tensometry technique for X-ray diffraction in applied analysis of welding; Tensometria por tecnica de difracao de raios X aplicada na analise de soldagens

    Energy Technology Data Exchange (ETDEWEB)

    Turibus, S.N.; Caldas, F.C.M.; Miranda, D.M.; Monine, V.I.; Assis, J.T., E-mail: snturibus@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (IPRJ/UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico]

    2010-07-01

    This paper presents the analysis of residual stress introduced by the welding process. As stress in a material can induce damage, a method is needed to identify the residual stress state. For this, the non-destructive X-ray diffraction technique was used to analyze two A36 steel plates joined by metal inert gas (MIG) welding. The stress measurements were made by the sin²ψ method in the weld region of the steel plates, including analysis of longitudinal and transverse residual stresses in the fusion zone, heat affected zone (HAZ) and base metal. To determine the stress distribution along the depth of the welded material, surface layers were removed by electropolishing. (author)
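
    For reference, the sin²ψ method mentioned above rests on the standard relation between lattice strain and stress (textbook form, not quoted from the paper):

        \varepsilon_{\phi\psi} = \frac{d_{\phi\psi} - d_0}{d_0} = \frac{1+\nu}{E}\,\sigma_\phi \sin^2\psi - \frac{\nu}{E}\,(\sigma_1 + \sigma_2)

    so the stress component \sigma_\phi follows from the slope m of a linear fit of the lattice spacing d_{\phi\psi} against \sin^2\psi as \sigma_\phi = \frac{E}{1+\nu}\,\frac{m}{d_0}.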

  8. Analysis of metal concentration levels in water, sediment and fish tissues from Toledo municipal lake by applying SR-TXRF technique.

    Science.gov (United States)

    Espinoza-Quiñones, F R; Módenes, A N; Palácio, S M; Lorenz, E K; Oliveira, A P

    2011-01-01

    The main objective of this study was to evaluate the metal content in water and sediment from the Toledo municipal lake, as well as the concentration levels of heavy metals in the muscle and liver of four fish species. A digestion procedure was performed on all fish samples. Metal analysis was performed using the Synchrotron Radiation X-ray Fluorescence technique. The accuracy and validity of the measurements were determined by analysis of certified reference materials. Cr, Cu and Se concentration levels in fish tissue above the maximum tolerance limits of the Brazilian norms could be associated with metal uptake and accumulation due to direct contact with contaminated water and sediment.

  9. Construction and performance characterization of ion-selective electrodes for potentiometric determination of pseudoephedrine hydrochloride applying batch and flow injection analysis techniques.

    Science.gov (United States)

    Zayed, Sayed I M; Issa, Yousry M; Hussein, Ahmed

    2006-01-01

    New pseudoephedrine-selective electrodes have been constructed of the conventional polymer membrane type by incorporation of pseudoephedrine-phosphotungstate (PE-PT) or pseudoephedrine-silicotungstate (PE-SiT) ion-associates in a poly(vinyl chloride) (PVC) membrane plasticized with dibutyl phthalate (DBP). The electrodes were fully characterized in terms of membrane composition, temperature, and pH. The electrodes exhibited mean calibration-graph slopes of 57.09 and 56.10 mV per concentration decade of PECl at 25 °C for the PE-PT and PE-SiT electrodes, respectively. The electrodes showed fast, stable, and near-Nernstian response over the concentration ranges 6.31 × 10^-6 - 1.00 × 10^-2 and 5.00 × 10^-5 - 1.00 × 10^-2 M in the case of PE-PT applying batch and flow injection (FI) analysis, respectively, and 1.00 × 10^-5 - 1.00 × 10^-2 and 5.00 × 10^-5 - 1.00 × 10^-2 M in the case of PE-SiT for the batch and FI analysis systems, respectively. The detection limit was 5.01 × 10^-6 M for the PE-PT electrode and 6.31 × 10^-6 M for the PE-SiT electrode. The electrodes were successfully applied to the potentiometric determination of pseudoephedrine hydrochloride (PECl) in pharmaceutical preparations, with mean recoveries of 101.13 ± 0.85% and 100.77 ± 0.79% in the case of PE-PT applying batch and flow injection systems, respectively, and 100.75 ± 0.85% and 100.79 ± 0.77% in the case of PE-SiT for batch and flow injection systems, respectively. The electrodes exhibited good selectivity for PECl with respect to a large number of inorganic cations, sugars and amino acids.
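
    For context, the near-Nernstian slopes quoted above can be compared with the theoretical Nernstian slope for a monovalent cation (standard electrochemistry, not from the paper):

        E = E^0 + \frac{2.303\,RT}{zF}\,\log a \approx E^0 + 59.2\ \mathrm{mV}\cdot\log a \quad (z = 1,\ T = 298\ \mathrm{K})

    so the measured 57.09 and 56.10 mV per decade lie only a few mV below the ideal 59.2 mV per decade.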

  10. Applying Metrological Techniques to Satellite Fundamental Climate Data Records

    Science.gov (United States)

    Woolliams, Emma R.; Mittaz, Jonathan PD; Merchant, Christopher J.; Hunt, Samuel E.; Harris, Peter M.

    2018-02-01

    Quantifying long-term environmental variability, including climatic trends, requires decadal-scale time series of observations. The reliability of such trend analysis depends on the long-term stability of the data record and on understanding the sources of uncertainty in historic, current and future sensors. We give a brief overview of how metrological techniques can be applied to historical satellite data sets. In particular, we discuss the implications of error correlation at different spatial and temporal scales and the forms of such correlation, and consider how uncertainty is propagated with partial correlation. We give a form of the Law of Propagation of Uncertainties that considers the propagation of uncertainties associated with common errors to give the covariance associated with Earth observations in different spectral channels.
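
    The law referred to above, written here in its standard GUM form with correlation terms included (shown for orientation; the paper derives its own variant):

        u_c^2(y) = \sum_i \left(\frac{\partial f}{\partial x_i}\right)^2 u^2(x_i) + 2 \sum_i \sum_{j>i} \frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\, r(x_i, x_j)\, u(x_i)\, u(x_j)

    With r = 0 the familiar uncorrelated case is recovered; fully common (systematic) errors correspond to r = 1 and do not average down over many observations.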

  11. Applying machine learning classification techniques to automate sky object cataloguing

    Science.gov (United States)

    Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav

    1993-08-01

    We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. The development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is
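
    A minimal Python sketch of the learn-then-classify loop, using scikit-learn's CART decision tree as a stand-in for GID3/O-B Tree (which are not publicly available); the feature set and labels are synthetic, not the survey's actual attributes:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        n = 5_000
        X = rng.normal(size=(n, 4))   # per-object features, e.g. area, ellipticity, peak brightness
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)  # 0=star, 1=galaxy

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        tree = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)   # learn a decision tree from examples
        print("hold-out accuracy:", tree.score(X_te, y_te))          # evaluate on unseen data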

  12. Chemical vapor deposition: A technique for applying protective coatings

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, T.C. Sr.; Bowman, M.G.

    1979-01-01

    Chemical vapor deposition is discussed as a technique for applying coatings for materials protection in energy systems. The fundamentals of the process are emphasized in order to establish a basis for understanding the relative advantages and limitations of the technique. Several examples of the successful application of CVD coating are described. 31 refs., and 18 figs.

  13. Mainstreaming gender and promoting intersectionality in Papua New Guinea's health policy: a triangulated analysis applying data-mining and content analytic techniques.

    Science.gov (United States)

    Lamprell, G; Braithwaite, J

    2017-04-20

    Gender mainstreaming is an approach to policy and planning that emphasizes equality between the sexes. It is the stated policy for gender equity in Papua New Guinea's (PNG) health sector, as well as all other sectors, and is enshrined in the policies of its biggest aid givers. However, there is criticism that gender mainstreaming's application has too often been technocratic and lacking in conceptual clarity not only in PNG but elsewhere. In the health sector this is further exacerbated by a traditional bio-medical approach, which is often paternalistic and insufficiently patient- and family-centered. This study analyses the policy attitudes toward gender in PNG's health sector using both data-mining and a traditional, summative content analysis. Our results show that gender is rarely mentioned. When it is, it is most often mentioned in relation to programs such as maternity and childcare for women, and elsewhere is applied technocratically. For PNG to promote greater levels of equity, the focus should first be on conceptualizing gender in a way that is meaningful for Papuans, taking into account the diversity of experiences and setting. Second, there should be greater focus on activists and civil society groups as the stakeholders most likely to make a difference in gender equity.

  14. Applying Parallel Processing Techniques to Tether Dynamics Simulation

    Science.gov (United States)

    Wells, B. Earl

    1996-01-01

    The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.

  15. Applying the digital-image-correlation technique to measure the ...

    Indian Academy of Sciences (India)

    It has been applied for analysing various structural problems. For example, French scholars Raffard et ... observe the crack development in masonry walls. One major advantage of the DIC technique ... based on the characteristic gray-scale distributions in the image of the structural speckle on the specimen surface. As shown in ...

  16. Fuzzy Control Technique Applied to Modified Mathematical Model ...

    African Journals Online (AJOL)

    In this paper, fuzzy control technique is applied to the modified mathematical model for malaria control presented by the authors in an earlier study. Five Mamdani fuzzy controllers are constructed to control the input (some epidemiological parameters) to the malaria model simulated by 9 fully nonlinear ordinary differential ...

  17. Harmonic Mitigation Techniques Applied to Power Distribution Networks

    Directory of Open Access Journals (Sweden)

    Hussein A. Kazem

    2013-01-01

    Full Text Available A growing number of harmonic mitigation techniques are now available including active and passive methods, and the selection of the best-suited technique for a particular case can be a complicated decision-making process. The performance of some of these techniques is largely dependent on system conditions, while others require extensive system analysis to prevent resonance problems and capacitor failure. A classification of the various available harmonic mitigation techniques is presented in this paper aimed at presenting a review of harmonic mitigation methods to researchers, designers, and engineers dealing with power distribution systems.

  18. Subcellular fractionation associated to radionuclide analysis in various tissues: validation of the technique by using light and electron observations applied on gills bivalves and uranium

    Energy Technology Data Exchange (ETDEWEB)

    Camilleri, V.; Simon, O.; Grasset, G. [CEA Cadarache (DEI/SECRE/LRE), Laboratory of Radioecology and Ecotoxicology, Institute for Radioprotection and Nuclear Safety, 13 - Saint-Paul-lez-Durance (France)

    2004-07-01

    The metal bioaccumulation levels in target organs, associated with micro-localization approaches at the subcellular level, provide information for the understanding of the metabolic metal cycle. These findings could be used to select relevant bio-markers of exposure and to focus on specific contaminated organelles to study potential biological effects. Moreover, the metal accumulated in the cytosol fraction can be bound to macromolecules in order to be eliminated and/or to induce a potential cellular effect. Tissue distribution, transfer efficiency from water, and subcellular fractionation were investigated in the freshwater bivalve Corbicula fluminea after aqueous uranium exposure. The subcellular fractionation was performed while measuring the uranium associated with each of the different cellular fractions, as follows: cellular debris and nuclei, mitochondria and lysosomes, membranes, microsomes and cytosol. In our experimental conditions, the accumulation in the cytosol fraction was low, and more than 80% of the total uranium in gills and visceral mass was accumulated in the insoluble fraction. The main results presented in this poster come from light and electron microscope observations of subcellular fractions (nuclei/debris and lysosomes/mitochondria), made in order to validate the efficiency of the fractionation technique. An adaptation of the fractionation technique is proposed. This set of data confirms large differences in fractionation efficiency as a function of the fractionation technique and the organs/biological model used (gills of bivalves, digestive gland of crayfish). (author)

  19. Satellite SAR interferometric techniques applied to emergency mapping

    Science.gov (United States)

    Stefanova Vassileva, Magdalena; Riccardi, Paolo; Lecci, Daniele; Giulio Tonolo, Fabio; Boccardo Boccardo, Piero; Chiesa, Giuliana; Angeluccetti, Irene

    2017-04-01

    This paper aims to investigate the capabilities of the currently available SAR interferometric algorithms in the field of emergency mapping. Several tests have been performed exploiting Copernicus Sentinel-1 data using the COTS software ENVI/SARscape 5.3. Emergency Mapping can be defined as the "creation of maps, geo-information products and spatial analyses dedicated to providing situational awareness for emergency management and immediate crisis information for response by means of extraction of reference (pre-event) and crisis (post-event) geographic information/data from satellite or aerial imagery". The conventional differential SAR interferometric technique (DInSAR) and the two currently available multi-temporal SAR interferometric approaches, i.e. Permanent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS), have been applied to provide crisis information useful for emergency management activities. Depending on the Emergency Management phase considered, a distinction may be drawn between rapid mapping, i.e. fast provision of geospatial data regarding the affected area for the immediate emergency response, and monitoring mapping, i.e. detection of phenomena for risk prevention and mitigation activities. In order to evaluate the potential and limitations of the aforementioned SAR interferometric approaches for the specific rapid and monitoring mapping applications, five main factors have been taken into account: crisis information extracted, input data required, processing time and expected accuracy. The results highlight that DInSAR has the capacity to delineate areas affected by large and sudden deformations and fulfills most of the immediate response requirements. The main limiting factor of interferometry is the availability of a suitable SAR acquisition immediately after the event (e.g. the Sentinel-1 mission, characterized by a 6-day revisit time, may not always satisfy the immediate emergency request). PSI and SBAS techniques are suitable to produce

  20. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs
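
    An illustrative object-oriented sketch in Python of the event-tree idea (not the SQUIMP code): a sequence frequency is the initiating-event frequency multiplied by the success or failure probability at each branch point:

        from dataclasses import dataclass

        @dataclass
        class Branch:
            name: str
            p_success: float   # probability that the safety function succeeds

        def sequence_frequency(initiator_freq, outcomes, branches):
            """Multiply the initiator frequency by each branch probability along one path."""
            f = initiator_freq
            for name, succeeded in outcomes:
                b = branches[name]
                f *= b.p_success if succeeded else (1.0 - b.p_success)
            return f

        branches = {"scram": Branch("scram", 0.999), "cooling": Branch("cooling", 0.99)}
        # frequency of the sequence: initiator occurs, scram works, cooling fails
        print(sequence_frequency(1e-2, [("scram", True), ("cooling", False)], branches))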

  1. A NOVEL TECHNIQUE APPLYING SPECTRAL ESTIMATION TO JOHNSON NOISE THERMOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    Ezell, N Dianne Bull [ORNL]; Britton Jr, Charles L [ORNL]; Roberts, Michael [ORNL]; Holcomb, David Eugene [ORNL]; Ericson, Milton Nance [ORNL]; Djouadi, Seddik M [ORNL]; Wood, Richard Thomas [ORNL]

    2017-01-01

    Johnson noise thermometry (JNT) is one of many important measurements used to monitor the safety levels and stability in a nuclear reactor. However, this measurement is very dependent on the electromagnetic environment. Properly removing unwanted electromagnetic interference (EMI) is critical for accurate, drift-free temperature measurements. The two techniques developed by Oak Ridge National Laboratory (ORNL) to remove transient and periodic EMI are briefly discussed in this document. Spectral estimation is a key component in the signal processing algorithm utilized for EMI removal and temperature calculation. Applying these techniques requires only the simple addition of the electronics and signal processing to existing resistive thermometers.
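
    The physical basis of JNT is the Nyquist relation (textbook form, added for orientation): the mean-square thermal noise voltage across a resistance R in bandwidth \Delta f is

        \overline{V^2} = 4\,k_B\,T\,R\,\Delta f

    so, with R and \Delta f known, the measured noise power yields the absolute temperature T directly, which is why the method is inherently free of calibration drift.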

  2. Aspects of High-Resolution Gas Chromatography as Applied to the Analysis of Hydrocarbon Fuels and Other Complex Organic Mixtures. Volume 2. Survey of Sample Insertion Techniques.

    Science.gov (United States)

    1985-06-01

    … fragrances, certain biological fluids, etc. In short, total analysis of the sample in its neat form is needed. At the present state of HRGC technology, the…

  3. Applying of USB interface technique in nuclear spectrum acquisition system

    International Nuclear Information System (INIS)

    Zhou Jianbin; Huang Jinhua

    2004-01-01

    This paper introduces the application of the USB technique and the construction of a nuclear spectrum acquisition system via the PC's USB interface. The authors chose the USB100 module as the USB component and the W77E58 microcontroller to do the key work. The USB interface technique is easy to apply when the USB100 module is used. The USB100 module can be treated as a common I/O component by the microcontroller, and as a communication (COM) interface when connected to the PC's USB interface. It is easy to modify the PC's program for the new system with the USB100 module, allowing a smooth migration from the ISA and RS232 buses to the USB bus. (authors)

  4. Applying CFD in the analysis of heavy oil - water two-phase flow in joints by using core annular flow technique

    Directory of Open Access Journals (Sweden)

    T Andrade

    2016-09-01

    Full Text Available In the oil industry, multiphase flow occurs throughout the production chain, from the reservoir rock to the separation units, through the production column, risers and pipelines. During the whole process, the fluid flows through horizontal pipes, curves, connections and T joints. Today, the technological and economic challenges facing the oil industry are related to heavy oil transportation, due to its unfavourable characteristics, such as high viscosity and high density, that provoke a high pressure drop along the flow. The core-flow technique consists of injecting small amounts of water into the pipe to form a ring of water between the oil and the wall of the pipe, which reduces the frictional pressure drop along the flow. This paper aims to model and simulate the transient two-phase flow (water-heavy oil) in a horizontal pipe and T joint by numerical simulation using the software ANSYS CFX® Release 12.0. Results of the pressure and volumetric fraction distributions inside the horizontal pipe and T joint are presented and analysed.

  5. Analysis of a finite-difference and a Galerkin technique applied to the simulation of advection and diffusion of air pollutants from a line source

    International Nuclear Information System (INIS)

    Runca, E.; Melli, P.; Sardei, F.

    1985-01-01

    A finite-difference scheme and a Galerkin scheme are compared with respect to a very accurate solution describing time-dependent advection and diffusion of air pollutants from a line source in an atmosphere vertically stratified and limited by an inversion layer. The accurate solution was achieved by applying the finite-difference scheme on a very refined grid with a very small time step. The grid size and time step were defined according to stability and accuracy criteria discussed in the text. It is found that for the problem considered the two methods can be considered equally accurate. However, the Galerkin method gives a better approximation in the vicinity of the source. This was assumed to be partly due to the different way the source term is taken into account in the two methods. Improvement of the accuracy of the finite-difference scheme was achieved by approximating, at every step, the contribution of the source term by a Gaussian puff moving and diffusing with the velocity and diffusivity of the source location, instead of utilizing a stepwise function for the numerical approximation of the delta function representing the source term
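
    A minimal 1-D explicit finite-difference sketch of the advection-diffusion step being compared (illustrative only; the paper's model is two-dimensional, vertically stratified and capped by an inversion layer, and all parameter values below are placeholders):

        import numpy as np

        nx, dx, dt = 200, 10.0, 0.5      # grid points, spacing [m], time step [s] -- illustrative
        u, K = 2.0, 5.0                  # wind speed [m/s], eddy diffusivity [m2/s]
        assert u * dt / dx <= 1.0 and 2 * K * dt / dx**2 <= 1.0   # stability criteria

        c = np.zeros(nx)                 # concentration; periodic boundaries for brevity
        src = 50                         # grid index of the (line) source
        for _ in range(500):
            adv = -u * (c - np.roll(c, 1)) / dx                          # first-order upwind advection
            dif = K * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2   # central-difference diffusion
            c += dt * (adv + dif)
            c[src] += dt * 1.0           # constant emission rate at the source cell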

  6. Applying recursive numerical integration techniques for solving high dimensional integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [IVU Traffic Technologies AG, Berlin (Germany)]; Genz, Alan [Washington State Univ., Pullman, WA (United States). Dept. of Mathematics]; Hartung, Tobias [King's College, London (United Kingdom). Dept. of Mathematics]; Jansen, Karl; Volmer, Julia [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC]; Leoevey, Hernan [Humboldt Univ. Berlin (Germany). Inst. fuer Mathematik]

    2016-11-15

    The error scaling for Markov-Chain Monte Carlo techniques (MCMC) with N samples behaves like 1/√(N). This scaling makes it often very time intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.

  7. Applying recursive numerical integration techniques for solving high dimensional integrals

    International Nuclear Information System (INIS)

    Ammon, Andreas; Genz, Alan; Hartung, Tobias; Jansen, Karl; Volmer, Julia; Leoevey, Hernan

    2016-11-01

    The error scaling for Markov-Chain Monte Carlo techniques (MCMC) with N samples behaves like 1/√(N). This scaling makes it often very time intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.
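
    A sketch of the recursive idea in Python for a chain of nearest-neighbour couplings, applying a one-dimensional Gauss-Legendre rule level by level (illustrative; the integrand is a rotor-like Boltzmann weight, not the paper's exact model):

        import numpy as np

        def recursive_integral(d, m, beta=1.0):
            """Integrate exp(beta * sum_i cos(x_{i+1} - x_i)) over [0, 2*pi]^d recursively."""
            z, w = np.polynomial.legendre.leggauss(m)     # nodes/weights on [-1, 1]
            x, w = np.pi * (z + 1.0), np.pi * w           # rescale rule to [0, 2*pi]
            f = np.ones_like(x)                           # inner integral as a function of x_i
            for _ in range(d - 1):
                kernel = np.exp(beta * np.cos(x[:, None] - x[None, :]))
                f = kernel @ (w * f)                      # integrate out one dimension per level
            return (w * f).sum()

        print(recursive_integral(d=8, m=32))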

  8. Evaluation via multivariate techniques of scale factor variability in the rietveld method applied to quantitative phase analysis with X ray powder diffraction

    Directory of Open Access Journals (Sweden)

    Terezinha Ferreira de Oliveira

    2006-12-01

    Full Text Available The present work uses multivariate statistical analysis as a means of establishing the main sources of error in Quantitative Phase Analysis (QPA) using the Rietveld method. The quantitative determination of crystalline phases using X-ray powder diffraction is a complex measurement process whose results are influenced by several factors. Ternary mixtures of Al2O3, MgO and NiO were prepared under controlled conditions and the diffraction patterns were obtained using the Bragg-Brentano geometric arrangement. It was possible to establish four sources of critical variation: the experimental absorption and the scale factor of NiO, which is the phase with the greatest linear absorption coefficient of the ternary mixture; the instrumental characteristics, represented by mechanical errors of the goniometer and sample displacement; the other two phases (Al2O3 and MgO); and the temperature and relative humidity of the air in the laboratory. These error sources can severely impair QPA with the Rietveld method, so it becomes necessary to control them during the measurement procedure.

  9. Ion backscattering techniques applied in materials science research

    International Nuclear Information System (INIS)

    Sood, D.K.

    1978-01-01

    The applications of the Ion Backscattering Technique (IBT) to material analysis have expanded rapidly during the last decade. It is now regarded as an analysis tool indispensable for a versatile materials research program. The technique consists of simply shooting a beam of monoenergetic ions (usually 4He+ ions at about 2 MeV) onto a target, and measuring their energy distribution after backscattering at a fixed angle. Simple Rutherford scattering analysis of the backscattered ion spectrum yields information on the mass, the absolute amount and the depth profile of elements present up to a few microns below the target surface. The technique is nondestructive, quick, quantitative, and the only known method of analysis which gives quantitative results without recourse to calibration standards. Its major limitations are the inability to separate elements of similar mass and a complete absence of chemical-binding information. A typical experimental setup and spectrum analysis are described. Examples, some of them based on the work at the Bhabha Atomic Research Centre, Bombay, are given to illustrate the applications of this technique to semiconductor technology, thin film materials science and nuclear energy materials. Limitations of IBT are illustrated and a few remedies to partly overcome these limitations are presented. (auth.)
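
    For orientation, the mass identification in IBT follows from the elastic-scattering kinematic factor (textbook Rutherford backscattering kinematics, not quoted from the paper): a projectile of mass M_1 and energy E_0 backscattered at angle \theta from a target atom of mass M_2 emerges with energy E_1 = K E_0, where

        K = \left[\frac{\sqrt{M_2^2 - M_1^2 \sin^2\theta} + M_1 \cos\theta}{M_1 + M_2}\right]^2

    Measuring E_1 at fixed \theta therefore identifies M_2, while the additional energy loss of ions scattered below the surface encodes the depth profile.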

  10. Applying the digital-image-correlation technique to measure the ...

    Indian Academy of Sciences (India)

    4.2 Image analysis. The DIC technique is used to analyse the column deformation. After the position of every mark is traced, two parallel observation lines on the surface of the column (as shown in figure 8) are chosen. There are 181 equally spaced points on each line. The positions of these points are calculated using B-Spline ...

  11. Bioremediation techniques applied to aqueous media contaminated with mercury.

    Science.gov (United States)

    Velásquez-Riaño, Möritz; Benavides-Otaya, Holman D

    2016-12-01

    In recent years, the environmental and human health impacts of mercury contamination have driven the search for alternative, eco-efficient techniques different from the traditional physicochemical methods for treating this metal. One of these alternative processes is bioremediation. A comprehensive analysis of the different variables that can affect this process is presented. It focuses on determining the effectiveness of different techniques of bioremediation, with a specific consideration of three variables: the removal percentage, time needed for bioremediation and initial concentration of mercury to be treated in an aqueous medium.

  12. Three-dimensional integrated CAE system applying computer graphic technique

    International Nuclear Information System (INIS)

    Kato, Toshisada; Tanaka, Kazuo; Akitomo, Norio; Obata, Tokayasu.

    1991-01-01

    A three-dimensional CAE system for nuclear power plant design is presented. This system utilizes high-speed computer graphic techniques for the plant design review, and an integrated engineering database for handling the large amount of nuclear power plant engineering data in a unified data format. Applying this system makes it possible to construct a nuclear power plant using only computer data from the basic design phase to the manufacturing phase, and it increases the productivity and reliability of the nuclear power plants. (author)

  13. Essentials of applied dynamic analysis

    CERN Document Server

    Jia, Junbo

    2014-01-01

    This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate the understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Topics of special interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in this area.

  14. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
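
    A toy numerical illustration of the efficiency/purity trade-off described above (a synthetic one-dimensional example, not from the text), using a likelihood-ratio discriminant to select a rare "signal" against a copious "background":

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        bkg = rng.exponential(scale=2.0, size=100_000)    # copious background process
        sig = rng.normal(loc=4.0, scale=0.3, size=500)    # rare signal process

        def lr(x):
            """Signal-to-background likelihood ratio in the observable x."""
            return stats.norm.pdf(x, 4.0, 0.3) / stats.expon.pdf(x, scale=2.0)

        cut = 5.0                                         # keep events with a high ratio
        eff = np.mean(lr(sig) > cut)                      # signal efficiency of the selection
        n_sig, n_bkg = eff * sig.size, np.mean(lr(bkg) > cut) * bkg.size
        print(f"efficiency {eff:.2f}, selected-sample purity {n_sig / (n_sig + n_bkg):.3f}")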

  15. Applying field mapping refractive beam shapers to improve holographic techniques

    Science.gov (United States)

    Laskin, Alexander; Williams, Gavin; McWilliam, Richard; Laskin, Vadim

    2012-03-01

    The performance of various holographic techniques can be essentially improved by homogenizing the intensity profile of the laser beam using beam shaping optics, for example the achromatic field mapping refractive beam shapers such as πShaper. The operational principle of these devices presumes transformation of the laser beam intensity from a Gaussian to a flattop profile with high flatness of the output wavefront, preserving beam consistency, and providing a collimated output beam of low divergence, high transmittance, extended depth of field and negligible residual wave aberration; the achromatic design provides the capability to work with several laser sources of different wavelengths simultaneously. Applying these beam shapers brings serious benefits to Spatial Light Modulator based techniques like Computer Generated Holography or Dot-Matrix mastering of security holograms, since uniform illumination of an SLM allows simplifying mathematical calculations and increasing the predictability and reliability of the imaging results. Another example is multicolour Denisyuk holography, where the achromatic πShaper provides uniform illumination of a field at various wavelengths simultaneously. This paper describes some design basics of the field mapping refractive beam shapers and optical layouts for applying them in holographic systems. Examples of real implementations and experimental results are presented as well.

  16. Rare event techniques applied in the Rasmussen study

    International Nuclear Information System (INIS)

    Vesely, W.E.

    1977-01-01

    The Rasmussen Study estimated public risks from commercial nuclear power plant accidents, and therefore the statistics of rare events had to be treated. Two types of rare events were specifically handled, those rare events which were probabilistically rare events and those which were statistically rare events. Four techniques were used to estimate probabilities of rare events. These techniques were aggregating data samples, discretizing ''continuous'' events, extrapolating from minor to catastrophic severities, and decomposing events using event trees and fault trees. In aggregating or combining data the goal was to enlarge the data sample so that the rare event was no longer rare, i.e., so that the enlarged data sample contained one or more occurrences of the event of interest. This aggregation gave rise to random variable treatments of failure rates, occurrence frequencies, and other characteristics estimated from data. This random variable treatment can be interpreted as being comparable to an empirical Bayes technique or a Bayesian technique. In the discretizing event technique, events of a detailed nature were grouped together into a grosser event for purposes of analysis as well as for data collection. The treatment of data characteristics as random variables helped to account for the uncertainties arising from this discretizing. In the severity extrapolation technique a severity variable was associated with each event occurrence for the purpose of predicting probabilities of catastrophic occurrences. Tail behaviors of distributions therefore needed to be considered. Finally, event trees and fault trees were used to express accident occurrences and system failures in terms of more basic events for which data existed. Common mode failures and general dependencies therefore needed to be treated. 2 figures

  17. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  18. Applying AI techniques to improve alarm display effectiveness

    International Nuclear Information System (INIS)

    Gross, J.M.; Birrer, S.A.; Crosberg, D.R.

    1987-01-01

    The Alarm Filtering System (AFS) addresses the problem of information overload in a control room during abnormal operations. Since operators can miss vital information during these periods, systems which emphasize important messages are beneficial. AFS uses the artificial intelligence (AI) technique of object-oriented programming to filter and dynamically prioritize alarm messages. When an alarm's status changes, AFS determines the relative importance of that change according to the current process state. AFS bases that relative importance on the relationships the newly changed alarm has with other activated alarms. Evaluations of alarm importance take place without regard to the activation sequence of alarm signals. The United States Department of Energy has applied for a patent on the approach used in this software. The approach was originally developed by EG and G Idaho for a nuclear reactor control room

  19. Airflow measurement techniques applied to radon mitigation problems

    International Nuclear Information System (INIS)

    Harrje, D.T.; Gadsby, K.J.

    1989-01-01

    During the past decade a multitude of diagnostic procedures associated with the evaluation of air infiltration and air leakage sites have been developed. The spirit of international cooperation and exchange of ideas within the AIC-AIVC conferences has greatly facilitated the adoption and use of these measurement techniques in the countries participating in Annex V. But wide application of such diagnostic methods is not limited to air infiltration alone. The subject of this paper concerns the ways to evaluate and improve radon reduction in buildings using diagnostic methods directly related to developments familiar to the AIVC. Radon problems are certainly not unique to the United States, and the methods described here have to a degree been applied by researchers of other countries faced with similar problems. The radon problem involves more than a harmful pollutant of the living spaces of our buildings -- it also involves energy to operate radon removal equipment and the loss of interior conditioned air as a direct result. The techniques used for air infiltration evaluation will be shown to be very useful in dealing with the radon mitigation challenge. 10 refs., 7 figs., 1 tab

  20. Applying advanced digital signal processing techniques in industrial radioisotopes applications

    International Nuclear Information System (INIS)

    Mahmoud, H.K.A.E.

    2012-01-01

    Radioisotopes can be used to obtain signals or images in order to recognize the information inside industrial systems. The main problems of using these techniques are the difficulty of identifying the obtained signals or images and the requirement of skilled experts for interpreting the output data of these applications. At present, the interpretation of the output data from these applications is performed mainly manually, depending heavily on the skills and experience of trained operators. This process is time consuming and the results typically suffer from inconsistency and errors. The objective of the thesis is to apply advanced digital signal processing techniques to improve the treatment and interpretation of the output data from different Industrial Radioisotopes Applications (IRA). This thesis focuses on two IRAs: the Residence Time Distribution (RTD) measurement and the defect inspection of welded pipes using a gamma source (gamma radiography). In the RTD measurement application, this thesis presents methods for signal pre-processing and modeling of the RTD signals. Simulation results have been presented for two case studies. The first case study is a laboratory experiment for measuring the RTD in a water flow rig. The second case study is an experiment for measuring the RTD in a phosphate production unit. The thesis proposes an approach for RTD signal identification in the presence of noise. In this approach, after signal processing, the Mel Frequency Cepstral Coefficients (MFCCs) and polynomial coefficients are extracted from the processed signal or from one of its transforms. The Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT), and Discrete Sine Transform (DST) have been tested and compared for efficient feature extraction. Neural networks have been used for matching of the extracted features. Furthermore, the Power Density Spectrum (PDS) of the RTD signal has also been used instead of the discrete transforms
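
    A hedged sketch of the feature-extraction stage described above, using DCT coefficients and polynomial coefficients on a synthetic RTD-like curve; the thesis also uses MFCCs and wavelet transforms, which are omitted here, and all signal parameters below are invented for illustration.

    ```python
    # Smooth a noisy RTD-like signal, then extract low-order DCT and
    # polynomial coefficients as a feature vector for a neural-network matcher.
    import numpy as np
    from scipy.fft import dct
    from scipy.ndimage import uniform_filter1d

    t = np.linspace(0, 60, 600)                      # time axis, seconds
    rtd = t * np.exp(-t / 8.0)                       # idealized tracer response
    noisy = rtd + 0.02 * np.random.default_rng(0).normal(size=t.size)

    smoothed = uniform_filter1d(noisy, size=15)      # simple pre-processing step
    dct_features = dct(smoothed, norm="ortho")[:12]  # first 12 DCT coefficients
    poly_features = np.polyfit(t, smoothed, deg=4)   # polynomial coefficients

    features = np.concatenate([dct_features, poly_features])
    print(features.shape)                            # (17,) feature vector
    ```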

  1. Hyperspectral imaging based techniques applied to polluted clay characterization

    Science.gov (United States)

    Bonifazi, Giuseppe; Serranti, Silvia

    2006-10-01

    Polluted soil analysis and characterization is one of the basic steps to perform in order to collect the information needed to design and set up correct soil reclamation strategies. Soil analysis is usually performed through in-situ sampling and laboratory analysis. Such an approach is usually quite expensive and does not allow direct and detailed knowledge of large areas, owing to the intrinsic limits (high costs) of direct sampling and polluting element detection. As a consequence, numerical strategies are applied to extrapolate, starting from a discrete set of data (that related to the collected samples), information about the contamination level of areas not directly covered by physical sampling. These models are usually very difficult to handle, both for the intrinsic variability characterizing the media (soils) and for the high level of interaction between polluting agents, soil characteristics (organic matter content, size class distribution of the inorganic fraction, composition, etc.) and environmental conditions (temperature, humidity, presence of vegetation, human activities, etc.). The aim of this study, building on previous research addressed to evaluating the potential of the hyperspectral imaging approach in polluted soil characterization, was to evaluate the results obtainable in the investigation of an "ad hoc" polluted bentonite clay, usually utilized in rubbish dumps, in order to define fast and reliable control strategies for monitoring the status of such a material in terms of insulation.

  2. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    Magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from a reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e., the interval between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition, plant, etc. In the existing motion analysis, applying a single analysis technique to all plant conditions and plants raises an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
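
    An illustrative sketch of the Random Forests idea on synthetic data, not MHI's implementation; the features (current peak, rise time, plant condition code) and the closed-time target are hypothetical stand-ins for real CRDM operational data.

    ```python
    # Fit a Random Forest regressor mapping trace features to armature
    # closed time, then score it on held-out operations.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    # Hypothetical features per operation: current peak, rise time, plant code.
    X = rng.normal(size=(200, 3))
    # Hypothetical target: armature closed time in milliseconds.
    y = 40 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=0.5, size=200)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[:150], y[:150])
    print("held-out R^2:", round(model.score(X[150:], y[150:]), 3))
    ```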

  3. Innovative Visualization Techniques applied to a Flood Scenario

    Science.gov (United States)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time sampled data sets, providing an integrated mechanism that aids the user in collaboratively exploring, presenting and communicating visually complex and dynamic data. Here we present these concepts in the context of a 4 hour flood scenario from Lisbon in 2010, with data that includes measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and provide a snapshot of the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, includes multiple windows which interactively respond to user selections, so that when selecting an object and changing it in one window, it will automatically update in all the other windows

  4. Caldwell University's Department of Applied Behavior Analysis.

    Science.gov (United States)

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis.

  5. Non destructive assay techniques applied to nuclear materials

    International Nuclear Information System (INIS)

    Gavron, A.

    2001-01-01

    Nondestructive assay is a suite of techniques that has matured and become precise, easily implementable, and remotely usable. These techniques provide elaborate safeguards of nuclear material by supplying the information necessary for materials accounting. NDA techniques are ubiquitous, reliable, essentially tamper proof, and simple to use. They make the world a safer place to live in, and they make nuclear energy viable. (author)

  6. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  7. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  8. Building an applied activation analysis centre

    International Nuclear Information System (INIS)

    Bartosek, J.; Kasparec, I.; Masek, J.

    1972-01-01

    Requirements are defined and all available background material is reported and discussed for the building up of a centre of applied activation analysis in Czechoslovakia. A detailed analysis of potential users and the centre's envisaged availability is also presented as part of the submitted study. A brief economic analysis is annexed. The study covers the situation up to the end of 1972. (J.K.)

  9. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    Science.gov (United States)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets, with software such as ESRI's ArcPad, to produce maps and figures for a final analysis and report. Handwritten field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at the University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real time and improve collaboration with another person, as both parties have the same interactive view of the data.

  10. Digital prototyping technique applied for redesigning plastic products

    Science.gov (United States)

    Pop, A.; Andrei, A.

    2015-11-01

    After products have been on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology, like Digital Prototyping, in industry is an effective way to produce new products. This paper tries to demonstrate and highlight the effectiveness of the concept of Digital Prototyping, both in reducing the design time of a new product and in reducing the costs required for implementing this step. The results of this paper show that using Digital Prototyping techniques to design a new product from an existing mould available on the market offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (design of the 3D model, simulation of the structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be realised by one skilled engineer very quickly and effectively.

  11. X-ray fluorescence spectrometry applied to soil analysis

    International Nuclear Information System (INIS)

    Salvador, Vera Lucia Ribeiro; Sato, Ivone Mulako; Scapin Junior, Wilson Santo; Scapin, Marcos Antonio; Imakima, Kengo

    1997-01-01

    This paper studies X-ray fluorescence spectrometry applied to soil analysis. A comparative study of the WD-XRFS and ED-XRFS techniques was carried out using the following soil samples: SL-1, SOIL-7 and marine sediment SD-M-2/TM, from the IAEA, and clay JG-1a from the Geological Survey of Japan (GSJ)

  12. Biomechanical study of the funnel technique applied in thoracic ...

    African Journals Online (AJOL)

    Background: Funnel technique is a method used for the insertion of screw into thoracic pedicle. Aim: To evaluate the biomechanical characteristics of thoracic pedicle screw placement using the Funnel technique, trying to provide biomechanical basis for clinical application of this technology. Methods: 14 functional spinal ...

  13. Nuclear and Isotopic Techniques Applied to Nutritional and ...

    African Journals Online (AJOL)

    Nuclear and isotopes methods have been used in industrialized countries to enhance the sensitivity of nutrition and environmental monitoring techniques. The isotope techniques used in nutrition research are: (i) deuterium dilution to measure total body water (TBW) and body composition for evaluating nutritional status, ...

  14. NEW TECHNIQUES APPLIED IN ECONOMICS. ARTIFICIAL NEURAL NETWORK

    Directory of Open Access Journals (Sweden)

    Constantin Ilie

    2009-05-01

    Full Text Available The present paper has the objective of informing the public regarding the use of new techniques for the modeling, simulation and forecasting of systems from different fields of activity. One of those techniques is the Artificial Neural Network, one of the artificial intelligence techniques

  15. A Permutation Encoding Technique Applied to Genetic Algorithm ...

    African Journals Online (AJOL)

    In this paper, a permutation chromosome encoding scheme is proposed for obtaining solution to resource constrained project scheduling problem. The proposed chromosome coding method is applied to Genetic algorithm procedure and implemented through object oriented programming. The method is applied to a ...
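
    The abstract does not spell out the encoding details, so the sketch below uses a standard permutation operator, order crossover, to show how two parent activity sequences can recombine without duplicating activities; the problem-specific schedule decoder and fitness function are omitted.

    ```python
    # Order crossover (OX) on permutation chromosomes of activity indices.
    import random

    def order_crossover(p1, p2):
        """Copy a slice from parent 1, fill remaining genes in parent 2's order."""
        n = len(p1)
        a, b = sorted(random.sample(range(n), 2))
        child = [None] * n
        child[a:b] = p1[a:b]
        fill = [gene for gene in p2 if gene not in child]
        for i in range(n):
            if child[i] is None:
                child[i] = fill.pop(0)
        return child

    random.seed(3)
    parent1 = list(range(8))            # activity indices 0..7
    parent2 = random.sample(parent1, 8) # a shuffled ordering
    print(order_crossover(parent1, parent2))  # still a valid permutation
    ```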

  16. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  17. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  18. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    Science.gov (United States)

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  19. Applying decision-making techniques to Civil Engineering Projects

    Directory of Open Access Journals (Sweden)

    Fam F. Abdel-malak

    2017-12-01

    Full Text Available Multi-Criteria Decision-Making (MCDM) techniques are found to be useful tools in project managers' hands to overcome decision-making (DM) problems in Civil Engineering Projects (CEPs). The main contribution of this paper is selecting and studying popular MCDM techniques that use different and wide ranges of data types in CEPs. A detailed study, including the advantages and pitfalls of using the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order of Preference by Similarity to Ideal Solution (Fuzzy TOPSIS), is introduced. These two techniques are selected for the purpose of forming a package that covers most available data types in CEPs. The results indicated that AHP has a structure which simplifies complicated problems, while Fuzzy TOPSIS uses the advantages of linguistic variables to solve the issue of undocumented data and ill-defined problems. Furthermore, AHP is a simple technique that depends on pairwise comparisons of factors and natural attributes, and it is preferable for widely spread hierarchies. On the other hand, Fuzzy TOPSIS needs more information but works well for one-tier decision trees, and it shows more flexibility to work in fuzzy environments. The two techniques can be integrated and combined in a new module to support most of the decisions required in CEPs. Keywords: Decision-making, AHP, Fuzzy TOPSIS, CBA, Civil Engineering Projects
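
    A minimal sketch of the standard AHP priority computation referenced above: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency ratio check. The matrix entries are illustrative.

    ```python
    # Derive AHP criterion weights from a 3x3 pairwise comparison matrix.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])   # pairwise comparisons of 3 criteria

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # principal eigenvalue index
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()               # normalize to sum to 1

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
    cr = ci / 0.58                         # random index for n=3 is 0.58
    print("weights:", weights.round(3), "CR:", round(cr, 3))
    ```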

  20. Diagnostic techniques applied in geostatistics for agricultural data analysis

    Directory of Open Access Journals (Sweden)

    Joelmir André Borssoi

    2009-12-01

    Full Text Available The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool to determine the parameters that define this structure, applied to the interpolation of values at unsampled points by kriging techniques. However, the estimation of parameters can be greatly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques in Gaussian spatial linear models in geostatistics to evaluate the sensitivity of maximum likelihood and restricted maximum likelihood estimators to small perturbations in these data. For this purpose, studies with simulated and experimental data were conducted. Results with simulated data showed that the diagnostic techniques were efficient in identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information from thematic maps.

  1. Object oriented programming techniques applied to device access and control

    International Nuclear Information System (INIS)

    Goetz, A.; Klotz, W.D.; Meyer, J.

    1992-01-01

    In this paper a model, called the device server model, has been presented for solving the problem of device access and control faced by all control systems. Object Oriented Programming techniques were used to achieve a powerful yet flexible solution. The model provides a solution to the problem which hides device dependencies. It defines a software framework which has to be respected by implementors of device classes - this is very useful for developing groupware. The decision to implement remote access in the root class means that device servers can be easily integrated in a distributed control system. A lot of the advantages and features of the device server model are due to the adoption of OOP techniques. The main conclusions that can be drawn from this paper are that (1) the device access and control problem is suited to being solved with OOP techniques, and (2) OOP techniques offer a distinct advantage over traditional programming techniques for solving the device access problem. (J.P.N.)
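
    A conceptual sketch of the device server model in modern terms, assuming hypothetical class and method names rather than the paper's actual framework: a root class hides device dependencies and owns the (stubbed) remote-access entry point, while concrete device classes respect the framework.

    ```python
    # Root class defines the framework; subclasses implement device specifics.
    from abc import ABC, abstractmethod

    class DeviceServer(ABC):
        """Root class: common dispatch and (stub) remote-access entry point."""
        def __init__(self, name):
            self.name = name

        def command(self, cmd, *args):
            # In a real system this entry point would be exported over the
            # network; here it simply dispatches to a local method.
            return getattr(self, cmd)(*args)

        @abstractmethod
        def state(self): ...

    class PowerSupply(DeviceServer):
        def __init__(self, name):
            super().__init__(name)
            self._on = False

        def switch_on(self):
            self._on = True

        def state(self):
            return "ON" if self._on else "OFF"

    ps = PowerSupply("sr/ps/1")
    ps.command("switch_on")
    print(ps.command("state"))   # -> ON
    ```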

  2. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    Science.gov (United States)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climatic models in the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional Climate Models (RCMs) nested to four different Global Climate Models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution derived transformation and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose an unequally weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic statistics
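
    A sketch of one of the transformation techniques listed above, quantile mapping with empirical quantiles, on synthetic data: model output is corrected so that its distribution over the control period matches observations, and the same mapping is applied to the future series.

    ```python
    # Empirical quantile mapping for bias correction of climate model output.
    import numpy as np

    rng = np.random.default_rng(7)
    obs_ctrl = rng.gamma(2.0, 2.0, size=3000)        # observed control climate
    mod_ctrl = rng.gamma(2.0, 2.5, size=3000) + 1.0  # biased model, control period
    mod_fut = rng.gamma(2.0, 2.5, size=3000) + 2.0   # biased model, future period

    def quantile_map(x, model_ref, obs_ref):
        """Map each value to the observed value at the same empirical quantile."""
        quantiles = np.interp(x, np.sort(model_ref),
                              np.linspace(0, 1, model_ref.size))
        return np.quantile(obs_ref, quantiles)

    corrected_fut = quantile_map(mod_fut, mod_ctrl, obs_ctrl)
    print(mod_fut.mean().round(2), corrected_fut.mean().round(2))
    ```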

  3. Applying motivational interviewing techniques to palliative care communication.

    Science.gov (United States)

    Pollak, Kathryn I; Childers, Julie W; Arnold, Robert M

    2011-05-01

    Palliative care relies heavily on communication. Although some guidelines do address difficult communication, less is known about how to handle conversations with patients who express ambivalence or resistance to such care. Clinicians also struggle with how to support patient autonomy when they disagree with patient choices. Motivational Interviewing (MI) techniques may help address these responses. Specifically, MI techniques such as reflective statements and summarizing can help reduce a patient's resistance, resolve patient ambivalence, and support patient autonomy. Not all the MI techniques are applicable, however, in part because palliative care clinicians do not guide patients to make particular choices but, instead, help patients make choices that are consistent with patient values. Some elements from MI can be used to improve the quality and efficacy of palliative care conversations.

  4. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to validate a stress analysis technique based on 3D models, making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database having the idealized model obtained using ANSYS, working directly on documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (performed at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  5. Magnetic Solid Phase Extraction Applied to Food Analysis

    Directory of Open Access Journals (Sweden)

    Israel S. Ibarra

    2015-01-01

    Full Text Available Magnetic solid phase extraction has been used as a pretreatment technique for the analysis of several compounds because of its advantages when compared with classic methods. This methodology is based on the use of magnetic solids as adsorbents for the preconcentration of different analytes from complex matrices. Magnetic solid phase extraction minimizes the use of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique in food analysis.

  6. Techniques applied in design optimization of parallel manipulators

    CSIR Research Space (South Africa)

    Modungwa, D

    2011-11-01

    Full Text Available The process of optimization is a cumbersome and time-consuming endeavour, especially when the variables are diverse and the objective functions are excessively complex. Thus, several techniques devised by researchers to solve the problem are reviewed in this paper.

  7. Flash radiographic technique applied to fuel injector sprays

    International Nuclear Information System (INIS)

    Vantine, H.C.

    1977-01-01

    A flash radiographic technique, using 50 ns exposure times, was used to study the pattern and density distribution of a fuel injector spray. The experimental apparatus and method are described. An 85 kVp flash x-ray generator, designed and fabricated at the Lawrence Livermore Laboratory, is utilized. Radiographic images, recorded on standard x-ray films, are digitized and computer processed

  8. X-diffraction technique applied for nano system metrology

    International Nuclear Information System (INIS)

    Kuznetsov, Alexei Yu.; Machado, Rogerio; Robertis, Eveline de; Campos, Andrea P.C.; Archanjo, Braulio S.; Gomes, Lincoln S.; Achete, Carlos A.

    2009-01-01

    The application of nanomaterials is growing fast in all industrial sectors, creating a strong need for nanometrology and standardization in the nanomaterial area. The great potential of the X-ray diffraction technique in this field is illustrated with the examples of metals, metal oxides and pharmaceuticals

  9. Consulting with Parents: Applying Family Systems Concepts and Techniques.

    Science.gov (United States)

    Mullis, Fran; Edwards, Dana

    2001-01-01

    This article describes family systems concepts and techniques that school counselors, as consultants, can use to better understand the family system. The concepts are life cycle transitions and extrafamilial influences, extended family influences, boundaries, parental hierarchy and power, and triangulation. (Contains 39 references.) (GCP)

  10. Eddy current technique applied to automated tube profilometry

    International Nuclear Information System (INIS)

    Dobbeni, D.; Melsen, C. van

    1982-01-01

    The use of eddy current methods in the first totally automated pre-service inspection of the internal diameter of PWR steam generator tubes is described. The technique was developed at Laborelec, the Belgian Laboratory of the Electricity Supply Industry. Details are given of the data acquisition system and of the automated manipulator. Representative tube profiles are illustrated. (U.K.)

  11. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  12. Comparison of analysis techniques for electromyographic data.

    Science.gov (United States)

    Johnson, J C

    1978-01-01

    Electromyography has been effectively employed to estimate the stress encountered by muscles in performing a variety of functions in the static environment. Such analysis provides the basis for modification of a man-machine system in order to optimize the performance of individual tasks by reducing muscle stress. Myriad analysis methods have been proposed and employed to convert raw electromyographic data into numerical indices of stress and, more specifically, muscle work. However, the type of analysis technique applied to the data can significantly affect the outcome of the experiment. In this study, four methods of analysis are employed to simultaneously process electromyographic data from the flexor muscles of the forearm. The methods of analysis include: 1) integrated EMG (three separate time constants), 2) root mean square voltage, 3) peak height discrimination (three levels), and 4) turns counting (two methods). Mechanical stress inputs applied to the arm of the subjects include static load and vibration. The results of the study indicate the comparative sensitivity of each of the techniques to changes in EMG resulting from changes in static and dynamic load on the muscle.
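
    For illustration, the snippet below computes three of the compared indices (RMS voltage, integrated rectified EMG, and a simple turns count) on a synthetic one-second burst; the window sizes and thresholds are arbitrary choices, not the study's settings.

    ```python
    # Compute RMS, integrated EMG, and turns count for a synthetic EMG burst.
    import numpy as np

    rng = np.random.default_rng(42)
    fs = 1000                                               # sample rate, Hz
    emg = rng.normal(scale=0.1, size=fs) * np.hanning(fs)   # 1 s synthetic burst

    rms = np.sqrt(np.mean(emg ** 2))             # root mean square voltage
    iemg = np.trapz(np.abs(emg), dx=1 / fs)      # integrated rectified EMG

    # Turns: slope sign changes whose amplitude exceeds a small threshold.
    d = np.diff(emg)
    turns = np.sum((np.sign(d[1:]) != np.sign(d[:-1])) &
                   (np.abs(d[1:]) > 0.01))

    print(f"RMS={rms:.4f} V, iEMG={iemg:.4f} V*s, turns={turns}")
    ```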

  13. Evaluating the function of applied behavior analysis: a bibliometric analysis.

    OpenAIRE

    Critchfield, Thomas S

    2002-01-01

    Analysis of scholarly citations involving behavioral journals reveals that, consistent with its mission, applied behavior analysis research frequently references the basic behavioral literature but, as some have suspected, exerts narrow scholarly influence.

  14. Investigation of the shear bond strength to dentin of universal adhesives applied with two different techniques

    Directory of Open Access Journals (Sweden)

    Elif Yaşa

    2017-09-01

    Full Text Available Objective: The aim of this study was to evaluate the shear bond strength to dentin of universal adhesives applied with self-etch and etch&rinse techniques. Materials and Method: Forty-eight sound extracted human third molars were used in this study. Occlusal enamel was removed in order to expose the dentinal surface, and the surface was flattened. Specimens were randomly divided into four groups and were sectioned vestibulo-lingually using a diamond disc. The universal adhesives All Bond Universal (Groups 1a and 1b), Gluma Bond Universal (Groups 2a and 2b) and Single Bond Universal (Groups 3a and 3b) were applied onto the tooth specimens either with the self-etch technique (a) or with the etch&rinse technique (b) according to the manufacturers' instructions. Clearfil SE Bond (Group 4a; self-etch) and Optibond FL (Group 4b; etch&rinse) were used as control groups. The specimens were then restored with a nanohybrid composite resin (Filtek Z550). After thermocycling, the shear bond strength test was performed with a universal test machine at a crosshead speed of 0.5 mm/min. Fracture analysis was done under a stereomicroscope (×40 magnification). Data were analyzed using two-way ANOVA and post-hoc Tukey tests. Results: Statistical analysis showed significant differences in shear bond strength values between the universal adhesives (p<0.05). Significantly higher bond strength values were observed in the self-etch groups (a) in comparison to the etch&rinse groups (b) (p<0.05). Among all groups, Single Bond Universal showed the greatest shear bond strength values, whereas All Bond Universal showed the lowest shear bond strength values with both application techniques. Conclusion: Dentin bond strengths of universal adhesives applied with different techniques may vary depending on the adhesive material. For the universal bonding agents tested in this study, the etch&rinse technique negatively affected the bond strength to dentin.

  15. Machine-learning techniques applied to antibacterial drug discovery.

    Science.gov (United States)

    Durrant, Jacob D; Amaro, Rommie E

    2015-01-01

    The emergence of drug-resistant bacteria threatens to revert humanity back to the preantibiotic era. Even now, multidrug-resistant bacterial infections annually result in millions of hospital days, billions in healthcare costs, and, most importantly, tens of thousands of lives lost. As many pharmaceutical companies have abandoned antibiotic development in search of more lucrative therapeutics, academic researchers are uniquely positioned to fill the pipeline. Traditional high-throughput screens and lead-optimization efforts are expensive and labor intensive. Computer-aided drug-discovery techniques, which are cheaper and faster, can accelerate the identification of novel antibiotics, leading to improved hit rates and faster transitions to preclinical and clinical testing. The current review describes two machine-learning techniques, neural networks and decision trees, that have been used to identify experimentally validated antibiotics. We conclude by describing the future directions of this exciting field. © 2015 John Wiley & Sons A/S.
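
    A toy sketch of the decision-tree approach mentioned in the review: compounds described by numeric descriptors are classified as active or inactive. The descriptors, the activity rule, and all data below are synthetic stand-ins for real assay results.

    ```python
    # Train and score a shallow decision tree on synthetic compound descriptors.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(5)
    X = rng.normal(size=(300, 4))    # e.g. logP, weight, charge, surface area
    y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # synthetic activity rule

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", round(clf.score(X_te, y_te), 2))
    ```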

  16. [Computerized image analysis applied to urology research].

    Science.gov (United States)

    Urrutia Avisrror, M

    1994-05-01

    Diagnosis with the aid of imaging techniques in urology has developed dramatically over the last few years as a result of state-of-the-art technology that has added digital angiography to the latest generation of ultrasound apparatus. Computerized axial tomography and nuclear magnetic resonance allow very high rates of diagnostic possibility that only a decade ago were not extended to routine use. Each of these examination procedures has its own limits of sensitivity and specificity, which vary as a function of the pathoanatomical characteristics of the condition to be explored, although none yet reaches absolute values. With ultrasound, CAT and NMR, identification of the various diseases relies on the analysis of densities, although with a significant degree of examiner subjectivity in the diagnostic judgement. The logical evolution of these techniques is to eliminate such subjective components and translate the features which characterize each disease into quantifiable parameters, a challenge made feasible by computerized analysis. Thanks to technological advances in the field of microcomputers and the decreased cost of the equipment, it is currently possible for any clinical investigator with average resources to use the most sophisticated image analysis techniques for post-processing of the images obtained, opening in the scope of practical investigation a pathway that just a few years ago was exclusive to certain organizations due to the high cost involved.

  17. Applying program comprehension techniques to Karel robot programs

    OpenAIRE

    Oliveira, Nuno; Henriques, Pedro; Cruz, Daniela; Pereira, Maria João; Mernik, Marjan; Kosar, Tomaz; Crepinsek, Matej

    2009-01-01

    Abstract—In the context of program understanding, a challenging research topic is to learn how techniques and tools for the comprehension of General-Purpose Languages (GPLs) can be used or adjusted to the understanding of Domain-Specific Languages (DSLs). Since DSLs are tailored for the description of problems within a specific domain, it becomes easier to improve these tools with specific visualizations (at a higher abstraction level, closer to the problem level) in order to understand the ...

  18. Electrochemical Techniques Applied to Studies of Microbiologically Influenced Corrosion (MIC)

    Science.gov (United States)

    1992-01-01

    Electrochemical techniques have been applied to studies of microbiologically influenced corrosion (MIC). Applications include evaluation of MIC of metals exposed to seawater, fresh water, demineralized water, process chemicals, foodstuffs, soils, aircraft fuels, human plasma, and sewage. Progress can only be made if surface analytical techniques are applied

  19. Applied Data Analysis in Energy Monitoring System

    Directory of Open Access Journals (Sweden)

    Kychkin А.V.

    2016-08-01

    Full Text Available A software and hardware system organization is presented, using as an example the energy monitoring of multi-section lighting and climate control/conditioning equipment. The system's key feature is applied analysis of office energy data that allows localized work-mode recognition for each type of hardware. It is based on the general energy consumption profile, followed by evaluation of energy consumption and workload. The applied data analysis includes a primary data processing block, a smoothing filter, a time stamp identification block, clusterization and classification blocks, a state change detection block, and a statistical data calculation block. The energy consumed in a time slot and the slot time stamp are taken as the main parameters for work-mode classification. Experimental results of the applied energy data analysis, using HIL and the OpenJEVis visualization system over a chosen time period, are provided. Energy consumption and workload calculation and identification of eight different states were carried out for two lighting sections and one climate control/conditioning emulating system from the integral energy consumption profile. The research was supported by university internal grant №2016/PI-2 «Methodology development of monitoring and heat flow utilization as low potential company energy sources».
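
    A hedged sketch of the clusterization step, assuming k-means as the clustering algorithm (the abstract does not name one): time-slot energy values are grouped so that the clusters play the role of discrete work modes. The two-mode consumption data are simulated.

    ```python
    # Cluster time-slot energy values into two work modes with k-means.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(9)
    idle = rng.normal(0.2, 0.02, size=100)     # kWh per slot, idle mode
    active = rng.normal(1.5, 0.10, size=100)   # kWh per slot, active mode
    slots = np.concatenate([idle, active]).reshape(-1, 1)

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(slots)
    print("mode centers (kWh):", np.sort(km.cluster_centers_.ravel()).round(2))
    ```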

  20. Canvas and cosmos: Visual art techniques applied to astronomy data

    Science.gov (United States)

    English, Jayanne

    Bold color images from telescopes act as extraordinary ambassadors for research astronomers because they pique the public’s curiosity. But are they snapshots documenting physical reality? Or are we looking at artistic spacescapes created by digitally manipulating astronomy images? This paper provides a tour of how original black and white data, from all regimes of the electromagnetic spectrum, are converted into the color images gracing popular magazines, numerous websites, and even clothing. The history and method of the technical construction of these images is outlined. However, the paper focuses on introducing the scientific reader to visual literacy (e.g. human perception) and techniques from art (e.g. composition, color theory) since these techniques can produce not only striking but politically powerful public outreach images. When created by research astronomers, the cultures of science and visual art can be balanced and the image can illuminate scientific results sufficiently strongly that the images are also used in research publications. Included are reflections on how they could feedback into astronomy research endeavors and future forms of visualization as well as on the relevance of outreach images to visual art. (See the color online PDF version at http://dx.doi.org/10.1142/S0218271817300105; the figures can be enlarged in PDF viewers.)

  1. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    In this thesis we explore the use of functional data analysis as a method to analyse chemometric data, more specifically spectral data in metabolomics. Functional data analysis is a vibrant field in statistics. It has been rapidly expanding in both methodology and applications since it was made well... nutritional status and metabolic phenotype. We want to understand how metabolomic spectra can be analysed using functional data analysis to detect the influence of different factors on specific metabolites. These factors can include, for example, gender, diet culture or dietary intervention. In Paper I we apply... of functional confidence intervals for mean curves. We also discuss the many practical considerations in wavelet estimation and thresholding, and the important influence the choices can have on the resulting estimates. On a conceptual level, the purpose of this thesis is to build a stronger connection between...

  2. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  3. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    Science.gov (United States)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think pair share and take-home point summaries. Summative assessment included discussion leadership, exams, homeworks, group projects, in-class exercises, field trips, and pre-discussion reading exercises

  4. Neoliberal Optimism: Applying Market Techniques to Global Health.

    Science.gov (United States)

    Mei, Yuyang

    2017-01-01

    Global health and neoliberalism are becoming increasingly intertwined as organizations utilize markets and profit motives to solve the traditional problems of poverty and population health. I use field work conducted over 14 months in a global health technology company to explore how the promise of neoliberalism re-envisions humanitarian efforts. In this company's vaccine refrigerator project, staff members expect their investors and their market to allow them to achieve scale and develop accountability to their users in developing countries. However, the translation of neoliberal techniques to the global health sphere falls short of the ideal, as profits are meager and purchasing power remains with donor organizations. The continued optimism in market principles amidst such a non-ideal market reveals the tenacious ideological commitment to neoliberalism in these global health projects.

  5. Interferogram analysis using Fourier transform techniques

    Science.gov (United States)

    Roddier, Claude; Roddier, Francois

    1987-01-01

    A method of interferogram analysis is described in which Fourier transform techniques are used to map the complex fringe visibility in several types of interferograms. Algorithms are developed for estimation of both the amplitude and the phase of the fringes (yielding the modulus and the phase of the holographically recorded object Fourier transform). The algorithms were applied to the reduction of interferometric seeing measurements (i.e., the estimation of the fringe amplitude only), and the reduction of interferometric tests (i.e., estimation of the fringe phase only). The method was used to analyze scatter-plate interferograms obtained at NOAO.
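
    The abstract does not give the algorithm in full, so the sketch below follows the standard Fourier-transform fringe-analysis recipe (isolate one sideband in the spectrum, inverse-transform, take the phase) on a synthetic fringe pattern; the carrier frequency and phase profile are invented.

    ```python
    # Recover the fringe phase of a 1-D interferogram via the FFT sideband.
    import numpy as np

    n = 512
    x = np.arange(n)
    phase = 2 * np.pi * 0.002 * (x - n / 2) ** 2 / n   # synthetic wavefront error
    carrier = 2 * np.pi * 0.1 * x                      # linear fringe carrier
    fringes = 1 + 0.8 * np.cos(carrier + phase)        # recorded intensity

    spectrum = np.fft.fft(fringes)
    spectrum[:25] = 0          # suppress the DC term and low frequencies
    spectrum[n // 2:] = 0      # keep the positive-frequency sideband only
    analytic = np.fft.ifft(spectrum)
    recovered = np.unwrap(np.angle(analytic)) - carrier

    # Compare away from the edges, up to an arbitrary constant offset.
    mid = slice(50, -50)
    err = (recovered[mid] - recovered[mid].mean()) - (phase[mid] - phase[mid].mean())
    print("rms phase error (rad):", round(float(np.sqrt(np.mean(err ** 2))), 3))
    ```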

  6. Dust tracking techniques applied to the STARDUST facility: First results

    Energy Technology Data Exchange (ETDEWEB)

    Malizia, A., E-mail: malizia@ing.uniroma2.it [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Camplani, M. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Gelfusa, M. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Lupelli, I. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); EURATOM/CCFE Association, Culham Science Centre, Abingdon (United Kingdom); Richetta, M.; Antonelli, L.; Conetta, F.; Scarpellini, D.; Carestia, M.; Peluso, E.; Bellecci, C. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Salgado, L. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Video Processing and Understanding Laboratory, Universidad Autónoma de Madrid (Spain); Gaudio, P. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy)

    2014-10-15

    Highlights: •Use of an experimental facility, STARDUST, to analyze the dust resuspension problem inside the tokamak in case of a loss of vacuum accident. •PIV technique implementation to track the dust during a LOVA reproduction inside STARDUST. •Data imaging techniques to analyze the dust velocity field: first results and data discussion. -- Abstract: An important issue related to future nuclear fusion reactors fueled with deuterium and tritium is the creation of large amounts of dust due to several mechanisms (disruptions, ELMs and VDEs). The dust size expected in nuclear fusion experiments (such as ITER) is in the order of microns (between 0.1 and 1000 μm). Almost the total amount of this dust remains in the vacuum vessel (VV). This radiological dust can re-suspend in case of LOVA (loss of vacuum accident) and these phenomena can cause explosions and serious damage to the health of the operators and to the integrity of the device. The authors have developed a facility, STARDUST, in order to reproduce thermo-fluid-dynamic conditions comparable to those expected inside the VV of the next generation of experiments, such as ITER, in case of LOVA. The dust used inside the STARDUST facility presents particle sizes and physical characteristics comparable with those created inside the VV of nuclear fusion experiments. In this facility an experimental campaign has been conducted with the purpose of tracking the dust re-suspended at low pressurization rates (comparable to those expected in case of LOVA in ITER and suggested by the General Safety and Security Report ITER-GSSR) using a fast camera with a frame rate from 1000 to 10,000 images per second. The velocity fields of the mobilized dust are derived from the imaging of a two-dimensional slice of the flow illuminated by an optically adapted laser beam. The aim of this work is to demonstrate the possibility of dust tracking by means of image processing, with the objective of determining the velocity field values
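
    A minimal PIV-style sketch of the dust-tracking step: the displacement of the particle pattern between two frames is estimated from the peak of their 2-D cross-correlation. The frames are synthetic, and a real analysis would tile the image into interrogation windows to obtain a full velocity field.

    ```python
    # Estimate a frame-to-frame displacement via 2-D cross-correlation.
    import numpy as np
    from scipy.signal import correlate2d

    rng = np.random.default_rng(2)
    frame1 = rng.random((32, 32))                        # particle pattern
    frame2 = np.roll(frame1, shift=(3, 5), axis=(0, 1))  # known dust motion

    corr = correlate2d(frame2 - frame2.mean(),
                       frame1 - frame1.mean(), mode="same")
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    print("displacement (rows, cols):", dy - center[0], dx - center[1])  # 3 5
    ```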

  7. Strategic decision analysis applied to borehole seismology

    International Nuclear Information System (INIS)

    Menke, M.M.; Paulsson, B.N.P.

    1994-01-01

    Strategic Decision Analysis (SDA) is the evolving body of knowledge on how to achieve high quality in the decisions that shape an organization's future. SDA comprises philosophy, process concepts, methodology, and tools for making good decisions. It specifically incorporates many concepts and tools from economic evaluation and risk analysis. Chevron Petroleum Technology Company (CPTC) has applied SDA to evaluate and prioritize a number of its most important and most uncertain R and D projects, including borehole seismology. Before SDA, there were significant issues and concerns about the value to CPTC of continuing to work on borehole seismology. The SDA process created a cross-functional team of experts to structure and evaluate this project. A credible economic model was developed, discrete risks and continuous uncertainties were assessed, and an extensive sensitivity analysis was performed. The results, even applied to a very restricted drilling program for a few years, were good enough to demonstrate the value of continuing the project. This paper explains the SDA philosophy, concepts, and process, and demonstrates the methodology and tools using the borehole seismology project example. SDA is useful in the upstream industry not just in R and D/technology decisions, but also in major exploration and production decisions. Since a major challenge for upstream companies today is to create and realize value, the SDA approach should have very broad applicability

  8. Optical Trapping Techniques Applied to the Study of Cell Membranes

    Science.gov (United States)

    Morss, Andrew J.

    Optical tweezers allow manipulation of micron-sized objects using pN-level optical forces. In this work, we use an optical trapping setup to aid in three separate experiments, all related to the physics of the cellular membrane. In the first experiment, in conjunction with Brian Henslee, we use optical tweezers to allow precise positioning and control of cells in suspension to evaluate the cell-size dependence of electroporation. Theory predicts that all cells porate at a transmembrane potential V_TM of roughly 1 V. The Schwan equation predicts that the transmembrane potential depends linearly on the cell radius r, thus predicting that cells should porate at threshold electric fields that go as 1/r. The threshold field required to induce poration is determined by applying a low-voltage pulse to the cell and then applying additional pulses of greater and greater magnitude, checking for poration at each step using propidium iodide dye. We find that, contrary to expectations, cells do not porate at a constant value of the transmembrane potential but at a constant value of the electric field, which we find to be 692 V/cm for K562 cells. Delivering precise dosages of nanoparticles into cells is important for assessing the toxicity of nanoparticles and for genetic research. In the second experiment, we conduct nanoelectroporation, a novel method of applying precise doses of transfection agents to cells, by using optical tweezers in conjunction with a confocal microscope to manipulate cells into contact with 100 nm wide nanochannels. This work was done in collaboration with Pouyan Boukany of Dr. Lee's group. The small cross-sectional area of these nanochannels means that the electric field within them is extremely large, 60 MV/m, which allows them to electrophoretically drive transfection agents into the cell. We find that nanoelectroporation results in excellent dose control (to within 10% in our experiments) compared to bulk electroporation. We also find that
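
    For context, the steady-state Schwan relation invoked above can be written in its standard textbook form (supplied here for orientation, not quoted from the thesis):

        V_{TM} = \tfrac{3}{2}\, r\, E \cos\theta ,

    so a fixed poration threshold V_{TM}^{*} would imply a threshold field

        E^{*} = \frac{2\, V_{TM}^{*}}{3\, r \cos\theta} \;\propto\; \frac{1}{r} ,

    which is precisely the 1/r scaling that the experiment tests and, for K562 cells, rejects.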

  9. Schlieren Technique Applied to Magnetohydrodynamic Generator Plasma Torch

    Science.gov (United States)

    Chopra, Nirbhav; Pearcy, Jacob; Jaworski, Michael

    2017-10-01

    Magnetohydrodynamic (MHD) generators are a promising augmentation to current hydrocarbon-based combustion schemes for creating electrical power. In recent years, interest in MHD generators has been revitalized due to advances in a number of technologies such as superconducting magnets, solid-state power electronics and materials science, as well as changing economics associated with carbon capture, utilization, and sequestration. We use a multi-wavelength schlieren imaging system to evaluate electron density independently of gas density in a plasma torch under conditions relevant to MHD generators. The sensitivity and resolution of the optical system are evaluated alongside the development of an automated analysis and calibration program in Python. Preliminary analysis shows spatial resolutions of less than 1 mm and measures an electron density of n_e = 1x10^16 cm^-3 in an atmospheric microwave torch. Work supported by DOE contract DE-AC02-09CH11466.
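
    The reason several wavelengths can separate electron density from gas density is that the two refractivity contributions scale differently with wavelength; in standard textbook form (assumed here, not taken from the abstract):

        n(\lambda) - 1 \;\approx\; K\rho \;-\; \frac{e^{2}\,\lambda^{2}}{8\pi^{2}\varepsilon_{0} m_{e} c^{2}}\, n_{e} ,

    where the Gladstone-Dale term Kρ is nearly wavelength independent while the free-electron term grows as λ². Deflection measurements at two or more wavelengths therefore give independent equations for the unknowns ρ and n_e.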

  10. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
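
    As a toy illustration of the alarm-filtering idea described above (the rule table and alarm tags are invented for this sketch, not taken from the paper):

        # Toy rule base: each alarm that is a known consequence of another
        # alarm points to its root cause.
        CONSEQUENCE_OF = {
            "LOW_FLOW_LOOP_A": "PUMP_TRIP_A",    # low flow follows a pump trip
            "HIGH_TEMP_LOOP_A": "PUMP_TRIP_A",   # so does a temperature rise
        }

        def filter_alarms(active):
            """Keep only alarms whose known cause is not itself active."""
            return {a for a in active if CONSEQUENCE_OF.get(a) not in active}

        print(filter_alarms({"PUMP_TRIP_A", "LOW_FLOW_LOOP_A", "HIGH_TEMP_LOOP_A"}))
        # -> {'PUMP_TRIP_A'}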

  11. Time-resolved infrared spectroscopic techniques as applied to Channelrhodopsin

    Directory of Open Access Journals (Sweden)

    Eglof Ritter

    2015-07-01

    Full Text Available Among optogenetic tools, channelrhodopsins, the light-gated ion channels of the plasma membrane from green algae, play the most important role. Properties like channel selectivity, timing parameters or color can be influenced by the exchange of selected amino acids. Although widely used, in the field of neurosciences for example, there is still little known about their photocycles and the mechanisms of ion channel gating and conductance. One of the preferred methods for these studies is infrared spectroscopy, since it allows observation of proteins and their function at a molecular level and in a near-native environment. The absorption of a photon in channelrhodopsin leads to retinal isomerization within femtoseconds, the conductive states are reached on the microsecond time scale, and the return to the fully dark-adapted state may take more than minutes. To cover all these time regimes, a range of different spectroscopic approaches is necessary. This mini-review focuses on time-resolved applications of the infrared technique to study channelrhodopsins and other light-triggered proteins. We discuss the approaches with respect to their suitability for the investigation of channelrhodopsin and related proteins.

  12. [Applying DNA barcoding technique to identify menthae haplocalycis herba].

    Science.gov (United States)

    Pang, Xiaohui; Xu, Haibin; Han, Jianping; Song, Jingyuan

    2012-04-01

    To identify Menthae Haplocalycis Herba and its closely related species using the DNA barcoding technique, total genomic DNA was isolated from Mentha canadensis and its closely related species. Nuclear DNA ITS2 sequences were amplified, and the purified PCR products were sequenced. Sequence assembly and consensus sequence generation were performed using CodonCode Aligner V3.0. The Kimura 2-Parameter (K2P) distances were calculated using the software MEGA 5.0. Identification analyses were performed using the BLAST1, nearest distance and neighbor-joining (NJ) methods. The intra-specific genetic distances of M. canadensis ranged from 0 to 0.006, which were lower than the inter-specific genetic distances between M. canadensis and its closely related species (0.071-0.231). All three methods showed that ITS2 could discriminate M. canadensis from its closely related species correctly. The ITS2 region is an efficient barcode for the identification of Menthae Haplocalycis Herba, which provides a scientific basis for fast and accurate identification of the herb.
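
    The K2P distance used above has a simple closed form that is easy to compute directly; a small Python sketch (the toy sequences are invented):

        from math import log

        PURINES = {"A", "G"}

        def k2p_distance(seq1, seq2):
            """Kimura 2-parameter distance between two aligned sequences.

            P = proportion of transitions, Q = proportion of transversions;
            d = -0.5*ln(1 - 2P - Q) - 0.25*ln(1 - 2Q).
            """
            pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
                     if a in "ACGT" and b in "ACGT"]     # skip gaps/ambiguities
            transitions = transversions = 0
            for a, b in pairs:
                if a == b:
                    continue
                if (a in PURINES) == (b in PURINES):
                    transitions += 1                     # A<->G or C<->T
                else:
                    transversions += 1
            p = transitions / len(pairs)
            q = transversions / len(pairs)
            return -0.5 * log(1 - 2*p - q) - 0.25 * log(1 - 2*q)

        print(round(k2p_distance("ACGTACGTAC", "ACGTACATAC"), 4))   # -> 0.1116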

  13. A Kalman filter technique applied for medical image reconstruction

    International Nuclear Information System (INIS)

    Goliaei, S.; Ghorshi, S.; Manzuri, M. T.; Mortazavi, M.

    2011-01-01

    Medical images contain information about vital organic tissues inside the human body and are widely used for the diagnosis of disease and for surgical purposes. Image reconstruction is essential in medical imaging for applications such as the suppression of noise or de-blurring, in order to provide images with better quality and contrast. Due to the vital role of image reconstruction in the medical sciences, algorithms with better efficiency and higher speed are desirable. Most image reconstruction algorithms operate in the frequency domain; the most popular is filtered back-projection. In this paper we introduce a Kalman filter technique which operates in the time domain for medical image reconstruction. Results indicated that, for both the normally collected ray sums and the collected ray sums corrupted by noise, the quality of the reconstructed image improves in terms of contrast and transparency as the number of projections increases. It is also seen that as the number of projections increases, the error index decreases.
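
    The abstract does not spell out the filter formulation, so the sketch below shows only the generic scalar Kalman predict/update recursion on which such a time-domain reconstruction would rest; the random-walk state model and noise variances are illustrative assumptions:

        import numpy as np

        def kalman_1d(measurements, q=1e-4, r=0.05**2):
            """Generic scalar Kalman filter (predict/update recursion).

            q -- process-noise variance, r -- measurement-noise variance.
            """
            x, p = 0.0, 1.0                  # initial state estimate and variance
            estimates = []
            for z in measurements:
                p = p + q                    # predict (random-walk state model)
                k = p / (p + r)              # Kalman gain
                x = x + k * (z - x)          # update with the innovation
                p = (1 - k) * p
                estimates.append(x)
            return np.array(estimates)

        noisy = 1.0 + 0.05 * np.random.randn(200)   # constant signal + noise
        print(kalman_1d(noisy)[-1])                  # converges toward 1.0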

  14. Applying inversion techniques to understanding nucleus-nucleus potentials

    International Nuclear Information System (INIS)

    Mackintosh, R.S.; Cooper, S.G.

    1996-01-01

    The iterative-perturbative (IP) inversion algorithm makes it possible to determine, essentially uniquely, the complex potential, including the spin-orbit component, for spin-half particles given the elastic scattering S-matrix S_lj. We here report an extension of the method to the determination of energy-dependent potentials V(r,E) defined over an energy range for which S_lj(E) are provided. This is a natural development of the IP algorithm, which has previously been applied to fixed-energy, fixed-partial-wave and the intermediate mixed case inversion. The energy range can include negative energies, i.e. V(r,E) can reproduce bound state energies. It can also fit the effective range parameter for low energy scattering. We briefly define the classes of cases which can be studied, outline the IP method itself and briefly review the range of applications. We show the power of the method by presenting the nucleon-α V(r,E) for S_lj(E) derived from experiments above and below the inelastic threshold and relating them to V(r,E) inverted from S_lj(E) for RGM theory. Reference is given to the code IMAGO which embodies the IP algorithm. (author). 38 refs., 5 figs., 4 tabs

  15. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levi for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques
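
    The sensitivity argument above rests on the standard activation relation (a textbook form, included here for orientation):

        A \;=\; N\,\sigma\,\varphi\,\bigl(1 - e^{-\lambda t_{irr}}\bigr)\, e^{-\lambda t_{d}} ,

    where N is the number of target nuclei, σ the activation cross-section, φ the neutron flux, λ the decay constant of the activation product, t_irr the irradiation time and t_d the decay time before counting. The induced activity A, and hence the sensitivity, grows directly with the flux φ, which is why high-flux reactors perform best.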

  16. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    studies the `unique chemical fingerprints' (Daviss, 2005) that cellular processes create in living systems. Metabolomics is used to study the influence of nutrition on the human metabolome. Nutritional metabolomics shows great potential for the discovery of novel biomarkers of food consumption, personal...... nutritional status and metabolic phenotype. We want to understand how metabolomic spectra can be analysed using functional data analysis to detect the influence of different factors on specific metabolites. These factors can include, for example, gender, diet culture or dietary intervention. In Paper I we apply...... care, including personalised nutrition for prevention and treatment....

  17. Comparison between different techniques applied to quartz CPO determination in granitoid mylonites

    Science.gov (United States)

    Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Wenk, Hans-Rudolph; Pezzino, Antonino; Goswami, Shalini; Mamtani, Manish

    2016-04-01

    Since the second half of the last century, several techniques have been adopted to resolve the crystallographic preferred orientation (CPO) of the major minerals constituting crustal and mantle rocks. To this aim, many efforts have been made to increase the accuracy of the analytical devices as well as to progressively reduce the time needed to perform microstructural analysis. It is worth noting that many of these microstructural studies deal with quartz CPO because of the wide occurrence of this mineral phase in crustal rocks as well as its quite simple chemical composition. In the present work, four different techniques were applied to determine the CPOs of dynamically recrystallized quartz domains from naturally deformed rocks collected from a ductile crustal-scale shear zone, in order to compare their advantages and limitations. The selected Alpine shear zone is located in the Aspromonte Massif (Calabrian Peloritani Orogen, southern Italy) and involves granitoid lithotypes. The adopted methods span from the "classical" universal stage (US), to the image analysis technique (CIP), to electron back-scattered diffraction (EBSD), and to time-of-flight neutron diffraction (TOF). When compared, the bulk texture pole figures obtained by means of these different techniques show a good correlation. Advances in the analytical techniques used for microstructural investigations are outlined by discussing the quartz CPO results presented in this study.

  18. Applying data mining techniques to improve diagnosis in neonatal jaundice

    Directory of Open Access Journals (Sweden)

    Ferreira Duarte

    2012-12-01

    Full Text Available Abstract Background: Hyperbilirubinemia is emerging as an increasingly common problem in newborns due to a decreasing hospital length of stay after birth. Jaundice is the most common disease of the newborn and, although benign in most cases, it can lead to severe neurological consequences if poorly evaluated. In different areas of medicine, data mining has contributed to improving the results obtained with other methodologies. Hence, the aim of this study was to improve the diagnosis of neonatal jaundice with the application of data mining techniques. Methods: This study followed the different phases of the Cross Industry Standard Process for Data Mining model as its methodology. This observational study was performed at the Obstetrics Department of a central hospital (Centro Hospitalar Tâmega e Sousa – EPE), from February to March of 2011. A total of 227 healthy newborn infants with 35 or more weeks of gestation were enrolled in the study. Over 70 variables were collected and analyzed. Also, transcutaneous bilirubin levels were measured from birth to hospital discharge with maximum time intervals of 8 hours between measurements, using a noninvasive bilirubinometer. Different attribute subsets were used to train and test classification models using algorithms included in the Weka data mining software, such as decision trees (J48) and neural networks (multilayer perceptron). The accuracy results were compared with those of the traditional methods for prediction of hyperbilirubinemia. Results: The application of different classification algorithms to the collected data allowed predicting subsequent hyperbilirubinemia with high accuracy. In particular, at 24 hours of life of newborns, the accuracy for the prediction of hyperbilirubinemia was 89%. The best results were obtained using the following algorithms: naive Bayes, multilayer perceptron and simple logistic. Conclusions: The findings of our study sustain that new approaches, such as data mining, may support
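
    A minimal scikit-learn analogue of the workflow described (the study used Weka's J48 and multilayer perceptron; the feature set and labels below are synthetic stand-ins, not the study's data):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        # Hypothetical features at 24 h of life: gestational age (weeks),
        # transcutaneous bilirubin (mg/dL) and birth weight (kg).
        X = np.column_stack([rng.normal(38, 1.5, 227),
                             rng.normal(6, 2, 227),
                             rng.normal(3.2, 0.4, 227)])
        y = (X[:, 1] + rng.normal(0, 1, 227) > 7).astype(int)   # toy outcome label

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        for model in (DecisionTreeClassifier(max_depth=4), GaussianNB()):
            print(type(model).__name__, round(model.fit(X_tr, y_tr).score(X_te, y_te), 2))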

  19. Applying perceptual and adaptive learning techniques for teaching introductory histopathology

    Directory of Open Access Journals (Sweden)

    Sally Krasne

    2013-01-01

    Full Text Available Background: Medical students are expected to master the ability to interpret histopathologic images, a difficult and time-consuming process. A major problem is the issue of transferring information learned from one example of a particular pathology to a new example. Recent advances in cognitive science have identified new approaches to address this problem. Methods: We adapted a new approach for enhancing pattern recognition of basic pathologic processes in skin histopathology images that utilizes perceptual learning techniques, allowing learners to see relevant structure in novel cases, along with adaptive learning algorithms that space and sequence the different categories (e.g. diagnoses) that appear during a learning session based on each learner's accuracy and response time (RT). We developed a perceptual and adaptive learning module (PALM) that utilized 261 unique images of cell injury, inflammation, neoplasia, or normal histology at low and high magnification. Accuracy and RT were tracked and integrated into a "Score" that reflected students' rapid recognition of the pathologies, and pre- and post-tests were given to assess the effectiveness. Results: Accuracy, RT and Scores significantly improved from the pre- to post-test, with Scores showing much greater improvement than accuracy alone. Delayed post-tests with previously unseen cases, given after 6-7 weeks, showed a decline in accuracy relative to the post-test for 1st-year students, but not significantly so for 2nd-year students. However, the delayed post-test scores maintained a significant and large improvement relative to those of the pre-test for both 1st- and 2nd-year students, suggesting good retention of pattern recognition. Student evaluations were very favorable. Conclusion: A web-based learning module based on the principles of cognitive science showed evidence of improved recognition of histopathology patterns by medical students.

  20. Thermal analysis applied to irradiated propolis

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; Mastro, N.L. del E-mail: nelida@usp.br

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, of great importance for technological applications. Ground propolis samples were 60Co gamma-irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600 deg. C. Similarly, through differential scanning calorimetry, a coincidence of the melting points of irradiated and unirradiated samples was found. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  1. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  2. Efficient combination of acceleration techniques applied to high frequency methods for solving radiation and scattering problems

    Science.gov (United States)

    Lozano, Lorena; Algar, Ma Jesús; García, Eliseo; González, Iván; Cátedra, Felipe

    2017-12-01

    An improved ray-tracing method applied to high-frequency techniques such as the Uniform Theory of Diffraction (UTD) is presented. The main goal is to increase the speed of the analysis of complex structures while considering a vast number of observation directions and taking into account multiple bounces. The method is based on a combination of the Angular Z-Buffer (AZB), the Space Volumetric Partitioning (SVP) algorithm and the A∗ heuristic search method to treat multiple bounces. In addition, a Master Point strategy was developed to efficiently analyze a large number of near-field points or far-field directions. This technique can be applied to electromagnetic radiation problems, scattering analysis, propagation in urban or indoor environments and the mutual coupling between antennas. Due to its efficiency, it is suitable for studying the radiation patterns of large antennas and even their interactions with complex environments, including satellites, ships, aircraft, cities or other electrically large complex bodies. The new technique proves extremely efficient in these applications, even when considering multiple bounces.
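
    The A∗ component is the easiest piece to sketch in isolation; below is a compact, generic implementation on a toy grid (the grid and unit step costs are stand-ins for the paper's ray-path search, not its geometry engine):

        import heapq

        def a_star(start, goal, neighbors, heuristic):
            """Return the lowest-cost path from start to goal, or None."""
            frontier = [(heuristic(start, goal), 0.0, start, [start])]
            best_g = {start: 0.0}
            while frontier:
                f, g, node, path = heapq.heappop(frontier)
                if node == goal:
                    return path
                for nxt, step in neighbors(node):
                    g2 = g + step
                    if g2 < best_g.get(nxt, float("inf")):
                        best_g[nxt] = g2
                        heapq.heappush(frontier,
                                       (g2 + heuristic(nxt, goal), g2, nxt, path + [nxt]))
            return None

        # 4-connected 5x5 grid example with an admissible Manhattan heuristic.
        def neighbors(p):
            x, y = p
            return [((x+dx, y+dy), 1.0) for dx, dy in ((1,0),(-1,0),(0,1),(0,-1))
                    if 0 <= x+dx < 5 and 0 <= y+dy < 5]

        manhattan = lambda a, b: abs(a[0]-b[0]) + abs(a[1]-b[1])
        print(a_star((0, 0), (4, 4), neighbors, manhattan))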

  3. Microfluidic Electronic Tongue Applied to Soil Analysis

    Directory of Open Access Journals (Sweden)

    Maria L. Braunger

    2017-04-01

    Full Text Available Precision agriculture is crucial for increasing food output without expanding the cultivable area, which requires sensors to be deployed for controlling the level of nutrients in the soil. In this paper, we report on a microfluidic electronic tongue (e-tongue) based on impedance measurements that is capable of distinguishing soil samples enriched with plant macronutrients. The e-tongue setup consisted of an array of sensing units made with layer-by-layer films deposited onto gold interdigitated electrodes. Significantly, the sensing units could be reused with adequate reproducibility after a simple washing procedure, indicating that there is no cross-contamination across three independent sets of measurements. A high performance was achieved by treating the capacitance data with the multidimensional projection techniques Principal Component Analysis (PCA), Interactive Document Map (IDMAP), and Sammon's Mapping. While optimized performance was demonstrated with IDMAP and feature selection, using data from a limited frequency range, the distinction of all soil samples was also possible with the well-established PCA for measurements at a single frequency. The successful use of a simple microfluidic e-tongue for soil analysis paves the way for enhanced tools to support precision agriculture.

  4. Tissue Microarray Analysis Applied to Bone Diagenesis.

    Science.gov (United States)

    Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato

    2017-01-04

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered.

  5. Applied Analysis at MGIMO-University

    Directory of Open Access Journals (Sweden)

    A. A. Orlov

    2014-01-01

    Full Text Available Applied analysis of international relations began to take shape at MGIMO-University in the 1970s. This kind of research always attracted considerable interest from the Ministry of Foreign Affairs of the USSR and other executive institutions of the government, and received their support. The Ministry of Foreign Affairs initiated the creation of a special unit at MGIMO - the Problem Research Laboratory of Systems Analysis in International Relations. The Laboratory used system analysis and quantitative methods to produce scientific information for decision-makers, helping them make "more informed decisions in the field of international relations in order to reduce the level of uncertainty in the assessment of the expected impact of these decisions". In 2004, the successor to the Problem Laboratory, the Center for International Studies, was transformed into a Research Coordination Council for International Studies, which in 2009 handed its functions to the Institute of International Studies. In comparison with previous periods, the Institute of International Studies has significantly increased its research output for the Ministry of Foreign Affairs. It has also moved functionally beyond its institutional boundaries and produces unclassified research on public offer. It also serves as a venue for lively public discussions among IR specialists. The Institute of International Studies has gained international recognition as well: the "Go to think tanks" international ranking produced annually at the University of Pennsylvania has placed MGIMO-University 10th in the category of university-based think tanks.

  6. Multivariate analysis applied to tomato hybrid production.

    Science.gov (United States)

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components in both environments studied are the same, although the relative importance of each of them varies within the principal components. By combining the information supplied by multivariate analysis with the mode of inheritance of the characters, crosses among cultivars can be designed that will produce heterotic hybrids showing characters within previously established limits.

  7. Energy analysis applied to uranium resource estimation

    International Nuclear Information System (INIS)

    Mortimer, N.D.

    1980-01-01

    It is pointed out that fuel prices and ore costs are interdependent, and that in estimating ore costs (involving the cost of fuels used to mine and process the uranium) it is necessary to take into account the total use of energy by the entire fuel system, through the technique of energy analysis. The subject is discussed, and illustrated with diagrams, under the following heads: estimate of how total workable resources would depend on production costs; sensitivity of nuclear electricity prices to ore costs; variation of net energy requirement with ore grade for a typical PWR reactor design; variation of average fundamental cost of nuclear electricity with ore grade; variation of cumulative uranium resources with current maximum ore costs. (U.K.)

  8. Algorithms Design Techniques and Analysis

    CERN Document Server

    Alsuwaiyel, M H

    1999-01-01

    Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on one's own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires an understanding of various algorithm design techniques, and of how and when to use them to formulate solutions and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm desi

  9. Machine learning techniques applied to the determination of road suitability for the transportation of dangerous substances.

    Science.gov (United States)

    Matías, J M; Taboada, J; Ordóñez, C; Nieto, P G

    2007-08-17

    This article describes a methodology to model the degree of remedial action required to make short stretches of a roadway suitable for dangerous goods transport (DGT), particularly pollutant substances, using different variables associated with the characteristics of each segment. Thirty-one factors determining the impact of an accident on a particular stretch of road were identified and subdivided into two major groups: accident probability factors and accident severity factors. Given the number of factors determining the state of a particular road segment, the only viable statistical methods for implementing the model were machine learning techniques, such as multilayer perceptron networks (MLPs), classification trees (CARTs) and support vector machines (SVMs). The results produced by these techniques on a test sample were more favourable than those produced by traditional discriminant analysis, irrespective of whether dimensionality reduction techniques were applied. The best results were obtained using SVMs specifically adapted to ordinal data. This technique takes advantage of the ordinal information contained in the data without penalising the computational load. Furthermore, the technique permits the estimation of the utility function that is latent in expert knowledge.
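
    One standard way to adapt binary SVMs to ordinal targets is the Frank and Hall threshold decomposition, sketched below; the paper's own ordinal SVM may differ, and the road-segment features and three-level remedial-action labels here are synthetic:

        import numpy as np
        from sklearn.svm import SVC

        class OrdinalSVM:
            """Train one P(y > k) binary SVM per threshold k of an ordinal scale."""
            def fit(self, X, y):
                self.levels = np.sort(np.unique(y))
                self.models = {k: SVC(probability=True, random_state=0)
                                    .fit(X, (y > k).astype(int))
                               for k in self.levels[:-1]}
                return self

            def predict(self, X):
                # Recover P(y = k) from the cumulative P(y > k) estimates,
                # then pick the most probable level for each sample.
                pg = {k: m.predict_proba(X)[:, 1] for k, m in self.models.items()}
                probs = []
                for i, k in enumerate(self.levels):
                    lo = pg[self.levels[i - 1]] if i > 0 else 1.0
                    hi = pg[k] if k in pg else 0.0
                    probs.append(lo - hi)
                return self.levels[np.argmax(np.vstack(probs), axis=0)]

        rng = np.random.default_rng(1)
        X = rng.normal(size=(300, 5))                           # 5 synthetic factors
        y = np.digitize(X[:, 0] + 0.3 * X[:, 1], [-0.5, 0.5])   # ordinal labels 0,1,2
        print((OrdinalSVM().fit(X, y).predict(X) == y).mean())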

  10. 31 CFR 205.11 - What requirements apply to funding techniques?

    Science.gov (United States)

    2010-07-01

    31 CFR 205.11 - What requirements apply to funding techniques? Money and Finance: Treasury; Regulations Relating to Money and Finance; Treasury-State Agreement. § 205.11 What requirements apply to funding techniques? (a) A State and a Federal...

  11. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)
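
    For independent basic events, the quantitative core of such a package reduces to evaluating AND/OR gates over failure probabilities; a toy evaluator follows (the gate structure and probabilities are invented for illustration, not HIFAR data):

        from functools import reduce

        def AND(*ps):   # gate fails only if all inputs fail
            return reduce(lambda a, b: a * b, ps)

        def OR(*ps):    # gate fails if any input fails
            return 1 - reduce(lambda a, b: a * (1 - b), ps, 1)

        # Hypothetical fragment: two redundant valves (AND) in series with a
        # shared actuation-signal failure (OR).
        p_valve_a, p_valve_b, p_signal = 1e-2, 1e-2, 1e-3
        p_top = OR(AND(p_valve_a, p_valve_b), p_signal)
        print(f"top event probability = {p_top:.2e}")   # ~1.10e-03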

  12. Analysis of the interaction between experimental and applied behavior analysis.

    Science.gov (United States)

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.

  13. Case study: how to apply data mining techniques in a healthcare data warehouse.

    Science.gov (United States)

    Silver, M; Sakata, T; Su, H C; Herman, C; Dolins, S B; O'Shea, M J

    2001-01-01

    Healthcare provider organizations are faced with rising financial pressures. Both administrators and physicians need help analyzing large volumes of clinical and financial data when making decisions. To assist them, Rush-Presbyterian-St. Luke's Medical Center and Hitachi America, Ltd. (HAL), Inc., have partnered to build an enterprise data warehouse and perform a series of case study analyses. This article focuses on one analysis, which was performed by a team of physicians and computer science researchers, using a commercially available on-line analytical processing (OLAP) tool in conjunction with proprietary data mining techniques developed by HAL researchers. The initial objective of the analysis was to discover how to use data mining techniques to make business decisions that can influence cost, revenue, and operational efficiency while maintaining a high level of care. Another objective was to understand how to apply these techniques appropriately and to find a repeatable method for analyzing data and finding business insights. The process used to identify opportunities and effect changes is described.

  14. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different possible network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
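
    A small example of the kind of passing-network metrics the book covers, using networkx (the players and pass counts are made up for illustration):

        import networkx as nx

        # Toy passing network: (passer, receiver, completed passes).
        passes = [("GK", "DF1", 12), ("DF1", "MF1", 18), ("MF1", "FW1", 9),
                  ("MF1", "MF2", 14), ("MF2", "FW1", 11), ("DF1", "MF2", 7)]
        G = nx.DiGraph()
        G.add_weighted_edges_from(passes)

        # Degree centrality ~ involvement; betweenness ~ how often a player
        # lies on the shortest linking paths (unweighted here for simplicity).
        print("degree:     ", nx.degree_centrality(G))
        print("betweenness:", nx.betweenness_centrality(G))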

  15. Validation in Principal Components Analysis Applied to EEG Data

    Directory of Open Access Journals (Sweden)

    João Carlos G. D. Costa

    2014-01-01

    Full Text Available The well-known multivariate technique Principal Components Analysis (PCA) is usually applied to a sample, and so component scores are subject to sampling variability. However, few studies address their stability, an important topic when the sample size is small. This work presents three validation procedures applied to PCA, based on confidence regions generated by a variant of a nonparametric bootstrap called the partial bootstrap: (i) the assessment of PC score variability by the spread and overlapping of “confidence regions” plotted around these scores; (ii) the use of the confidence region centroids as a validation set; and (iii) the definition of the number of nontrivial axes to be retained for analysis. The methods were applied to EEG data collected during a postural control protocol with twenty-four volunteers. Two axes were retained for analysis, with 91.6% of explained variance. Results showed that the area of the confidence regions provided useful insights on the variability of scores and suggested that some subjects were not distinguishable from others, which was not evident from the principal planes. In addition, potential outliers, initially suggested by an analysis of the first principal plane, could not be confirmed by the confidence regions.
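
    A NumPy sketch of the partial-bootstrap idea: re-estimate each subject's point from resampled raw trials and project it onto the fixed reference axes, so the cloud of projections plays the role of a confidence region. The trial structure and dimensions below are invented stand-ins, not the study's EEG features:

        import numpy as np

        rng = np.random.default_rng(42)
        n_subj, n_trials, n_feat = 24, 50, 10
        # Invented data: each subject contributes many noisy trials.
        centers = rng.normal(size=(n_subj, n_feat))
        trials = centers[:, None, :] + 0.5 * rng.normal(size=(n_subj, n_trials, n_feat))

        # Reference PCA on the subject means (SVD of the centered matrix).
        means = trials.mean(axis=1)
        mu = means.mean(axis=0)
        axes = np.linalg.svd(means - mu, full_matrices=False)[2][:2].T

        # Partial bootstrap: resample trials, recompute each subject's mean,
        # and project it onto the FIXED reference axes.
        clouds = np.empty((500, n_subj, 2))
        for b in range(500):
            idx = rng.integers(0, n_trials, n_trials)   # same resample for all subjects, for brevity
            clouds[b] = (trials[:, idx, :].mean(axis=1) - mu) @ axes

        # Per-subject spread on PC1/PC2 ~ the size of each confidence region.
        print(clouds.std(axis=0))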

  16. Nuclear and conventional techniques applied to the analysis of prehispanic metals of the Templo Mayor of Tenochtitlan; Tecnicas nucleares y convencionales aplicadas al analisis de metales prehispanicos del Templo Mayor de Tenochtitlan

    Energy Technology Data Exchange (ETDEWEB)

    Mendez M, U

    2003-07-01

    The use of experimental techniques such as PIXE, RBS, metallography and SEM, applied to the characterization of prehispanic copper and gold objects from 9 offerings of the Templo Mayor of Tenochtitlan, makes it possible to obtain results and well-founded information on aspects such as technological development, cultural and commercial exchange and a relative chronology, as well as on conservation, authenticity, symbolic association and the social meaning of the offerings. More precisely, the objectives outlined for this study are the following. To interpret manufacturing techniques, stylistic designs and cultural and commercial exchanges on the basis of aspects such as microstructure, elemental composition, type of alloys, the presence of welding, surface gilding, and state of conservation. To determine the technological advance implied by the processing of the metallic materials and to establish their location in the archaeological context, as a means of interpreting the social significance of the offering. To explore the possible symbolic-religious association of the metallic objects offered to the deities, starting from significant characteristics such as color, form and function. To establish whether the objects found in the offerings are of the same temporality as the offerings themselves or, at least, to place the objects within the two stages of the development of metallurgy, known as the period of native copper and the period of alloys; this helps determine a relative chronology of when the objects were manufactured. To confirm the authenticity of the objects. To determine, precisely, the state of conservation of the pieces. To corroborate some of the manufacturing processes; this is achieved by reproducing objects in the laboratory, to establish comparisons and differences among pre

  17. Imaging and pattern recognition techniques applied to particulate solids material characterization in mineral processing

    International Nuclear Information System (INIS)

    Bonifazi, G.; La Marca, F.; Massacci, P.

    1999-01-01

    The characterization of particulate solids can be carried out by chemical and mineralogical analysis or, in some cases, following a new approach based on the combined use of: i) imaging techniques to detect the surface features of the particles, and ii) pattern recognition procedures to identify and classify the mineralogical composition on the basis of the previously detected 'pictorial' features. The aim of this methodology is to establish a correlation between image parameters (texture and color) and the physical-chemical parameters characterizing the set of particles to be evaluated. The technique was applied to characterize the raw ore coming from a deposit of mineral sands comprising three different lithotypes. An appropriate number of samples of each lithotype was collected. A vector of attributes (pattern vector), made up of texture and color parameters, was associated with each sample. Image analysis demonstrated that the selected parameters are quite sensitive to the conditions of image acquisition: optical properties may be strongly influenced by physical condition (in terms of moisture content) and by optics set-up and lighting conditions. Standard conditions for acquisition were selected according to the in situ conditions during sampling. To verify the reliability of the proposed methodology, images were acquired under different conditions of humidity, focusing and illumination. In order to evaluate the influence of these parameters on the pictorial properties of the images, textural analysis procedures were applied to the images acquired from the different samples. Data resulting from the processing have been used for remote control of the material fed to the mineral processing plant. (author)

  18. New technique of in-situ soil-moisture sampling for environmental isotope analysis applied at Pilat sand dune near Bordeaux. HETP modelling of bomb tritium propagation in the unsaturated and saturated zones

    International Nuclear Information System (INIS)

    Thoma, G.; Esser, N.; Sonntag, C.; Weiss, W.; Rudolph, J.; Leveque, P.

    1979-01-01

    A new soil-air suction method with soil-water vapour adsorption on a 4A molecular sieve provides soil-moisture samples from various depths for environmental isotope analysis and yields soil temperature profiles. A field tritium tracer experiment shows that this in-situ sampling method has an isotope profile resolution of only about 5-10 cm. Application of this method to the Pilat sand dune (Bordeaux, France) yielded deuterium and tritium profiles down to 25 m depth. Bomb tritium measurements of monthly lysimeter percolate samples available since 1961 show that the tritium response has a mean delay of five months in the case of a sand lysimeter and of 2.5 years for a loess loam lysimeter. A simple HETP model simulates the layered downward movement of soil water and the longitudinal dispersion in the lysimeters. With field capacity and evapotranspiration taken as open parameters, the model yields tritium concentration values for the lysimeter percolate which agree well with the experimental results. Based on local meteorological data, the HETP model applied to tritium tracer experiments in the unsaturated zone additionally yields an individual prediction of the momentary tracer position and of the soil-moisture distribution. This prediction can be checked experimentally at selected intervals by coring. (author)

  19. Applied research on air pollution using nuclear-related analytical techniques

    International Nuclear Information System (INIS)

    1994-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which will run from 1992 to 1996, and will build upon the experience gained by the Agency from the laboratory support that it has been providing for several years to BAPMoN - the Background Air Pollution Monitoring Network programme organized under the auspices of the World Meteorological Organization. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in suspended particulate matter (including air filter samples), rainwater and fog-water samples, and in biological indicators of air pollution (e.g. lichens and mosses). The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for practically-oriented research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries, with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural areas). This document reports the discussions held during the first Research Co-ordination Meeting (RCM) for the CRP, which took place at IAEA Headquarters in Vienna. Refs, figs and tabs

  20. Colilert® applied to food analysis

    Directory of Open Access Journals (Sweden)

    Maria José Rodrigues

    2014-06-01

    Full Text Available Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used in the routine quality control of drinking, swimming-pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-Methylumbelliferyl-β-D-Glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp. If the test fluorescence is equal to or greater than that of the control, the presence of E. coli has been confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, comparing the results against the reference method (ISO 16649-2, 2001) for E. coli food analysis. The study was divided into two stages. During the first stage, ten different types of food were analyzed with Colilert®; these included pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. Of these, the following were approved: pastry with custard; raw minced pork; "caldo-verde" soup; raw vegetable salad (lettuce and carrots); and solid yogurt. The approved foods fitted better into the tray, the colour of the wells was lighter and the UV reading was easier. In the second stage the foods were artificially contaminated with 2 log/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method, with counts similar to the ones obtained with the reference method. In the present study, the Colilert® method revealed neither false-positive nor false-negative results; however, the results were sometimes difficult to read due to the presence of green fluorescence in some wells. Overall, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.

  1. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U. S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  2. MULTIVARIATE TECHNIQUES APPLIED TO EVALUATION OF LIGNOCELLULOSIC RESIDUES FOR BIOENERGY PRODUCTION

    Directory of Open Access Journals (Sweden)

    Thiago de Paula Protásio

    2013-12-01

    Full Text Available http://dx.doi.org/10.5902/1980509812361 The evaluation of lignocellulosic wastes for bioenergy production demands the consideration of several characteristics and properties that may be correlated. This fact demands the use of various multivariate analysis techniques that allow the evaluation of relevant energetic factors. This work aimed to apply cluster analysis and principal components analysis for the selection and evaluation of lignocellulosic wastes for bioenergy production. Eight types of residual biomass were used, for which the elemental component (C, H, O, N, S) contents, lignin, total extractives and ash contents, basic density and higher and lower heating values were determined. Both multivariate techniques applied for the evaluation and selection of lignocellulosic wastes were efficient, and similarities were observed between the biomass groups formed by them. Through the interpretation of the first principal component obtained, it was possible to create a global development index for the evaluation of the viability of energetic uses of biomass. The interpretation of the second principal component revealed a contrast between the nitrogen and sulfur contents and the oxygen content.
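
    The cluster-analysis step can be sketched with SciPy's hierarchical clustering on standardized fuel properties (the biomass names and property values below are placeholders, not the paper's data):

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        names = ["rice husk", "coffee husk", "bamboo", "sawdust",
                 "bark", "maize stover", "sugarcane bagasse", "pruning waste"]
        # Columns: C (%), lignin (%), ash (%), HHV (MJ/kg) -- illustrative values.
        props = np.array([[38, 20, 18, 15.5], [45, 24, 4, 18.9], [47, 26, 2, 19.0],
                          [49, 28, 1, 19.8], [50, 32, 3, 20.1], [44, 18, 6, 17.8],
                          [46, 22, 3, 18.5], [47, 25, 4, 19.2]])
        z = (props - props.mean(axis=0)) / props.std(axis=0)   # standardize

        link = linkage(z, method="ward")                       # Ward hierarchical clustering
        print(dict(zip(names, fcluster(link, t=3, criterion="maxclust"))))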

  3. Functional analysis in modern applied mathematics

    CERN Document Server

    Curtain, Ruth F

    1977-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  4. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  5. Functional Analysis in Applied Mathematics and Engineering

    DEFF Research Database (Denmark)

    Pedersen, Michael

    1997-01-01

    Lecture notes for the course 01245 Functional Analysis. Consists of the first part of a monograph with the same title.

  6. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also referenced. (Author)

  7. Applied modal analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Pedersen, H.B.; Kristensen, O.J.D.

    2003-01-01

    are investigated and the most suitable are chosen. Different excitation techniques are tried during experimental campaigns. After a discussion, the pendulum hammer was chosen, and a new improved hammer was manufactured. Some measurement errors are investigated. The ability to repeat the measured results...

  8. Applying BI Techniques To Improve Decision Making And Provide Knowledge Based Management

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2015-07-01

    Full Text Available The paper focuses on BI techniques, and especially on data mining algorithms, that can support and improve the decision-making process, with applications in the financial sector. We consider data mining techniques to be more efficient, and thus we applied several supervised and unsupervised learning algorithms. The case study in which these algorithms have been implemented concerns the activity of a banking institution, with a focus on the management of lending activities.

  9. Development of communications analysis techniques

    Science.gov (United States)

    Shelton, R. D.

    1972-01-01

    Major results from the frequency analysis of system program (FASP) are reported. The FASP procedure was designed to analyze or design linear dynamic systems, but can be used to solve any problem that can be described by a system of linear time-invariant differential equations. The program also shows plots of performance changes as design parameters are adjusted. Experimental results on narrowband FM distortion are also reported.

  10. Introduction: Conversation Analysis in Applied Linguistics

    Science.gov (United States)

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  11. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  12. On the relation between applied behavior analysis and positive behavioral support.

    Science.gov (United States)

    Carr, James E; Sidener, Tina M

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is composed almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis.

  13. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  14. Tissue Microarray Analysis Applied to Bone Diagenesis

    OpenAIRE

    Barrios Mello, Rafael; Regis Silva, Maria Regina; Seixas Alves, Maria Teresa; Evison, Martin; Guimarães, Marco Aurélio; Francisco, Rafaella Arrabaça; Dias Astolphi, Rafael; Miazato Iwamura, Edna Sadayo

    2017-01-01

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens....

  15. Operation feedback analysis applied to preventive maintenance

    International Nuclear Information System (INIS)

    Bouchet, J.L.; Havart, J.; Jacquot, J.P.; Lannoy, A.; Vasseur, D.

    1992-03-01

    The paper presents the contribution of operation feedback databases and their analysis to the optimization of a preventive maintenance program. The RCM approach (Reliability Centered Maintenance) is based on a functional breakdown of systems and components. There are four major stages: analysis of equipment criticality, critical failure analysis for each critical component, selection of maintenance tasks, and operation feedback analysis. In-depth knowledge of the failure and degradation mechanisms is only available through operation feedback, its analysis and expert judgement. The validation of the information collected in the various feedback files, and its statistical processing coupled with reliability calculations, enables (i) the identification of the causes of the failures and degradations observed, and (ii) the estimation of the associated failure rate and the calculation of the evolution of this failure rate as the component ages. The paper presents the approach used and the results obtained for the pilot study of the chemical and volume control system (CVCS) of 900 MW PWR nuclear power plants. About 2,500 records, covering many equipment types (pumps, sensors, valves, ...), were reviewed by experts, and the statistical processing provided basic reliability parameters for RCM (rates, modes, causes, subcomponents, ...). (authors). 5 tabs., 6 figs., 7 refs
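
    A minimal sketch of the reliability calculation such feedback data feeds into: a constant failure rate estimated from pooled operating experience, with a chi-square upper confidence bound. The failure count and operating hours below are invented, not taken from the CVCS study.

        # Point estimate and 95% upper bound for a constant failure rate,
        # assuming a time-terminated observation window; data are invented.
        from scipy.stats import chi2

        failures = 14          # failures recorded for one component family
        hours = 3.6e6          # cumulative operating hours across the fleet

        lam = failures / hours                                   # per-hour estimate
        upper = chi2.ppf(0.95, 2 * failures + 2) / (2 * hours)   # 95% upper bound

        print(f"lambda = {lam:.2e}/h, 95% upper bound = {upper:.2e}/h")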

  16. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    The performance of some of these techniques on real data obtained will be discussed. Finally, some results on ... S V Dhurandhar, Inter-University Centre for Astronomy and Astrophysics, Post Bag 4, Ganeshkhind, Pune 411 007, India ...

  17. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  18. Inverse Filtering Techniques in Speech Analysis | Nwachuku ...

    African Journals Online (AJOL)

    Speech analysis techniques to which the term 'inverse filtering' has been applied are reviewed. The unifying features of these techniques are presented, namely: 1. a basis in the source-filter theory of speech production, 2. the use of a network whose transfer function is the inverse of the transfer function of ...

  19. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    Science.gov (United States)

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference in determining the…
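
    The multi-component form of the Beer-Lambert law underlying such an analysis, A(lambda) = l * sum_i eps_i(lambda) * c_i, becomes a linear system once absorbance is measured at as many wavelengths as there are components. A minimal sketch with invented molar absorptivities and absorbances:

        # Two-component mixture via the Beer-Lambert law, A = l * (eps @ c);
        # absorptivities and absorbances are invented for illustration.
        import numpy as np

        l = 1.0                                    # path length, cm
        eps = np.array([[6600.0,  300.0],          # eps at wavelength 1
                        [1200.0, 1500.0]])         # eps at wavelength 2 (L/mol/cm)
        A = np.array([0.75, 0.30])                 # measured absorbances

        c = np.linalg.solve(l * eps, A)            # concentrations, mol/L
        print(f"c1 = {c[0]:.2e} M, c2 = {c[1]:.2e} M")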

  20. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  1. Current status of neutron activation analysis and applied nuclear chemistry

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1990-01-01

    A review of recent scientometric studies of citations and publication data shows the present state of NAA and applied nuclear chemistry as compared to other analytical techniques. (author) 9 refs.; 7 tabs

  2. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  3. Applied bioinformatics: Genome annotation and transcriptome analysis

    DEFF Research Database (Denmark)

    Gupta, Vikas

    and dhurrin, which have not previously been characterized in blueberries. There are more than 44,500 spider species with distinct habitats and unique characteristics. Spiders are masters of producing silk webs to catch prey and of using venom to neutralize it. The exploration of the genetics behind these properties...... japonicus (Lotus), Vaccinium corymbosum (blueberry), Stegodyphus mimosarum (spider) and Trifolium occidentale (clover). From a bioinformatics data analysis perspective, my work can be divided into three parts: genome annotation, small RNA, and gene expression analysis. Lotus is a legume of significant...... has just started. We have assembled and annotated the first two spider genomes to facilitate our understanding of spiders at the molecular level. The need for analyzing the large and increasing amount of sequencing data has increased the demand for efficient, user friendly, and broadly applicable...

  4. Applied bioinformatics: Genome annotation and transcriptome analysis

    DEFF Research Database (Denmark)

    Gupta, Vikas

    japonicus (Lotus), Vaccinium corymbosum (blueberry), Stegodyphus mimosarum (spider) and Trifolium occidentale (clover). From a bioinformatics data analysis perspective, my work can be divided into three parts: genome annotation, small RNA, and gene expression analysis. Lotus is a legume of significant...... biology and genetics studies. We present an improved Lotus genome assembly and annotation, a catalog of natural variation based on re-sequencing of 29 accessions, and describe the involvement of small RNAs in the plant-bacteria symbiosis. Blueberries contain anthocyanins, other pigments and various...... polyphenolic compounds, which have been linked to protection against diabetes, cardiovascular disease and age-related cognitive decline. We present the first genome-guided approach in blueberry to identify genes involved in the synthesis of health-protective compounds. Using RNA-Seq data from five stages...

  5. Principal component analysis applied to remote sensing

    Directory of Open Access Journals (Sweden)

    Javier Estornell

    2013-06-01

    The main objective of this article is to show an application of principal component analysis (PCA) which is used in two science degrees. In particular, PCA was used to obtain land cover information from satellite images. Three Landsat images were selected from two areas located in the municipalities of Gandia and Vallat, both in the province of Valencia (Spain). For the first study area, a single Landsat image from 2005 was used. For the second study area, two Landsat images taken in 1994 and 2000 were used to analyse the most significant changes in land cover. According to the results, the second principal component of the Gandia area image allowed the detection of vegetation, while the same component in the Vallat area revealed a forestry area affected by a forest fire. This study thus confirms the feasibility of using PCA in remote sensing to extract land use information.
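
    A minimal sketch of the PCA computation used in such a study: each row is a pixel, each column a spectral band, and the principal components are the eigenvectors of the band covariance matrix. The random data below stand in for real Landsat band values.

        # PCA of multispectral pixels via the band covariance matrix;
        # random data stand in for real Landsat bands.
        import numpy as np

        rng = np.random.default_rng(0)
        pixels = rng.normal(size=(10000, 6))       # placeholder for 6 bands

        X = pixels - pixels.mean(axis=0)           # center each band
        vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
        order = np.argsort(vals)[::-1]             # sort descending
        scores = X @ vecs[:, order]                # principal-component "images"

        explained = vals[order] / vals.sum()
        print("variance explained:", np.round(explained, 3))
        # scores[:, 1] reshaped to the image grid would correspond to PC2,
        # the component that highlighted vegetation in the Gandia scene.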

  6. Applying Communication Theories toward Designing Compliance-Gaining Techniques in Customer Dissatisfaction

    Directory of Open Access Journals (Sweden)

    Jonathan Matusitz

    2011-01-01

    The purpose of this paper is to apply three communication theories (namely, Argumentation Theory, the Foot-in-the-Door Technique, and the Door-in-the-Face Technique) to the formulation of complaints that communicate effectively to company employees and yield compensation for the consumer. The authors demonstrate that complaining is not a haphazard procedure if communication theories are applied properly. The importance of self-efficacy as a psychological component is also emphasized, to illustrate that complainers need sufficient and true self-confidence in order to put each of these theories into practice.

  7. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

    This report presents the results of research and development as follows: improvement of neutron irradiation facilities and counting systems, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health, and its standardization. For identification and standardization of the analytical method, environmental, biological and polymer samples are analyzed and the measurement uncertainty is estimated. Data intercomparisons and proficiency tests were also performed. Using airborne particulate matter as an environmental indicator, trace element concentrations in samples collected at urban and rural sites are determined; statistical calculations and factor analysis are then carried out to investigate emission sources. An international cooperative research project was carried out on the utilization of nuclear techniques.

  8. Database 'catalogue of techniques applied to materials and products of nuclear engineering'

    International Nuclear Information System (INIS)

    Lebedeva, E.E.; Golovanov, V.N.; Podkopayeva, I.A.; Temnoyeva, T.A.

    2002-01-01

    The database 'Catalogue of techniques applied to materials and products of nuclear engineering' (IS MERI) was developed to provide informational support for SSC RF RIAR and other enterprises in scientific investigations. This database contains information on the techniques used at RF Minatom enterprises for investigating reactor material properties. The main purpose of this system is to assess the current status of the reactor materials science experimental base, for further planning of experimental activities and improvement of methodological support. (author)

  9. In situ texture analysis under applied load

    International Nuclear Information System (INIS)

    Brokmeier, H.G.

    2005-01-01

    The in-situ measurement of a crystallographic texture is a special type of non-destructive measurement that requires special equipment. Thanks to its high photon flux and excellent brilliance, high-energy synchrotron radiation is an outstanding tool, particularly for fast experimentation. Moreover, its high penetration power allows the investigation of standard DIN-norm tensile samples. A loading device with a capacity of up to 20 kN was installed at the hard wiggler beamline BW5 (HASYLAB-DESY) to perform in-situ strain and in-situ texture analysis. Using 100 keV X-rays, the wavelength is short enough that a 2D image-plate detector covers a wide range of diffraction patterns within the first 10 degrees in 2-theta. Thermal neutrons are another type of radiation with high penetration power, and they are the standard method for global texture analysis of bulk samples. As an example, rectangular extruded Mg-AZ31 was investigated in an in-situ tensile experiment. Samples cut at 0, 45 and 90 degrees to the extrusion direction were studied. The in-situ strain studies show the lattice-dependent strains perpendicular and parallel to the loading direction. Moreover, in hexagonal Mg-AZ31 a strong influence of the initial texture on the tensile behaviour can be explained by combining texture simulation with in-situ measurements. (author)

  10. Nuclear techniques (PIXE and RBS) applied to analysis of pre hispanic metals of the Templo Mayor at Tenochtitlan; Tecnicas nucleares (PIXE y RBS) aplicadas al analisis de metales prehispanicos del Templo Mayor de Tenochtitlan

    Energy Technology Data Exchange (ETDEWEB)

    Mendez U, I.; Tenorio, D.; Galvan, J.L. [Instituto Nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2000-07-01

    This work aims to determine, by means of nuclear techniques (PIXE and RBS), the composition and alloy type of various Aztec ornaments from the Postclassic period, manufactured mainly of copper and gold, such as bells, beads and disks, all belonging to 9 offerings of the Templo Mayor of Tenochtitlan. The historical and archaeological background of the objects is briefly presented, along with the analytical methods, concluding with the results obtained. (Author)

  11. PIXE analysis applied to characterized water samples

    International Nuclear Information System (INIS)

    Santos, Maristela S.; Carneiro, Luana Gomes; Medeiros, Geiza; Sampaio, Camilla; Martorell, Ana Beatriz Targino; Gouvea, Stella; Cunha, Kenya Moore Dias da

    2011-01-01

    Araxa, in Brazil, is a naturally high background radiation area located in the State of Minas Gerais, with a population of about 93,672 people. Araxa is a historical city famous for its mineral water sources and the mud from the Termas de Araxa spa, which have been used for therapeutic and recreational purposes. Other important aspects of the city's economy are the mining and metallurgical industries. In the Araxa area is located the largest deposit of pyrochlore, a niobium mineral, as well as a deposit of apatite, a phosphate mineral, both containing Th and U associated with the crystal lattice. The minerals are obtained from open-pit mines and processed in industrial plants also located in the city of Araxa; these plants process the pyrochlore and apatite to obtain Fe-Nb alloy and phosphate concentrate, respectively. Studies were developed in this area to assess the occupational risk to workers due to exposure to dust particles during routine work; however, very few studies evaluated water contamination outside the mines in order to determine the concentrations of metals (stable elements) and of radionuclides in water. This paper presents preliminary results of a study to identify and determine the concentrations of metals (stable elements) and radionuclides in rivers around the city. The water from these rivers is used for drinking and irrigation. The water samples were collected from different rivers around the city of Araxa and analyzed using the PIXE technique. A proton beam of 2 MeV obtained from the Van de Graaff electrostatic accelerator was used to induce the characteristic X-rays. S, K, Ca, Cr, Mn, Fe, Ni, Zn, Ba, Pb and U were identified in the mass spectra of the samples. The elemental mass concentrations were compared using a non-parametric statistical test. The results of the statistical test showed that the elemental mass concentrations did not present the same distribution. These results indicated

  12. Toward applied behavior analysis of life aloft

    Science.gov (United States)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  13. Strategies and techniques of communication and public relations applied to non-profit sector

    Directory of Open Access Journals (Sweden)

    Ioana – Julieta Josan

    2010-05-01

    The aim of this paper is to summarize the strategies and techniques of communication and public relations applied to the non-profit sector. The approach of the paper is to identify the most appropriate strategies and techniques that the non-profit sector can use to accomplish its objectives, to highlight specific differences between the strategies and techniques of the profit and non-profit sectors, and to identify potential communication and public relations actions that can increase visibility among the target audience, create brand awareness, and turn the target audience's perception of the non-profit sector into positive brand sentiment.

  14. Digital photoelastic analysis applied to implant dentistry

    Science.gov (United States)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of connected implants (the All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model-making procedure is proposed, to closely mimic all the anatomical features of the human mandible. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest could be analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, the five-step method is used, and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could provide important information for improving the structural stability of implant systems. Analysis is carried out for the implant in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.
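
    The isochromatic data discussed above relate fringe order N to stress through the stress-optic law, sigma1 - sigma2 = N * f_sigma / h, with the maximum in-plane shear stress equal to half that difference. A short sketch with an assumed material fringe value and model thickness:

        # Stress-optic law: sigma1 - sigma2 = N * f_sigma / h; values assumed.
        f_sigma = 11.0   # material fringe value, N/mm per fringe (assumed)
        h = 6.0          # model thickness, mm (assumed)

        for N in (0.5, 1.0, 2.0, 3.0):             # fringe orders read from the image
            diff = N * f_sigma / h                 # principal stress difference, MPa
            tau_max = diff / 2.0                   # maximum in-plane shear stress
            print(f"N = {N:.1f}: sigma1 - sigma2 = {diff:.2f} MPa, tau_max = {tau_max:.2f} MPa")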

  15. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their applicability to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends developing a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
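
    A minimal sketch of the Kruskal-Wallis screening step mentioned above: model outputs from sampled runs are grouped by ranges of one input parameter, and the test asks whether the output distributions differ across groups. The toy model and data are invented.

        # Kruskal-Wallis screening of one input parameter; data are invented.
        import numpy as np
        from scipy.stats import kruskal

        rng = np.random.default_rng(1)
        param = rng.uniform(0.0, 1.0, 300)         # sampled input parameter
        dose = np.exp(2.0 * param) + rng.normal(0.0, 0.3, 300)  # toy model output

        groups = [dose[(param >= lo) & (param < lo + 0.25)]
                  for lo in (0.0, 0.25, 0.5, 0.75)]
        H, p = kruskal(*groups)
        print(f"H = {H:.1f}, p = {p:.2e}")         # small p flags a sensitive parameter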

  16. The impact of applying product-modelling techniques in configurator projects

    DEFF Research Database (Denmark)

    Hvam, Lars; Kristjansdottir, Katrin; Shafiee, Sara

    2018-01-01

    This paper aims to increase understanding of the impact of using product-modelling techniques to structure and formalise knowledge in configurator projects. Companies that provide customised products increasingly apply configurators in support of sales and design activities, reaping benefits...... though extant literature has shown the importance of formal modelling techniques, the impact of utilising these techniques remains relatively unknown. Therefore, this article studies three main areas: (1) the impact of using modelling techniques based on Unified Modelling Language (UML), in which...... ability to reduce the number of product variants. This paper contributes to an increased understanding of what companies can gain from using more formalised modelling techniques in configurator projects, and under what circumstances they should be used....

  17. Element selective detection of molecular species applying chromatographic techniques and diode laser atomic absorption spectrometry.

    Science.gov (United States)

    Kunze, K; Zybin, A; Koch, J; Franzke, J; Miclea, M; Niemax, K

    2004-12-01

    Tunable diode laser atomic absorption spectroscopy (DLAAS) combined with separation techniques and atomization in plasmas and flames is presented as a powerful method for analysis of molecular species. The analytical figures of merit of the technique are demonstrated by the measurement of Cr(VI) and Mn compounds, as well as molecular species including halogen atoms, hydrogen, carbon and sulfur.

  18. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  19. Nonbinary quantification technique accounting for myocardial infarct heterogeneity: Feasibility of applying percent infarct mapping in patients.

    Science.gov (United States)

    Mastrodicasa, Domenico; Elgavish, Gabriel A; Schoepf, U Joseph; Suranyi, Pal; van Assen, Marly; Albrecht, Moritz H; De Cecco, Carlo N; van der Geest, Rob J; Hardy, Rayphael; Mantini, Cesare; Griffith, L Parkwood; Ruzsics, Balazs; Varga-Szemes, Akos

    2018-02-15

    Binary threshold-based quantification techniques ignore myocardial infarct (MI) heterogeneity, yielding substantial misquantification of MI. The purpose of this prospective cohort study was to assess the technical feasibility of MI quantification using percent infarct mapping (PIM), a prototype nonbinary algorithm, in patients (n = 171) with suspected MI referred for cardiac MRI. Imaging comprised inversion-recovery balanced steady-state free-precession for late gadolinium enhancement (LGE) and modified Look-Locker inversion recovery (MOLLI) T1-mapping on a 1.5T system. Infarct volume (IV) and infarct fraction (IF) were quantified by two observers based on manual delineation, binary approaches (2-5 standard deviations [SD] and full-width at half-maximum [FWHM] thresholds) in LGE images, and by applying the PIM algorithm in T1 and LGE images (PIM-T1; PIM-LGE). IV and IF were analyzed using repeated-measures analysis of variance (ANOVA). Agreement between the approaches was determined with Bland-Altman analysis, and interobserver agreement was assessed by intraclass correlation coefficient (ICC) analysis. MI was observed in 89 (54.9%) patients and 185 (38%) short-axis slices. IF with the 2, 3, 4 and 5SD and FWHM techniques was 15.7 ± 6.6, 13.4 ± 5.6, 11.6 ± 5.0, 10.8 ± 5.2, and 10.0 ± 5.2%, respectively. The 5SD and FWHM techniques had the best agreement with manual IF determination (9.9 ± 4.8%; bias 1.0% and 0.2%; P = 0.1426 and P = 0.8094, respectively), while the 2SD and 3SD algorithms significantly overestimated manual IF (both P < 0.0001). PIM-LGE measured significantly lower IF (7.8 ± 3.7%) compared to manual values (P < 0.0001); PIM-LGE, however, showed the best agreement with the PIM-T1 reference (7.6 ± 3.6%, P = 0.3156). Interobserver agreement was rated good to excellent for IV (ICCs between 0.727-0.820) and fair to good for IF (0.589-0.736). The application of the PIM LGE technique for MI
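
    The contrast between binary thresholding and a nonbinary weighting can be sketched in a few lines. The linear ramp used for the PIM-style weighting below is an assumption for illustration, not the authors' algorithm, and all voxel intensities are synthetic.

        # Binary n-SD thresholds vs a nonbinary PIM-style weighting on
        # synthetic LGE intensities; the linear PIM ramp is an assumption.
        import numpy as np

        rng = np.random.default_rng(2)
        remote = rng.normal(100.0, 10.0, 500)      # healthy remote myocardium
        myo = np.concatenate([remote, rng.normal(220.0, 25.0, 60)])  # slice with MI

        mu, sd = remote.mean(), remote.std()
        for n in (2, 5):                           # binary thresholds
            print(f"{n}SD infarct fraction: {100 * np.mean(myo > mu + n * sd):.1f}%")

        # nonbinary: each voxel contributes a partial infarct fraction in [0, 1],
        # ramping from the 2SD level (0% infarct) to an assumed 8SD level (100%)
        pim = np.clip((myo - (mu + 2 * sd)) / (6 * sd), 0.0, 1.0)
        print(f"PIM-style infarct fraction: {100 * pim.mean():.1f}%")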

  20. Recommendations for learners are different: Applying memory-based recommender system techniques to lifelong learning

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2007). Recommendations for learners are different: applying memory-based recommender system techniques to lifelong learning. Paper presented at the SIRTEL workshop at the EC-TEL 2007 Conference. September, 17-20, 2007, Crete, Greece.

  1. English Language Teachers' Perceptions on Knowing and Applying Contemporary Language Teaching Techniques

    Science.gov (United States)

    Sucuoglu, Esen

    2017-01-01

    The aim of this study is to determine the perceptions of English language teachers teaching at a preparatory school in relation to their knowing and applying contemporary language teaching techniques in their lessons. An investigation was conducted of 21 English language teachers at a preparatory school in North Cyprus. The SPSS statistical…

  2. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this background, the reader will coast through the practical introduction and move on to signal analysis techniques commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical

  3. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  4. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    Science.gov (United States)

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assign accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to their in-house techniques, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. This work summarizes the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss trends in the analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required.

  5. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method, with the velocity components defined over an Eulerian mesh. A system of massless interface markers is defined, where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Several applications of nuclear engineering interest are reported, together with available results. The present technique is capable of predicting the interface profile near the wall, which is important in reactor subchannel analysis.
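
    A minimal sketch of the massless-marker update described above: each interface marker is advected with the local fluid velocity according to the simple kinematic relation dx/dt = u. The analytic velocity field below is a stand-in for values interpolated from the Eulerian mesh.

        # Explicit kinematic update of interface markers, dx/dt = u(x, y);
        # the velocity field is an analytic stand-in for mesh-interpolated values.
        import numpy as np

        def velocity(x, y):
            return np.cos(np.pi * y), np.sin(np.pi * x)

        markers = np.array([[0.2, 0.5], [0.3, 0.5], [0.4, 0.5]])  # interface points
        dt = 1e-3
        for _ in range(100):
            u, v = velocity(markers[:, 0], markers[:, 1])
            markers[:, 0] += dt * u
            markers[:, 1] += dt * v
        print(markers)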

  6. OBIC technique applied to wide bandgap semiconductors from 100 K up to 450 K

    Science.gov (United States)

    Hamad, H.; Planson, D.; Raynaud, C.; Bevilacqua, P.

    2017-05-01

    Wide bandgap semiconductors have recently become more widely used in the power electronics domain. They are predicted to replace traditional silicon, especially for high-voltage and/or high-frequency devices. Device design has made considerable progress in the last two decades. Substrates up to six inches in diameter have now been commercialized with very low defect densities. Such development is due to continuous studies. Among these, techniques that generate an excess of charge carriers in the space-charge region (like OBIC, optical beam induced current, and EBIC, electron beam induced current) are useful for analyzing the variation of the electric field as a function of the voltage and the beam position. This paper describes the OBIC technique applied to wide bandgap semiconductor-based devices. OBIC cartography gives an image of the electric field in the device, and the analysis of the OBIC signal helps one to determine some characteristics of the semiconductors, like minority carrier lifetime and ionization rates. These are key parameters for predicting device switching behavior and breakdown voltage.

  7. Learning mediastinoscopy: the need for education, experience and modern techniques--interdependency of the applied technique and surgeon's training level.

    Science.gov (United States)

    Walles, Thorsten; Friedel, Godehard; Stegherr, Tobias; Steger, Volker

    2013-04-01

    Mediastinoscopy represents the gold standard for invasive mediastinal staging. While learning and teaching the surgical technique are challenging due to the limited accessibility of the operative field, both benefited from the implementation of video-assisted techniques. However, it has not yet been established whether video-assisted mediastinoscopy in itself improves mediastinal staging. We performed a retrospective single-centre cohort analysis of 657 mediastinoscopies performed at a specialized tertiary care thoracic surgery unit from 1994 to 2006. The number of specimens obtained per procedure and per lymph node station (2, 4, 7, 8 for mediastinoscopy and 2-9 for open lymphadenectomy), the number of lymph node stations examined, and the sensitivity and negative predictive value were calculated, with a focus on the technique employed (video-assisted vs standard technique) and the surgeon's experience. Overall sensitivity was 60%, accuracy was 90% and the negative predictive value was 88%. With the conventional technique, experience alone improved sensitivity from 49 to 57%, most markedly in the right paratracheal region (from 62 to 82%). With the video-assisted technique, however, experienced surgeons raised sensitivity from 57 to 79%, in contrast to inexperienced surgeons, who lowered sensitivity from 49 to 33%. We found significant differences concerning (i) the total number of specimens taken, (ii) the number of lymph node stations examined, (iii) the number of specimens taken per lymph node station and (iv) true positive mediastinoscopies. The video-assisted technique can significantly improve the results of mediastinoscopy. A thorough education in the modern video-assisted technique is mandatory for thoracic surgeons until they can fully exhaust its potential.

  8. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guidebook for applied research on environmental monitoring using instrumental neutron activation analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter, analytical methodologies, data evaluation and interpretation, and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  9. Hands on applied finite element analysis application with ANSYS

    CERN Document Server

    Arslan, Mehmet Ali

    2015-01-01

    Hands on Applied Finite Element Analysis Application with Ansys is truly an extraordinary book that offers practical ways of tackling FEA problems in machine design and analysis. In this book, a good selection of 35 example problems is presented, offering students the opportunity to apply their knowledge to real engineering FEA problem solutions by guiding them with real-life, hands-on experience.

  10. Statistical Techniques Used in Three Applied Linguistics Journals: "Language Learning,""Applied Linguistics" and "TESOL Quarterly," 1980-1986: Implications for Readers and Researchers.

    Science.gov (United States)

    Teleni, Vicki; Baldauf, Richard B., Jr.

    A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…

  11. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The "energy method of analysis" for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.


  12. Influence of applied corneal endothelium image segmentation techniques on the clinical parameters.

    Science.gov (United States)

    Piorkowski, Adam; Nurzynska, Karolina; Gronkowska-Serafin, Jolanta; Selig, Bettina; Boldak, Cezary; Reska, Daniel

    2017-01-01

    The corneal endothelium state is verified on the basis of an in vivo specular microscope image, from which the shape and density of cells are exploited for data description. Due to the relatively low image quality resulting from the high magnification of living, non-stained tissue, both manual and automatic analysis of the data is a challenging task. Although many automatic or semi-automatic solutions have already been introduced, all of them are prone to inaccuracy. This work presents a comparison of four fully or semi-automated methods for endothelial cell segmentation, each of which represents a different approach to cell segmentation: fast robust stochastic watershed (FRSW), the KH method, an active contours solution (SNAKE), and TOPCON ImageNET. Moreover, an improvement framework is introduced which aims to unify precise cell border location in images pre-processed with differing techniques. Finally, the influence of the selected methods on clinical parameters is examined, both with and without application of the improvement framework. The experiments revealed that although the image segmentation approaches differ, the measures calculated for clinical parameters are in high accordance when CV (coefficient of variation) and CVSL (coefficient of variation of cell side length) are considered. Higher variation was noticed for the H (hexagonality) metric. Utilisation of the improvement framework assured better repeatability of precise endothelial cell border location between the methods while diminishing the dispersion of clinical parameter values calculated for such images. Finally, it was proven statistically that the image processing method applied for endothelial cell analysis does not influence the ability to differentiate between the images using medical parameters.
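
    Once cells are segmented, the clinical parameters compared above reduce to simple statistics over per-cell measurements. A sketch with invented cell areas and side counts:

        # Clinical morphometry from segmented cells; all values are invented.
        import numpy as np

        areas = np.array([410.0, 395.0, 430.0, 388.0, 450.0, 402.0])  # um^2 per cell
        sides = np.array([6, 6, 5, 6, 7, 6])       # polygon sides per cell

        cv = areas.std(ddof=1) / areas.mean()      # coefficient of variation of area
        hexagonality = 100.0 * np.mean(sides == 6) # percent hexagonal cells
        density = 1e6 / areas.mean()               # cells per mm^2 (1 mm^2 = 1e6 um^2)

        print(f"CV = {cv:.2f}, H = {hexagonality:.0f}%, density = {density:.0f} cells/mm^2")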

  13. A review of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation whereas the simplest approach requires varying parameter values one-at-a-time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
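
    The simplest technique in the review, varying parameter values one at a time, can be sketched in a few lines: perturb each input around a baseline and rank inputs by the normalized output change. The toy model below is a placeholder.

        # One-at-a-time sensitivity: perturb each input +10% around a baseline
        # and rank inputs by the normalized output change. Toy model only.
        import numpy as np

        def model(p):
            return p[0]**2 + 3.0 * p[1] + 0.1 * p[2]

        base = np.array([1.0, 2.0, 5.0])
        y0 = model(base)
        for i, name in enumerate(("a", "b", "c")):
            p = base.copy()
            p[i] *= 1.10                           # +10% one-at-a-time perturbation
            s = (model(p) - y0) / y0 / 0.10        # normalized sensitivity index
            print(f"{name}: S = {s:+.2f}")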

  14. Applying transactional analysis and personality assessment to improve patient counseling and communication skills.

    Science.gov (United States)

    Lawrence, Lesa

    2007-08-15

    To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients.

  15. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

    Directory of Open Access Journals (Sweden)

    L Potgieter

    2012-12-01

    A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of, and interaction between, normal and sterile E. saccharina moths in a temporally variable, but spatially homogeneous, environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified, and according to which E. saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1-sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E. saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
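
    A deliberately simplified difference-equation model in the spirit described above (not the authors' full E. saccharina model, which tracks multiple life stages and F1-sterility): the wild growth rate is discounted by the fraction of matings that remain fertile under a constant sterile release. All parameters are assumed.

        # Toy sterile-release model: wild growth discounted by the share of
        # fertile matings, with logistic damping. All parameters assumed.
        R = 1.8          # per-generation growth factor without control
        K = 1e5          # carrying capacity
        S = 4e4          # sterile males released each generation
        N = 1e4          # initial wild population

        for t in range(12):
            fertile_share = N / (N + S)            # chance a mating is fertile
            N = N * R * fertile_share * max(0.0, 1.0 - N / K)
            print(f"generation {t + 1}: N = {N:,.0f}")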

  16. Independent Component Analysis applied to Ground-based observations

    Science.gov (United States)

    Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas

    2018-01-01

    Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independent of the planet's temperature profile. However, measurements of hot-Jupiter transits must achieve a high level of accuracy in the flux to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to extract systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set, and it has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61" Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.
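
    A minimal sketch of ICA-based de-trending of ground-based photometry: target and comparison light curves are decomposed jointly, and components shared between stars are attributed to the terrestrial atmosphere. The signals are simulated, and scikit-learn's FastICA stands in for the pipelines used in the cited papers.

        # Joint ICA of a target and a comparison light curve; simulated data,
        # with sklearn's FastICA standing in for the published pipelines.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(3)
        t = np.linspace(0.0, 1.0, 500)
        transit = 1.0 - 0.01 * ((t > 0.4) & (t < 0.6))      # toy transit dip
        airmass = 0.02 * t**2                               # shared systematic
        target = transit - airmass + rng.normal(0, 1e-3, t.size)
        comp = 1.0 - airmass + rng.normal(0, 1e-3, t.size)  # reference star

        X = np.column_stack([target, comp])
        sources = FastICA(n_components=2, random_state=0).fit_transform(X)
        # the component correlated with the transit window is kept; the one
        # shared by both stars is attributed to the terrestrial atmosphere
        print(sources.shape)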

  17. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
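
    The calibration step can be sketched directly from the attenuation law I = I0 * exp(-mu_m * rho * x): measured values of ln(I0/I) on alloys of known gold content define a line from which an unknown is read off. All numbers below are invented.

        # Calibration line from alloys of known gold content; the ln(I0/I)
        # values and fractions are invented for illustration.
        import numpy as np

        au_frac = np.array([0.375, 0.585, 0.750, 0.916])   # known Au fractions
        atten = np.array([1.10, 1.62, 2.03, 2.45])         # measured ln(I0/I)

        slope, intercept = np.polyfit(au_frac, atten, 1)   # calibration line

        unknown = 1.85                                     # ln(I0/I) of unknown
        print(f"estimated gold fraction: {(unknown - intercept) / slope:.3f}")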

  18. Renormalization techniques applied to the study of density of states in disordered systems

    International Nuclear Information System (INIS)

    Ramirez Ibanez, J.

    1985-01-01

    A general scheme for real-space renormalization of formal scattering theory is presented and applied to the calculation of the density of states (DOS) in some finite-width systems. This technique is extended, in a self-consistent way, to the treatment of disordered and partially ordered chains. Numerical results for moments and DOS are presented in comparison with previous calculations. In addition, a self-consistent theory for the magnetic order problem in a Hubbard chain is derived, and a parametric transition is observed. Localization properties of the electronic states in disordered chains are studied through various decimation averaging techniques and using numerical simulations. (author)

  19. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  20. Fourier Spectroscopy: A Simple Analysis Technique

    Science.gov (United States)

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
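
    A minimal numerical version of that point-by-point integration: the spectrum B(nu) is recovered from interferogram samples I(x) by evaluating the cosine-transform integral as a discrete sum at each wavenumber. The single-line interferogram below is a toy example.

        # Point-by-point cosine transform of a toy interferogram.
        import numpy as np

        x = np.linspace(0.0, 1.0, 512)                 # optical path difference, cm
        dx = x[1] - x[0]
        nu0 = 100.0                                    # toy spectral line, cm^-1
        I = np.cos(2 * np.pi * nu0 * x)                # idealized interferogram

        nus = np.linspace(0.0, 200.0, 401)             # wavenumber axis
        B = [np.sum(I * np.cos(2 * np.pi * nu * x)) * dx for nu in nus]
        print(f"peak recovered at {nus[int(np.argmax(B))]:.1f} cm^-1")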

  1. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    OpenAIRE

    Amany AlShawi

    2016-01-01

    The popularity of cloud computing is increasing steadily. The purpose of this research was to enhance the security of the cloud using techniques such as data mining, with specific reference to the single cache system. The findings of the research indicate that security in the cloud can be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers...

  2. Best Available Technique (BAT) assessment applied to ACR-1000 waste and heavy water management systems

    International Nuclear Information System (INIS)

    Sachar, M.; Julien, S.; Hau, K.

    2010-01-01

    The ACR-1000 design is the next evolution of the proven CANDU reactor design. One of the key objectives of this project was to systematically apply the As Low As Reasonably Achievable (ALARA) principle to the reactor design. The ACR design team selected the Best Available Technique (BAT) assessment for this purpose, to document decisions made during the design of each of the ACR-1000 waste and heavy water management systems. This paper describes the steps in the BAT assessment that has been applied to the ACR-1000 design. (author)

  3. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  4. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    Energy Technology Data Exchange (ETDEWEB)

    Credille, Jennifer [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States); Owens, Elizabeth [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States)

    2017-10-11

    This capstone introduces Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements applied in an industrial atmosphere. However, this paper demonstrates that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System. This innovative concept was designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates system analysis, system reliability, system requirements, and system feasibility. This methodical Lean outline provides tools for a successful outcome, ensuring the process is thoroughly dissected, and it can be applied to any process in any work environment.

  5. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Owing to their reproducibility, precision, relatively low cost, simplicity, and the possibility of direct combination with other methods (both on-line and off-line), these techniques have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust biological responses are with respect to changes in biological parameters, and about which model inputs are the key factors affecting the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models, and the caveats in the interpretation of sensitivity analysis results.
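
    A minimal sketch of the local, finite-difference flavour of sensitivity analysis described above, using a hypothetical logistic-growth model as a stand-in for a biological system; the model, parameter values and step size are all assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical logistic-growth model standing in for a biological system.
def model_output(params, t_end=10.0):
    r, K = params
    sol = solve_ivp(lambda t, y: r * y * (1.0 - y / K),
                    (0.0, t_end), [0.1], t_eval=[t_end])
    return sol.y[0, -1]          # model output: state at the final time

# Local sensitivity by one-sided finite differences, normalized so the
# numbers read as (% change in output) per (% change in parameter).
def local_sensitivity(params, rel_step=1e-3):
    base = model_output(params)
    sens = []
    for i, p in enumerate(params):
        perturbed = list(params)
        perturbed[i] = p * (1.0 + rel_step)
        sens.append((model_output(perturbed) - base) / base / rel_step)
    return sens

print(local_sensitivity([0.8, 2.0]))   # sensitivities w.r.t. r and K
```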

  7. Just-in-Time techniques as applied to hazardous materials management

    OpenAIRE

    Spicer, John S.

    1996-01-01

    Approved for public release; distribution is unlimited This study investigates the feasibility of integrating JIT techniques in the context of hazardous materials management. This study provides a description of JIT, a description of environmental compliance issues and the outgrowth of related HAZMAT policies, and a broad perspective on strategies for applying JIT to HAZMAT management. http://archive.org/details/justintimetechn00spic Lieutenant Commander, United States Navy

  8. Applying data-mining techniques in honeypot analysis

    CSIR Research Space (South Africa)

    Veerasamy, N

    2006-07-01

    Full Text Available ... information on the bad guys. The information can then be used to protect against threats. Bruce Schneier describes good security as consisting of prevention, detection, and reaction. Production honeypots providing good security... activity and not signatures. However, studying data collected from honeypots can also assist in determining attack signatures that can be implemented in an intrusion detection system. A honeypot is useful in large system environments...

  9. The digital geometric phase technique applied to the deformation evaluation of MEMS devices

    International Nuclear Information System (INIS)

    Liu, Z W; Xie, H M; Gu, C Z; Meng, Y G

    2009-01-01

    Quantitative evaluation of the structure deformation of microfabricated electromechanical systems is of importance for the design and functional control of microsystems. In this investigation, a novel digital geometric phase technique was developed to meet the deformation evaluation requirement of microelectromechanical systems (MEMS). The technique is performed on the basis of regular artificial lattices, instead of a natural atom lattice. The regular artificial lattices, with a pitch ranging from micrometer to nanometer, are directly fabricated on the measured surface of MEMS devices by using a focused ion beam (FIB). Phase information can be obtained from the Bragg-filtered images after fast Fourier transform (FFT) and inverse fast Fourier transform (IFFT) of the scanning electron microscope (SEM) images. Then the in-plane displacement field and the local strain field related to the phase information can be evaluated. The obtained results show that the technique is well suited to deformation measurement with nanometer sensitivity and to stiction force estimation of a MEMS device
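
    The FFT/Bragg-filter/IFFT phase-extraction step described above can be sketched as follows. The synthetic lattice image, pitch and filter radius are assumptions, and a real geometric phase analysis adds reference subtraction and phase unwrapping.

```python
import numpy as np

# Synthetic regular lattice standing in for an FIB-milled grating image.
N, pitch = 256, 8                         # image size and pitch (pixels)
y, x = np.mgrid[0:N, 0:N]
img = np.cos(2 * np.pi * x / pitch)       # ideal, undeformed lattice

# FFT, then keep only a window around the +g Bragg peak.
F = np.fft.fftshift(np.fft.fft2(img))
cy, cx = N // 2, N // 2 + N // pitch      # location of the +g reflection
r = N // (2 * pitch)                      # Bragg filter radius
mask = np.zeros_like(F)
mask[cy - r:cy + r, cx - r:cx + r] = 1.0

# Inverse transform and remove the carrier exp(2*pi*i*x/pitch); the
# residual phase maps to in-plane displacement u = -phase/(2*pi*g).
filtered = np.fft.ifft2(np.fft.ifftshift(F * mask))
phase = np.angle(filtered * np.exp(-2j * np.pi * x / pitch))
print("max |phase| (rad), ~0 for an undeformed lattice:", np.abs(phase).max())
```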

  10. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  11. Dimensional Analysis with space discrimination applied to Fickian diffusion phenomena

    International Nuclear Information System (INIS)

    Diaz Sanchidrian, C.; Castans, M.

    1989-01-01

    Dimensional Analysis with space discrimination is applied to Fickian diffusion phenomena in order to transform its partial differential equations into ordinary ones, and also to obtain Fick's second law in a dimensionless form. (Author)

  12. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and now predictions from this type of analysis are important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis then affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
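
    As a minimal sketch of the hardware-oriented evaluation such analyses automate, the following code computes a top-event probability bottom-up from independent basic events through AND/OR gates; the tree and probabilities are illustrative, not from the paper.

```python
# Bottom-up fault tree evaluation with independent basic events.
def evaluate(node, p_basic):
    kind = node[0]
    if kind == "basic":
        return p_basic[node[1]]
    probs = [evaluate(child, p_basic) for child in node[1]]
    if kind == "AND":                    # all inputs must fail
        out = 1.0
        for p in probs:
            out *= p
        return out
    if kind == "OR":                     # any single failure suffices
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(f"unknown gate {kind}")

# Top event: pump fails AND (valve A fails OR valve B fails)
tree = ("AND", [("basic", "pump"),
                ("OR", [("basic", "valveA"), ("basic", "valveB")])])
print(evaluate(tree, {"pump": 1e-3, "valveA": 5e-2, "valveB": 5e-2}))
```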

  13. Técnicas moleculares aplicadas à microbiologia de alimentos = Molecular techniques applied to food microbiology

    Directory of Open Access Journals (Sweden)

    Eliezer Ávila Gandra

    2008-01-01

    Full Text Available Beginning in the 1980s, molecular techniques began to be used as an alternative to the phenotypic methods traditionally employed in food microbiology, a substitution that was accelerated by the discovery of the polymerase chain reaction (PCR). This article reviews the main molecular techniques used as tools in food microbiology, from those developed initially, such as plasmid profile analysis, to contemporary techniques such as real-time PCR, discussing the characteristics, advantages and disadvantages of these techniques and evaluating their potential to overcome the limitations of traditional techniques.

  14. Applying importance-performance analysis to patient safety culture.

    Science.gov (United States)

    Lee, Yii-Ching; Wu, Hsin-Hung; Hsieh, Wan-Lin; Weng, Shao-Jen; Hsieh, Liang-Po; Huang, Chih-Hsuan

    2015-01-01

    Sexton et al.'s (2006) safety attitudes questionnaire (SAQ) has been widely used to assess staff attitudes towards patient safety in healthcare organizations. However, to date there have been few studies that discuss perceptions of patient safety from both hospital staff and upper management. The purpose of this paper is to improve and develop better strategies regarding patient safety in healthcare organizations. The Chinese version of the SAQ, based on the Taiwan Joint Commission on Hospital Accreditation, is used to evaluate the perceptions of hospital staff. The current study applies the importance-performance analysis technique to identify the major strengths and weaknesses of the safety culture. The results show that teamwork climate, safety climate, job satisfaction, stress recognition and working conditions are major strengths and should be maintained in order to provide a better patient safety culture. On the contrary, perceptions of management and hospital handoffs and transitions are important weaknesses and should be improved immediately. Research limitations/implications - The research is restricted in generalizability: the assessment of patient safety culture covered only physicians and registered nurses. It would be interesting to further evaluate other staff's (e.g. technicians, pharmacists and others) opinions regarding patient safety culture in the hospital. Few studies have clearly evaluated the perceptions of healthcare organization management regarding patient safety culture. By investigating key characteristics (either strengths or weaknesses) that healthcare organizations should focus on, healthcare managers are able to take more effective actions to improve the level of patient safety.
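
    A minimal sketch of the importance-performance quadrant logic described above; the item scores are illustrative stand-ins, not the study's data.

```python
# Importance-performance analysis: each item is placed in a quadrant by
# comparing its mean importance and mean performance against grand means.
items = {                                 # (importance, performance)
    "teamwork climate":     (4.5, 4.2),
    "safety climate":       (4.6, 4.1),
    "job satisfaction":     (4.2, 4.0),
    "perceptions of mgmt":  (4.4, 3.1),
    "handoffs/transitions": (4.3, 3.0),
    "stress recognition":   (3.6, 3.9),
}
imp_mean = sum(v[0] for v in items.values()) / len(items)
perf_mean = sum(v[1] for v in items.values()) / len(items)

for name, (imp, perf) in items.items():
    if imp >= imp_mean and perf < perf_mean:
        quadrant = "concentrate here (weakness)"
    elif imp >= imp_mean:
        quadrant = "keep up the good work (strength)"
    elif perf >= perf_mean:
        quadrant = "possible overkill"
    else:
        quadrant = "low priority"
    print(f"{name:22s} -> {quadrant}")
```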

  15. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management as applicable to the IT universe, starting from the classical theory associated with project management techniques. It applies this theoretical analysis to the context of information technology in enterprises, drawing on the classic literature of traditional project management and focusing on its application in business information technology. From the literature reviewed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop Information Technology projects. The methodology prescribed a multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature fits the environment of IT project management. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects; this depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, which ultimately results in the model's merger into the company culture.

  16. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  17. Wire-mesh and ultrasound techniques applied for the characterization of gas-liquid slug flow

    Energy Technology Data Exchange (ETDEWEB)

    Ofuchi, Cesar Y.; Sieczkowski, Wytila Chagas; Neves Junior, Flavio; Arruda, Lucia V.R.; Morales, Rigoberto E.M.; Amaral, Carlos E.F.; Silva, Marco J. da [Federal University of Technology of Parana, Curitiba, PR (Brazil)], e-mails: ofuchi@utfpr.edu.br, wytila@utfpr.edu.br, neves@utfpr.edu.br, lvrarruda@utfpr.edu.br, rmorales@utfpr.edu.br, camaral@utfpr.edu.br, mdasilva@utfpr.edu.br

    2010-07-01

    Gas-liquid two-phase flows are found in a broad range of industrial applications, such as the chemical, petrochemical and nuclear industries, and quite often determine the efficiency and safety of processes and plants. Several experimental techniques have been proposed and applied to measure and quantify two-phase flows. In this experimental study, a wire-mesh sensor and an ultrasound technique are used and comparatively evaluated to study two-phase slug flows in horizontal pipes. The wire-mesh is an imaging technique and thus appropriate for scientific studies, while the ultrasound-based technique is robust and non-intrusive and hence well suited for industrial applications. Based on the measured raw data it is possible to extract specific slug flow parameters of interest, such as mean void fraction and characteristic frequency. The experiments were performed in the Thermal Sciences Laboratory (LACIT) at UTFPR, Brazil, where an experimental two-phase flow loop is available. The flow loop comprises a horizontal acrylic pipe of 26 mm diameter and 9 m length. Water and air were used to produce the two-phase flow under controlled conditions. The results show good agreement between the techniques. (author)
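
    A minimal sketch of how the two quoted parameters, mean void fraction and characteristic frequency, might be extracted from a void-fraction time series; the series here is synthetic and the sampling rate is an assumption.

```python
import numpy as np

fs = 1000.0                                # sampling rate (Hz), assumed
t = np.arange(0.0, 30.0, 1.0 / fs)
f_slug = 1.5                               # "true" slug frequency (Hz)
# Square-wave-like void fraction: gas slugs alternating with liquid.
alpha = 0.4 + 0.3 * np.sign(np.sin(2 * np.pi * f_slug * t))
alpha += 0.05 * np.random.randn(t.size)    # measurement noise

mean_void = alpha.mean()                   # mean void fraction
spectrum = np.abs(np.fft.rfft(alpha - mean_void)) ** 2
freqs = np.fft.rfftfreq(alpha.size, 1.0 / fs)
f_char = freqs[np.argmax(spectrum)]        # characteristic frequency
print(f"mean void fraction {mean_void:.2f}, slug frequency {f_char:.2f} Hz")
```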

  18. ANALYSIS OF COMPUTER AIDED PROCESS PLANNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Salim A. Saleh

    2013-05-01

    Full Text Available Computer Aided Process Planning (CAPP) has been recognized as playing a key role in Computer Integrated Manufacturing (CIM). It is used as a bridge linking CAD with CAM systems, in order to make full integration possible and so realize CIM. The benefits of CAPP in the real industrial environment are still to be achieved. Due to different manufacturing applications, many different CAPP systems have been developed, and the development of CAPP techniques calls for a summarized classification and a descriptive analysis. This paper presents the most important and well-known techniques of the available CAPP systems, which are based on the variant, generative or semi-generative methods, together with a descriptive analysis of their application possibilities.

  19. Advanced Imaging Techniques for Multiphase Flows Analysis

    Science.gov (United States)

    Amoresano, A.; Langella, G.; Di Santo, M.; Iodice, P.

    2017-08-01

    Advanced numerical techniques, such as fuzzy logic and neural networks, have been applied in this work to digital images acquired in two applications, a centrifugal pump and a stationary spray, in order to define, in a stochastic way, the gas-liquid interface evolution. Starting from the numeric matrix representing the image, it is possible to characterize geometrical parameters and the time evolution of the jet. The algorithm uses fuzzy logic to binarize the chromaticity of the pixels, exploiting the difference in light scattering between the gas and the liquid phase. Starting from a primary fixed threshold, the technique can separate 'gas' pixels from 'liquid' pixels, making it possible to define the most probable boundary lines of the spray. By acquiring the images continuously at a fixed frame rate, a finer threshold can be selected and, in the limit, the most probable geometrical parameters of the jet can be detected.
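
    A minimal sketch of the fuzzy binarization idea described above, with a sigmoid membership function around a primary threshold; the image, threshold and fuzziness width are assumptions.

```python
import numpy as np

# Fuzzy binarization: instead of a hard cut, each pixel gets a
# liquid-membership value from a sigmoid around the threshold;
# pixels are classed 'liquid' when membership exceeds 0.5.
rng = np.random.default_rng(0)
img = rng.uniform(0.0, 1.0, (64, 64))      # stand-in for a spray frame
threshold, width = 0.5, 0.1                # primary threshold, fuzziness

membership = 1.0 / (1.0 + np.exp(-(img - threshold) / width))
liquid_mask = membership > 0.5             # 'liquid' pixels
print("liquid fraction:", liquid_mask.mean())
```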

  20. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  1. Accelerometer Data Analysis and Presentation Techniques

    Science.gov (United States)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
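
    Two of the analyses listed above, interval root-mean-square acceleration versus time and power spectral density, can be sketched as follows on a synthetic accelerometer record; the sampling rate, interval length and signal content are assumptions.

```python
import numpy as np
from scipy import signal

fs = 250.0                                   # sampling rate (Hz), assumed
t = np.arange(0.0, 60.0, 1.0 / fs)
# Synthetic record: a 17 Hz vibration plus broadband noise (in g).
a = 1e-3 * np.sin(2 * np.pi * 17 * t) + 1e-4 * np.random.randn(t.size)

# Interval RMS acceleration versus time, over 5-second intervals.
interval = int(5 * fs)
n = (a.size // interval) * interval
rms = np.sqrt((a[:n].reshape(-1, interval) ** 2).mean(axis=1))
print("first interval RMS values (g):", rms[:3])

# Power spectral density via Welch's method.
f, psd = signal.welch(a, fs=fs, nperseg=4096)
print("dominant frequency (Hz):", f[np.argmax(psd)])
```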

  2. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system that includes a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human error, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  3. The correlated k-distribution technique as applied to the AVHRR channels

    Science.gov (United States)

    Kratz, David P.

    1995-01-01

    Correlated k-distributions have been created to account for the molecular absorption found in the spectral ranges of the five Advanced Very High Resolution Radiometer (AVHRR) satellite channels. The production of the k-distributions was based upon an exponential-sum fitting of transmissions (ESFT) technique which was applied to reference line-by-line absorptance calculations. To account for the overlap of spectral features from different molecular species, the present routines made use of the multiplicative transmissivity property, which allows for considerable flexibility, especially when altering relative mixing ratios of the various molecular species. To determine the accuracy of the correlated k-distribution technique as compared to the line-by-line procedure, atmospheric flux and heating rate calculations were run for a wide variety of atmospheric conditions. For the atmospheric conditions taken into consideration, the correlated k-distribution technique has yielded results within about 0.5% for both the cases where the satellite spectral response functions were applied and where they were not. The correlated k-distribution's principal advantage is that it can be incorporated directly into multiple scattering routines that consider scattering as well as absorption by clouds and aerosol particles.
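
    A minimal sketch of the exponential-sum representation behind the k-distribution, including the multiplicative overlap treatment mentioned above; the absorption coefficients and weights are illustrative, not fitted ESFT values.

```python
import numpy as np

# Band-averaged transmission as a weighted sum of exponentials,
# T(u) = sum_i w_i * exp(-k_i * u); multiplying the T's of different
# gases reproduces the spectral-overlap treatment described above.
k = np.array([0.1, 1.0, 10.0, 100.0])    # absorption coefficients
w = np.array([0.4, 0.3, 0.2, 0.1])       # quadrature weights, sum to 1

def transmission(u):
    return np.sum(w * np.exp(-k * u))

u = 0.05                                  # absorber amount along the path
print(transmission(u))                    # a single gas
print(transmission(u) * transmission(u))  # overlap via multiplication
```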

  4. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Since 1989, a technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over Fourier transforms for the analysis of signals. A review of the use of this technique across different fields of elemental analysis is presented
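
    A minimal sketch of one common WT application in spectral analysis, denoising by soft-thresholding of wavelet detail coefficients. It assumes the PyWavelets package (pywt) is available, and the spectrum, wavelet and threshold rule are illustrative choices.

```python
import numpy as np
import pywt  # assumes the PyWavelets package is installed

# Synthetic "spectrum": a Gaussian peak plus white noise.
x = np.linspace(0.0, 1.0, 1024)
clean = np.exp(-((x - 0.5) ** 2) / 0.002)
spectrum = clean + 0.05 * np.random.randn(x.size)

coeffs = pywt.wavedec(spectrum, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate
thr = sigma * np.sqrt(2 * np.log(spectrum.size))    # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                        for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")
print("max residual error:", np.abs(denoised - clean).max())
```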

  5. The Significance of Regional Analysis in Applied Geography.

    Science.gov (United States)

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  6. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electron bombardment of atoms or molecules (gas ion source) and thermal effect (thermionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas-source mass spectrometer is 10 to 20 times greater than that for the thermal ionization spectrometer, while the sample consumption is between 10^5 and 10^6 times greater. This proves that almost the entire sample is not needed for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique referred to as 'microfluorination' corrects this anomaly and exploits the advantages of the electron bombardment method of ionization

  7. Software engineering techniques applied to agricultural systems an object-oriented and UML approach

    CERN Document Server

    Papajorgji, Petraq J

    2014-01-01

    Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequence and collaboration diagrams are used to explain the dynamic and static aspects of the software system. This second edition includes: a new chapter on the Object Constraint Language (OCL), a new section dedicated to the Model-View-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools - the Virtual Enterprise and Olivia Nova, and a new chapter with exercises on conceptual modeling. It may be highly useful to undergraduate and graduate students as t...

  8. Applied methods and techniques for mechatronic systems modelling, identification and control

    CERN Document Server

    Zhu, Quanmin; Cheng, Lei; Wang, Yongji; Zhao, Dongya

    2014-01-01

    Applied Methods and Techniques for Mechatronic Systems brings together the relevant studies in mechatronic systems with the latest research from interdisciplinary theoretical studies, computational algorithm development and exemplary applications. Readers can easily tailor the techniques in this book to accommodate their ad hoc applications. The clear structure of each paper, background - motivation - quantitative development (equations) - case studies/illustration/tutorial (curve, table, etc.) is also helpful. It is mainly aimed at graduate students, professors and academic researchers in related fields, but it will also be helpful to engineers and scientists from industry. Lei Liu is a lecturer at Huazhong University of Science and Technology (HUST), China; Quanmin Zhu is a professor at University of the West of England, UK; Lei Cheng is an associate professor at Wuhan University of Science and Technology, China; Yongji Wang is a professor at HUST; Dongya Zhao is an associate professor at China University o...

  9. Forensic Analysis using Geological and Geochemical Techniques

    Science.gov (United States)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late eighteen hundreds, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS and geochemical mapping. Most of the concept studies have shown good potential, but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the EGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  10. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    Directory of Open Access Journals (Sweden)

    Amany AlShawi

    2016-01-01

    Full Text Available The popularity of cloud computing is increasing steadily. The purpose of this research was to enhance the security of the cloud using techniques such as data mining, with specific reference to the single cache system. From the findings of the research, it was observed that security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers, vendors, data distributors, and others. Further, data objects entered into the single cache system can be extended into 12 components. Database and SPSS modelers can be used to implement the same.

  11. D-stem: a parallel electron diffraction technique applied to nanomaterials.

    Science.gov (United States)

    Ganesh, K J; Kawasaki, M; Zhou, J P; Ferreira, P J

    2010-10-01

    An electron diffraction technique called D-STEM has been developed in a transmission electron microscopy/scanning transmission electron microscopy (TEM/STEM) instrument to obtain spot electron diffraction patterns from nanostructures as small as ∼3 nm. The electron ray path, achieved by configuring the pre- and post-specimen illumination lenses, enables the formation of a 1-2 nm near-parallel probe, which is used to obtain bright-field/dark-field STEM images. Under these conditions, the beam can be controlled and accurately positioned on the STEM image, at the nanostructure of interest, while sharp spot diffraction patterns are simultaneously recorded on the charge-coupled device camera. When integrated with software such as the Gatan™ STEM diffraction imaging and Automated Crystallography for TEM packages or NanoMEGAS Digistar™, the D-STEM technique is very powerful for obtaining automated orientation and phase maps based on diffraction information acquired on a pixel-by-pixel basis. The versatility of the D-STEM technique is demonstrated by applying it to nanoparticles, nanowires, and nano-interconnect structures.

  12. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening the universities and national research centers at the local, national and international levels. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of the issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  13. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
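
    A minimal sketch of the two measurements the technique builds on, normalized contrast versus time over an anomaly and a half-max width across a line profile; the cooling curves and profile here are synthetic stand-ins, not flash thermography data.

```python
import numpy as np

# Normalized contrast for a pixel over an anomaly, relative to a sound
# (reference) region, followed by a half-max width estimate.
t = np.linspace(0.01, 5.0, 200)                  # time after flash (s)
T_ref = 1.0 / np.sqrt(t)                         # sound-region cooling
T_def = T_ref * (1 + 0.2 * np.exp(-((np.log(t) + 0.5) ** 2)))
contrast = (T_def - T_ref) / T_ref               # normalized contrast

x = np.linspace(-10.0, 10.0, 201)                # line profile (mm)
profile = contrast.max() * np.exp(-(x ** 2) / 8.0)
half = profile >= 0.5 * profile.max()            # half-max criterion
width = x[half][-1] - x[half][0]
print(f"peak contrast {contrast.max():.3f}, half-max width {width:.1f} mm")
```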

  14. Signed directed social network analysis applied to group conflict

    DEFF Research Database (Denmark)

    Zheng, Quan; Skillicorn, David; Walther, Olivier

    2015-01-01

    Real-world social networks contain relationships of multiple different types, but this richness is often ignored in graph-theoretic modelling. We show how two recently developed spectral embedding techniques, for directed graphs (relationships are asymmetric) and for signed graphs (relationships are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions can...

  15. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Full Text Available Based on the movement process of throwing, and in order to further improve throwing technique in our country, this paper first illustrates the main factors that affect shot distance, via a combination of movement equations and geometrical analysis. It then gives the equation for the force that throwing athletes have to bear during the throwing movement, and derives the speed relationship between the joints during throwing and batting from a kinetic analysis of the throwing athletes' arms. The paper obtains the momentum relationship between the athletes' joints by means of rotational inertia analysis, and then establishes a restricted particle dynamics equation from the Lagrange equation. The obtained result shows that the momentum of throwing depends on the momentum of the athletes' wrist joints while batting.
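
    The paper's own movement equations are not reproduced in the abstract; for orientation, the standard projectile-range relation with release speed v, angle θ and release height h illustrates the kind of "movement equation and geometrical analysis" involved.

```latex
% Range of a projectile released at speed v, angle \theta, height h,
% under gravity g -- illustrative of the shot-distance analysis, not
% taken from the paper itself.
R = \frac{v\cos\theta}{g}\left(v\sin\theta
    + \sqrt{v^{2}\sin^{2}\theta + 2gh}\right)
```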

  16. Clustering Analysis within Text Classification Techniques

    Directory of Open Access Journals (Sweden)

    Madalina ZURINI

    2011-01-01

    Full Text Available The paper presents a personal approach to the main applications of classification in the knowledge-based society, by means of methods and techniques widespread in the literature. Text classification is covered in chapter two, where the main techniques used are described along with an integrated taxonomy. The transition is made through the concept of spatial representation. From the elementary elements of geometry and artificial intelligence analysis, spatial representation models are presented. Using a parallel approach, the spatial dimension is introduced into the classification process, and the main clustering methods are described in an aggregated taxonomy. As an example, spam and ham words are clustered and spatially represented, and the concepts of spam, ham, common and linkage words are presented and explained in the xOy space representation.

  17. Negative reinforcement in applied behavior analysis: an emerging technology.

    OpenAIRE

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these area...

  18. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  19. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
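
    A minimal sketch of the Monte Carlo parameter-variation scheme described above. The unfold step here is a placeholder (the real algorithm reconstructs a spectrum from the 18 channels), and the channel values and error sizes are assumptions.

```python
import numpy as np

# Perturb each channel voltage with its one-sigma Gaussian error, push
# every trial through the (stand-in) unfold step, and read error bars
# off the resulting flux distribution.
rng = np.random.default_rng(1)
n_channels, n_trials = 18, 1000
voltages = rng.uniform(0.5, 2.0, n_channels)     # measured signals
sigma = 0.05 * voltages                          # 5% one-sigma errors

def unfold(v):
    return v.sum()        # placeholder for the real unfold algorithm

trials = rng.normal(voltages, sigma, size=(n_trials, n_channels))
fluxes = np.array([unfold(v) for v in trials])
print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std():.3f}")
```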

  20. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  1. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  2. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    High-speed, high-resolution optical detection systems were used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. An easier approach was taken by using the MathCAD program as a programming tool, which is nevertheless powerful enough for the required calculation, plotting and file transfer. (Author)
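
    A minimal sketch of a direct numerical Abel inversion of a projected profile F(y) back to the radial profile f(r), tested on a uniform disc whose projection is analytic; the grid, disc radius and the simple trapezoidal quadrature are assumptions.

```python
import numpy as np

# Direct Abel inversion:
#   f(r) = -(1/pi) * integral_r^R  F'(y) / sqrt(y^2 - r^2) dy
def abel_invert(y, F):
    dFdy = np.gradient(F, y)
    f = np.zeros_like(y)
    for i in range(y.size - 1):
        r, yy, g = y[i], y[i + 1:], dFdy[i + 1:]
        integrand = g / np.sqrt(yy ** 2 - r ** 2)
        # trapezoidal quadrature, starting just above y = r to avoid 0/0
        f[i] = -np.sum(0.5 * (integrand[:-1] + integrand[1:])
                       * np.diff(yy)) / np.pi
    return f

# Test: a uniform disc f(r)=1 for r<R0 projects to F(y)=2*sqrt(R0^2-y^2).
R0 = 1.0
y = np.linspace(0.0, 2.0, 400)
F = 2.0 * np.sqrt(np.clip(R0 ** 2 - y ** 2, 0.0, None))
f = abel_invert(y, F)
print(f[50], f[300])    # ~1.0 inside the disc, ~0.0 outside
```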

  3. Zero order and signal processing spectrophotometric techniques applied for resolving interference of metronidazole with ciprofloxacin in their pharmaceutical dosage form.

    Science.gov (United States)

    Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed

    2016-02-05

    Four rapid, simple, accurate and precise spectrophotometric methods were used for the determination of ciprofloxacin in the presence of metronidazole as an interferent. The methods under study are area under the curve and simultaneous equations, in addition to smart signal processing techniques for manipulating ratio spectra, namely Savitzky-Golay filters and the continuous wavelet transform. All the methods were validated according to the ICH guidelines, where accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of ciprofloxacin in quality-control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
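
    A minimal sketch of the ratio-spectra manipulation with a Savitzky-Golay derivative filter (via SciPy's savgol_filter); the Gaussian stand-in spectra and filter settings are assumptions, not the paper's data.

```python
import numpy as np
from scipy.signal import savgol_filter

# Divide the mixture spectrum by the interferent's spectrum, then apply
# a Savitzky-Golay first-derivative filter: the derivative cancels the
# constant (interferent) term of the ratio spectrum.
wl = np.linspace(200.0, 400.0, 1000)              # wavelength grid (nm)
band = lambda c, w: np.exp(-((wl - c) ** 2) / (2 * w ** 2))
cipro, metro = band(275, 12), band(320, 15)       # stand-in spectra
mixture = 0.7 * cipro + 0.4 * metro

ratio = mixture / (metro + 1e-12)                 # divide by interferent
deriv = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)
print("analyte signal at extremum:", deriv[np.argmax(np.abs(deriv))])
```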

  4. Conference report: summary of the 2010 Applied Pharmaceutical Analysis Conference.

    Science.gov (United States)

    Unger, Steve E

    2011-01-01

    This year, the Applied Pharmaceutical Analysis meeting changed its venue to the Grand Tremont Hotel in Baltimore, MD, USA. Proximity to Washington presented the opportunity to have four speakers from the US FDA. The purpose of the 4-day conference is to provide a forum in which pharmaceutical and CRO scientists can discuss and develop best practices for scientific challenges in bioanalysis and drug metabolism. This year's theme was 'Bioanalytical and Biotransformation Challenges in Meeting Global Regulatory Expectations & New Technologies for Drug Discovery Challenges'. Applied Pharmaceutical Analysis continued its tradition of highlighting new technologies and its impact on drug discovery, drug metabolism and small molecule-regulated bioanalysis. This year, the meeting included an integrated focus on metabolism in drug discovery and development. Middle and large molecule (biotherapeutics) drug development, immunoassay, immunogenicity and biomarkers were also integrated into the forum. Applied Pharmaceutical Analysis offered an enhanced diversity of topics this year while continuing to share experiences of discovering and developing new medicines.

  5. Metal oxide collectors for storing matter technique applied in secondary ion mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Miśnik, Maciej [Institute of Tele and Radio Technology, ul. Ratuszowa 11, 03-450 Warszawa (Poland); Gdańsk University of Technology (Poland); Konarski, Piotr [Institute of Tele and Radio Technology, ul. Ratuszowa 11, 03-450 Warszawa (Poland); Zawada, Aleksander [Institute of Tele and Radio Technology, ul. Ratuszowa 11, 03-450 Warszawa (Poland); Military University of Technology, Warszawa (Poland)

    2016-03-15

    We present results of the use of metal and metal oxide substrates serving as collectors in 'storing matter', a quantitative technique of secondary ion mass spectrometry (SIMS). This technique allows the two basic processes of secondary ion formation in SIMS to be separated: the process of ion sputtering is decoupled from the process of ionisation. The technique allows sputtering of the analysed sample and storing of the sputtered material, with sub-monolayer coverage, onto a collector surface. Such deposits can then be analysed by SIMS, and as a result the so-called 'matrix effects' are significantly reduced. We deposit the sputtered material onto Ti and Cu substrates and also onto metal oxide substrates such as molybdenum, titanium, tin and indium oxides. The sputtering is carried out within the same vacuum chamber where the SIMS analysis of the collected material is performed. For sputtering and SIMS analysis of the deposited material we use a 5 keV Ar+ beam of 500 nA. The presented results are obtained with the use of stationary collectors. Here we present a case study of chromium. The obtained results show that the molybdenum and titanium oxide substrates used as collectors increase the useful yield by two orders of magnitude with respect to pure elemental collectors such as Cu and Ti. Here we define useful yield as the ratio of the number of secondary ions detected during SIMS analysis to the number of atoms sputtered during the deposition process.

  6. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  7. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized
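
    A minimal sketch of the amplitude-demodulation idea behind CSA, recovering a load-modulation frequency from a line current via the Hilbert envelope; the line and load frequencies, modulation depth and sampling rate are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

# A 60 Hz line current amplitude-modulated by a 7 Hz load fluctuation;
# the Hilbert envelope recovers the modulation for spectral analysis.
fs = 5000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
f_line, f_load = 60.0, 7.0                       # carrier and load (Hz)
current = (1 + 0.1 * np.sin(2 * np.pi * f_load * t)) * \
          np.sin(2 * np.pi * f_line * t)

envelope = np.abs(hilbert(current))              # amplitude demodulation
env = envelope - envelope.mean()
spec = np.abs(np.fft.rfft(env))
freqs = np.fft.rfftfreq(env.size, 1.0 / fs)
print("detected load modulation at", freqs[np.argmax(spec)], "Hz")
```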

  8. Risk Analysis Applied in Oil Exploration and Production | Mbanugo ...

    African Journals Online (AJOL)

    The analysis in this work is based on the actual field data obtained from Devon Exploration and Production Inc. The Net Present Value (NPV) and the Expected Monetary Value (EMV) were computed using Excel and Visual Basic to determine the viability of these projects. Although the use of risk management techniques ...
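
    As a rough illustration of the two metrics named in the abstract: NPV discounts a stream of cash flows to present value, and EMV weights the outcome NPVs by their probabilities. The sketch below uses invented cash flows and probabilities, not the Devon field data.

```python
# NPV and EMV sketch. All numbers are hypothetical, for illustration only.

def npv(rate, cash_flows):
    """Net Present Value of cash flows, where cash_flows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** i for i, cf in enumerate(cash_flows))

# Two possible project outcomes (values in $M): success vs. dry hole.
success = npv(0.10, [-50, 30, 30, 30, 20])   # drill succeeds
failure = npv(0.10, [-50])                   # drill fails, only sunk cost
p_success = 0.35                             # assumed geological chance of success

# Expected Monetary Value weights each outcome NPV by its probability.
emv = p_success * success + (1 - p_success) * failure
print(f"NPV(success) = {success:.1f}, NPV(failure) = {failure:.1f}, EMV = {emv:.1f}")
```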

  9. Microstructure analysis using SAXS/USAXS techniques

    International Nuclear Information System (INIS)

    Okuda, Hiroshi; Ochiai, Shojiro

    2010-01-01

    An introduction to small-angle X-ray scattering (SAXS) and ultra small-angle X-ray scattering (USAXS) is presented. SAXS is useful for microstructure analysis of age-hardenable alloys containing precipitates ranging from a few to several tens of nanometers in size. On the other hand, USAXS is appropriate for examining much larger microstructural heterogeneities, such as inclusions, voids, and large precipitates whose size is typically around one micrometer. Combining these two scattering methods, sometimes also with diffraction, it is possible to assess the hierarchical structure of the samples in-situ and nondestructively, ranging from phase identification and quantitative analysis of precipitation structures up to their mesoscopic aggregates, large voids and inclusions. From a technical viewpoint, USAXS requires some specific instrumentation for its optics. However, once a reasonable measurement has been made, the intensity analysis is the same as for conventional SAXS. In the present article, a short introduction to conventional SAXS is presented, and the analysis is then applied to USAXS data obtained for well-defined oxide particles whose average diameters are expected to be about 0.3 micrometers. (author)
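
    A standard first step in the intensity analysis mentioned above is a Guinier fit, which estimates the particle radius of gyration R_g from the low-q region via ln I(q) ≈ ln I(0) - q²R_g²/3. The sketch below fits synthetic data; the q-range and particle size are assumed for illustration.

```python
# Guinier analysis sketch: recover the radius of gyration from small-angle data.
# The synthetic intensity and q-range are illustrative assumptions.
import numpy as np

rg_true = 15.0                               # nm, assumed radius of gyration
q = np.linspace(0.01, 0.08, 50)              # 1/nm, Guinier region: q*Rg < ~1.3
I = 1e4 * np.exp(-(q * rg_true) ** 2 / 3.0)  # Guinier-law intensity

# Linear fit of ln I against q^2: slope = -Rg^2 / 3.
slope, intercept = np.polyfit(q ** 2, np.log(I), 1)
rg_fit = np.sqrt(-3.0 * slope)
print(f"fitted Rg = {rg_fit:.2f} nm (true {rg_true} nm)")
```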

  10. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state of the art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...
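
    For a concrete taste of the conventional spectral-analysis methods the book covers, the sketch below estimates a power spectral density with Welch's method of averaged periodograms; the two-tone test signal and its parameters are assumptions chosen for demonstration.

```python
# Welch PSD estimate of a noisy two-tone signal (illustrative parameters).
import numpy as np
from scipy.signal import welch

fs = 500.0                                   # sampling rate, Hz
t = np.arange(0, 10.0, 1.0 / fs)
x = (np.sin(2 * np.pi * 12 * t)              # 12 Hz component
     + 0.5 * np.sin(2 * np.pi * 40 * t)      # weaker 40 Hz component
     + np.random.default_rng(0).normal(0, 1, t.size))  # additive noise

# Averaging modified periodograms over overlapping segments trades
# frequency resolution for variance reduction.
f, pxx = welch(x, fs=fs, nperseg=1024)
peaks = f[np.argsort(pxx)[-2:]]
print(f"two strongest spectral peaks near: {sorted(peaks)} Hz")
```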

  11. Pattern-recognition techniques applied to performance monitoring of the DSS 13 34-meter antenna control assembly

    Science.gov (United States)

    Mellstrom, J. A.; Smyth, P.

    1991-01-01

    The results of applying pattern recognition techniques to diagnose fault conditions in the pointing system of one of the Deep Space Network's large antennas, the DSS 13 34-meter structure, are discussed. A previous article described an experiment whereby a neural network technique was used to identify fault classes using data obtained from a simulation model of the Deep Space Network (DSN) 70-meter antenna system. Described here is the extension of these classification techniques to the analysis of real data from the field. The general architecture and philosophy of an autonomous monitoring paradigm is described, and classification results are discussed and analyzed in this context. Key features of this approach include a probabilistic time-varying context model, the effective integration of signal processing and system identification techniques with pattern recognition algorithms, and the ability to calibrate the system given limited amounts of training data. Reported here are recognition accuracies in the 97 to 98 percent range for the particular fault classes included in the experiments.

  12. Animal research in the Journal of Applied Behavior Analysis.

    Science.gov (United States)

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.

  13. The spread of behavior analysis to the applied fields

    OpenAIRE

    Fraley, Lawrence E.

    1981-01-01

    This paper reviews the status of applied behavioral science as it exists in the various behavioral fields and considers the role of the Association for Behavior Analysis in serving those fields. The confounding effects of the traditions of psychology are discussed. Relevant issues are exemplified in the fields of law, communications, psychology, and education, but broader generalization is implied.

  14. Context, Cognition, and Biology in Applied Behavior Analysis.

    Science.gov (United States)

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  15. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  16. B. F. Skinner's Contributions to Applied Behavior Analysis

    Science.gov (United States)

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  17. Applied Behavior Analysis: Current Myths in Public Education

    Science.gov (United States)

    Fielding, Cheryl; Lowdermilk, John; Lanier, Lauren L.; Fannin, Abigail G.; Schkade, Jennifer L.; Rose, Chad A.; Simpson, Cynthia G.

    2013-01-01

    The effective use of behavior management strategies and related policies continues to be a debated issue in public education. Despite overwhelming evidence espousing the benefits of the implementation of procedures derived from principles based on the science of applied behavior analysis (ABA), educators often indicate many common misconceptions…

  18. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  19. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    Science.gov (United States)

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  20. Estimates of error introduced when one-dimensional inverse heat transfer techniques are applied to multi-dimensional problems

    International Nuclear Information System (INIS)

    Lopez, C.; Koski, J.A.; Razani, A.

    2000-01-01

    A study of the errors introduced when one-dimensional inverse heat conduction techniques are applied to problems involving two-dimensional heat transfer effects was performed. The geometry used for the study was a cylinder with dimensions similar to those of a typical container used for the transportation of radioactive materials. The finite element analysis code MSC P/Thermal was used to generate synthetic test data that were then used as input for an inverse heat conduction code. Four different problems were considered, including one with uniform flux around the outer surface of the cylinder and three with non-uniform flux applied over 360 deg, 180 deg, and 90 deg sections of the outer surface of the cylinder. The Sandia One-Dimensional Direct and Inverse Thermal (SODDIT) code was used to estimate the surface heat flux in all four cases. The error analysis was performed by comparing the results from SODDIT with the heat flux calculated from the temperature results obtained from P/Thermal. Results showed an increase in the error of the surface heat flux estimates as the applied heat became more localized. For the uniform case, SODDIT provided heat flux estimates with a maximum error of 0.5%, whereas for the non-uniform cases, the maximum errors were found to be about 3%, 7%, and 18% for the 360 deg, 180 deg, and 90 deg cases, respectively

  1. Determination of hydrogen diffusivity and permeability in W near room temperature applying a tritium tracer technique

    International Nuclear Information System (INIS)

    Ikeda, T.; Otsuka, T.; Tanabe, T.

    2011-01-01

    Tungsten is a primary candidate plasma facing material in ITER and beyond, owing to its good thermal properties and low erosion. However, hydrogen solubility and diffusivity near ITER operation temperatures (below 500 K) have scarcely been studied, mainly because the low hydrogen solubility and diffusivity at lower temperatures make the detection of hydrogen quite difficult. We have observed hydrogen plasma driven permeation (PDP) through nickel and tungsten near room temperature applying a tritium tracer technique, which is extremely sensitive for detecting tritium diluted in hydrogen. The apparent diffusion coefficients for PDP were determined from permeation lag times for the first time; those for nickel and tungsten were similar to, or a few times larger than, those for gas driven permeation (GDP). The permeation rates for PDP in nickel and tungsten were about 20 and 5 times larger, respectively, than those for GDP normalized to the same gas pressure.
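
    The lag-time analysis mentioned above is conventionally based on the Daynes-Barrer time-lag relation t_lag = L²/(6D), so an apparent diffusion coefficient follows directly from a measured permeation lag time. The sketch below applies this relation to invented numbers; the membrane thickness and lag time are assumptions, not the measurements of this work.

```python
# Time-lag (Daynes-Barrer) estimate of an apparent diffusion coefficient.
# t_lag = L^2 / (6 D)  =>  D = L^2 / (6 t_lag)
# The thickness and lag time below are illustrative, not measured values.

L = 0.5e-3        # membrane thickness, m (0.5 mm foil, assumed)
t_lag = 3.6e4     # permeation lag time, s (~10 h, assumed)

D = L ** 2 / (6.0 * t_lag)
print(f"apparent diffusion coefficient D = {D:.2e} m^2/s")
```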

  2. Vibration monitoring/diagnostic techniques, as applied to reactor coolant pumps

    International Nuclear Information System (INIS)

    Sculthorpe, B.R.; Johnson, K.M.

    1986-01-01

    With the increased awareness of reactor coolant pump (RCP) cracked shafts, brought about by the catastrophic shaft failure at Crystal River Unit 3, Florida Power and Light Company, in conjunction with Bently Nevada Corporation, undertook a test program at St. Lucie Nuclear Unit 2 to confirm the integrity of all four RCP shafts. Reactor coolant pumps play a major role in the operation of nuclear-powered generation facilities. The time required to disassemble and physically inspect a single RCP shaft would be lengthy, monetarily costly to the utility and its customers, and could cause unnecessary man-rem exposure to plant personnel. When properly applied, vibration instrumentation can increase unit availability/reliability, as well as provide enhanced diagnostic capability. This paper reviews monitoring benefits and diagnostic techniques applicable to RCPs/motor drives

  3. A systematic review of applying modern software engineering techniques to developing robotic systems

    Directory of Open Access Journals (Sweden)

    Claudia Pons

    2012-01-01

    Robots have become collaborators in our daily life. While robotic systems become more and more complex, the need to engineer their software development grows as well. The traditional approaches used in developing these software systems are reaching their limits; currently used methodologies and tools fall short of addressing the needs of such complex software development. Separating robotics knowledge from short-cycled implementation technologies is essential to foster reuse and maintenance. This paper presents a systematic literature review (SLR) of the current use of modern software engineering techniques for developing robotic software systems and their actual automation level. The survey was aimed at summarizing existing evidence concerning the application of such technologies to the field of robotic systems, in order to identify gaps in current research, suggest areas for further investigation, and provide a background for positioning new research activities.

  4. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis

  5. Data Analysis Techniques for Ligo Detector Characterization

    Science.gov (United States)

    Valdes Sanchez, Guillermo A.

    Gravitational-wave astronomy is a branch of astronomy which aims to use gravitational waves to collect observational data about astronomical objects and events such as black holes, neutron stars, supernovae, and processes including those of the early universe shortly after the Big Bang. Einstein first predicted gravitational waves in the early twentieth century, but it was not until September 14, 2015, that the Laser Interferometer Gravitational-Wave Observatory (LIGO) directly observed the first gravitational waves in history. LIGO consists of two twin detectors, one in Livingston, Louisiana and another in Hanford, Washington. Instrumental and sporadic noises limit the sensitivity of the detectors. Scientists conduct Data Quality studies to distinguish a gravitational-wave signal from the noise, and new techniques are continuously developed to identify, mitigate, and veto unwanted noise. This work presents the application of data analysis techniques, such as the Hilbert-Huang transform (HHT) and Kalman filtering (KF), in LIGO detector characterization. We investigated the application of HHT to characterize the gravitational-wave signal of the first detection; we also demonstrated the ability of HHT to identify noise originating from light being scattered by perturbed surfaces; and we estimated thermo-optical aberration using KF. We paid particular attention to the scattering application, for which a tool was developed to identify disturbed surfaces that originate scattering noise. The results considerably reduced the time needed to search for the scattering surface and helped LIGO commissioners to mitigate the noise.
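
    As a minimal illustration of the Kalman filtering mentioned above (a generic scalar example, not the LIGO implementation), the sketch below tracks a slowly drifting quantity from noisy readings; the process and measurement noise levels are assumed.

```python
# Scalar Kalman filter sketch: track a slowly drifting state from noisy samples.
# Noise levels and the simulated drift are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 200
truth = np.cumsum(rng.normal(0, 0.05, n))     # slowly drifting true state
z = truth + rng.normal(0, 0.5, n)             # noisy measurements

q, r = 0.05 ** 2, 0.5 ** 2                    # process / measurement variances
x, p = 0.0, 1.0                               # initial estimate and its variance
estimates = []
for zk in z:
    p = p + q                                 # predict: variance grows by process noise
    k = p / (p + r)                           # Kalman gain
    x = x + k * (zk - x)                      # update with measurement residual
    p = (1 - k) * p
    estimates.append(x)

rmse = np.sqrt(np.mean((np.array(estimates) - truth) ** 2))
print(f"RMSE of filtered estimate: {rmse:.3f} (raw noise sigma was 0.5)")
```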

  6. Nuclear analytical techniques applied to the large scale measurements of atmospheric aerosols in the amazon region

    International Nuclear Information System (INIS)

    Gerab, Fabio

    1996-03-01

    This work presents the characterization of the atmospheric aerosol collected at different places in the Amazon Basin. We studied both the biogenic emission from the forest and the particulate material which is emitted to the atmosphere due to the large scale man-made burning during the dry season. The samples were collected during a three year period at two different locations in the Amazon, namely the Alta Floresta (MT) and Serra do Navio (AP) regions, using stacked unit filters. These regions represent two different atmospheric compositions: the aerosol is dominated by the forest natural biogenic emission at Serra do Navio, while at Alta Floresta it presents an important contribution from the man-made burning during the dry season. At Alta Floresta we took samples in gold shops in order to characterize mercury emission to the atmosphere related to the gold prospecting activity in the Amazon. Airplanes were used for aerosol sampling during the 1992 and 1993 dry seasons to characterize the atmospheric aerosol contents from man-made burning in large Amazonian areas. The samples were analyzed using several nuclear analytical techniques: Particle Induced X-ray Emission for the quantitative analysis of trace elements with atomic number above 11; Particle Induced Gamma-ray Emission for the quantitative analysis of Na; and a proton microprobe for the characterization of individual particles of the aerosol. A reflectance technique was used for black carbon quantification, gravimetric analysis to determine the total atmospheric aerosol concentration, and Cold Vapor Atomic Absorption Spectroscopy for quantitative analysis of mercury in the particulate from the Alta Floresta gold shops. Ion chromatography was used to quantify ionic contents of aerosols from the fine mode particulate samples from Serra do Navio. Multivariate statistical analysis was used in order to identify and characterize the sources of the atmospheric aerosol present in the sampled regions. (author)

  7. How Can Synchrotron Radiation Techniques Be Applied for Detecting Microstructures in Amorphous Alloys?

    Directory of Open Access Journals (Sweden)

    Gu-Qing Guo

    2015-11-01

    In this work, we study how synchrotron radiation techniques can be applied to detect the microstructure of metallic glass (MG). The unit cells are the basic structural units in crystals, whereas it has been suggested that the co-existence of various clusters may be the universal structural feature in MG. It is therefore a challenge to detect microstructures of MG even at the short-range scale by directly using synchrotron radiation techniques, such as X-ray diffraction and X-ray absorption methods. Here, a feasible scheme is developed in which state-of-the-art synchrotron radiation-based experiments are combined with simulations to investigate the microstructure in MG. By studying a typical MG composition (Zr70Pd30), it is found that various clusters do co-exist in its microstructure, and icosahedral-like clusters are the dominant structural units. This structural feature explains why an icosahedral quasicrystalline phase precipitates prior to the phase transformation from glass to crystal when Zr70Pd30 MG is heated.

  8. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jeffs, S.P., E-mail: s.p.jeffs@swansea.ac.uk [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Lancaster, R.J. [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Garcia, T.E. [IUTA (University Institute of Industrial Technology of Asturias), University of Oviedo, Edificio Departamental Oeste 7.1.17, Campus Universitario, 33203 Gijón (Spain)

    2015-06-11

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep dominated applications, such as the power generation industry and in particular nuclear plant, where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k{sub SP} method, which has been proven to be an effective tool across several material systems. The current work now explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.
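
    For a concrete sense of the Monkman-Grant style of creep-data interpretation mentioned above (a sketch with invented rupture data, not the paper's k{sub SP} correlation): the classical relation t_r · (minimum creep rate)^m = C becomes linear in log-log space, so the constants follow from a simple fit.

```python
# Monkman-Grant fit sketch: log(t_r) = log(C) - m * log(min_creep_rate).
# The rupture-time data below are invented for illustration.
import numpy as np

min_creep_rate = np.array([1e-9, 5e-9, 2e-8, 1e-7, 6e-7])  # 1/s, assumed
t_rupture = np.array([9.5e7, 2.1e7, 5.6e6, 1.2e6, 2.2e5])  # s, assumed

# Linear regression in log-log coordinates.
slope, intercept = np.polyfit(np.log10(min_creep_rate), np.log10(t_rupture), 1)
m, C = -slope, 10.0 ** intercept
print(f"Monkman-Grant exponent m = {m:.2f}, constant C = {C:.3g}")

# Extrapolate: predicted rupture life at a lower creep rate.
rate_new = 1e-10
print(f"predicted t_r at {rate_new:.0e}/s: {C * rate_new ** (-m):.3g} s")
```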

  9. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    Science.gov (United States)

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis. PMID:22477993

  10. A numerical study of different projection-based model reduction techniques applied to computational homogenisation

    Science.gov (United States)

    Soldner, Dominic; Brands, Benjamin; Zabihyan, Reza; Steinmann, Paul; Mergheim, Julia

    2017-10-01

    Computing the macroscopic material response of a continuum body commonly involves the formulation of a phenomenological constitutive model. However, the response is mainly influenced by the heterogeneous microstructure. Computational homogenisation can be used to determine the constitutive behaviour on the macro-scale by solving a boundary value problem at the micro-scale for every so-called macroscopic material point within a nested solution scheme. Hence, this procedure requires the repeated solution of similar microscopic boundary value problems. To reduce the computational cost, model order reduction techniques can be applied. An important aspect thereby is the robustness of the obtained reduced model. Within this study, reduced-order modelling (ROM) is applied to the boundary value problem on the micro-scale for the geometrically nonlinear case using hyperelastic materials. This involves the Proper Orthogonal Decomposition (POD) for the primary unknown and hyper-reduction methods for the arising nonlinearity. Therein, three methods for hyper-reduction, differing in how the nonlinearity is approximated and in the subsequent projection, are compared in terms of accuracy and robustness. Introducing interpolation or Gappy-POD based approximations may not preserve the symmetry of the system tangent, rendering the widely used Galerkin projection sub-optimal. Hence, a different projection related to a Gauss-Newton scheme (Gauss-Newton with Approximated Tensors, GNAT) is favoured to obtain an optimal projection and a robust reduced model.
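
    At its core, the Proper Orthogonal Decomposition step described above reduces to a singular value decomposition of a snapshot matrix, with the leading left singular vectors forming the reduced basis. Below is a minimal sketch on random stand-in snapshots; real snapshots would come from micro-scale boundary value solutions.

```python
# POD basis from a snapshot matrix via truncated SVD.
# Snapshots here are random stand-ins for micro-scale displacement solutions.
import numpy as np

n_dof, n_snap = 1000, 40
rng = np.random.default_rng(2)
# Build snapshots with a hidden low-rank structure plus small noise.
modes = rng.normal(size=(n_dof, 5))
coeffs = rng.normal(size=(5, n_snap))
S = modes @ coeffs + 0.01 * rng.normal(size=(n_dof, n_snap))

U, sigma, _ = np.linalg.svd(S, full_matrices=False)

# Truncate where the captured "energy" (cumulative squared singular values)
# exceeds a tolerance, e.g. 99.99%.
energy = np.cumsum(sigma ** 2) / np.sum(sigma ** 2)
k = int(np.searchsorted(energy, 0.9999)) + 1
basis = U[:, :k]                     # reduced basis, n_dof x k
print(f"retained {k} POD modes out of {n_snap} snapshots")

# A full-order state u is then approximated as basis @ (basis.T @ u).
```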

  11. Investigation of finite difference recession computation techniques applied to a nonlinear recession problem

    Energy Technology Data Exchange (ETDEWEB)

    Randall, J D

    1978-03-01

    This report presents comparisons of results of five implicit and explicit finite difference recession computation techniques with results from a more accurate "benchmark" solution applied to a simple one-dimensional nonlinear ablation problem. In the comparison problem a semi-infinite solid is subjected to a constant heat flux at its surface and the rate of recession is controlled by the solid material's latent heat of fusion. All thermal properties are assumed constant. The five finite difference methods include three front node dropping schemes, a back node dropping scheme, and a method in which the ablation problem is embedded in an inverse heat conduction problem and no nodes are dropped. Constancy of thermal properties and the semi-infinite and one-dimensional nature of the problem at hand are not necessary assumptions in applying the methods studied to more general problems. The best of the methods studied will be incorporated into APL's Standard Heat Transfer Program.

  12. Application of unsupervised analysis techniques to lung cancer patient data.

    Science.gov (United States)

    Lynch, Chip M; van Berkel, Victor H; Frieboes, Hermann B

    2017-01-01

    This study applies unsupervised machine learning techniques for classification and clustering to a collection of descriptive variables from 10,442 lung cancer patient records in the Surveillance, Epidemiology, and End Results (SEER) program database. The goal is to automatically classify lung cancer patients into groups based on clinically measurable disease-specific variables in order to estimate survival. Variables selected as inputs for machine learning include Number of Primaries, Age, Grade, Tumor Size, Stage, and TNM, which are numeric or can readily be converted to numeric type. Minimal up-front processing of the data enables exploring the out-of-the-box capabilities of established unsupervised learning techniques, with little human intervention through the entire process. The output of the techniques is used to predict survival time, with the efficacy of the prediction representing a proxy for the usefulness of the classification. A basic single variable linear regression against each unsupervised output is applied, and the associated Root Mean Squared Error (RMSE) value is calculated as a metric to compare between the outputs. The results show that self-ordering maps exhibit the best performance, while k-Means performs the best of the simpler classification techniques. Predicting against the full data set, it is found that their respective RMSE values (15.591 for self-ordering maps and 16.193 for k-Means) are comparable to supervised regression techniques, such as Gradient Boosting Machine (RMSE of 15.048). We conclude that unsupervised data analysis techniques may be of use to classify patients by defining the classes as effective proxies for survival prediction.
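
    A minimal sketch of the evaluation loop described above, using invented features and survival times rather than SEER records: cluster patients with k-Means, then regress survival against the cluster assignment and score the fit with RMSE.

```python
# k-Means clustering followed by a single-variable regression against the
# cluster labels, scored by RMSE. Data are synthetic stand-ins for SEER records.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 500
X = rng.normal(size=(n, 6))                  # stand-ins for Age, Grade, Stage, ...
survival = 30 + 5 * X[:, 0] - 4 * X[:, 2] + rng.normal(0, 10, n)  # months

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# Regress survival on the cluster label alone, as a proxy for cluster usefulness.
reg = LinearRegression().fit(labels.reshape(-1, 1), survival)
pred = reg.predict(labels.reshape(-1, 1))
rmse = np.sqrt(np.mean((pred - survival) ** 2))
print(f"RMSE of survival predicted from cluster label: {rmse:.2f} months")
```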

  13. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  14. Applying the GNSS Volcanic Ash Plume Detection Technique to Consumer Navigation Receivers

    Science.gov (United States)

    Rainville, N.; Palo, S.; Larson, K. M.

    2017-12-01

    Global Navigation Satellite Systems (GNSS) such as the Global Positioning System (GPS) rely on predictably structured and constant power RF signals to fulfill their primary use for navigation and timing. When the received strength of GNSS signals deviates from the expected baseline, it is typically due to a change in the local environment. This can occur when signal reflections from the ground are modified by changes in snow or soil moisture content, as well as by attenuation of the signal from volcanic ash. This effect allows GNSS signals to be used as a source for passive remote sensing. Larson et al. (2017) have developed a detection technique for volcanic ash plumes based on the attenuation seen at existing geodetic GNSS sites. Since these existing networks are relatively sparse, this technique has been extended to use lower cost consumer GNSS receiver chips to enable higher density measurements of volcanic ash. These low-cost receiver chips have been integrated into a fully stand-alone sensor, with independent power, communications, and logging capabilities as part of a Volcanic Ash Plume Receiver (VAPR) network. A mesh network of these sensors transmits data to a local base-station which then streams the data in real time to a web-accessible server. Initial testing of this sensor network has revealed that a different detection approach is necessary when using consumer GNSS receivers and antennas. The techniques to filter and process the lower quality data from consumer receivers will be discussed and will be applied to initial results from a functioning VAPR network installation.

  15. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques

    Science.gov (United States)

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic index (Io) in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation map (CPNV). For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined using error statistics for test data, derived from comparison of the predicted and measured values. The best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals, which showed the highest R² for the regression between observed and predicted values and the lowest root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island since it permits one to fully account for easily available geographic information but also takes into account local variation of climatic data.
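
    Of the interpolation methods compared above, inverse distance weighting is the simplest to state: the predicted value at a point is a distance-weighted average of the station observations. The sketch below implements basic IDW on invented station data; the power parameter p = 2 is a common default, not a value from the study.

```python
# Inverse distance weighting (IDW) interpolation sketch with invented stations.
import numpy as np

# Station coordinates (x, y) and observed values (e.g., a bioclimatic index).
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 7.0]])
values = np.array([12.0, 18.0, 9.0, 15.0, 11.0])

def idw(point, stations, values, p=2.0):
    """Predict at `point` as an inverse-distance-weighted mean of observations."""
    d = np.linalg.norm(stations - point, axis=1)
    if np.any(d < 1e-12):                 # exactly on a station: return its value
        return values[np.argmin(d)]
    w = 1.0 / d ** p
    return np.sum(w * values) / np.sum(w)

print(f"IDW estimate at (4, 4): {idw(np.array([4.0, 4.0]), stations, values):.2f}")
```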

  16. Chromatographic screening techniques in systematic toxicological analysis.

    Science.gov (United States)

    Drummer, O H

    1999-10-15

    A review of techniques used to screen biological specimens for the presence of drugs was conducted with particular reference to systematic toxicological analysis. Extraction systems of both the liquid-liquid and solid-phase type show little apparent difference in their relative ability to extract a range of drugs according to their physico-chemical properties, although mixed-phase SPE extraction is a preferred technique for GC-based applications, and liquid-liquid extraction was preferred for HPLC-based applications. No one chromatographic system has been shown to be capable of detecting a full range of common drugs of abuse and common ethical drugs, hence two or more assays are required for laboratories wishing to cover a reasonably comprehensive range of drugs of toxicological significance. While immunoassays are invariably used to screen for drugs of abuse, chromatographic systems relying on derivatization and capable of extracting both acidic and basic drugs would be capable of screening a limited range of targeted drugs. Drugs most difficult to detect in systematic toxicological analysis include LSD, psilocin, THC and its metabolites, fentanyl and its designer derivatives, some potent opiates, potent benzodiazepines and some potent neuroleptics, many of the newer anti-convulsants, alkaloids colchicine, amanitins, aflatoxins, antineoplastics, coumarin-based anti-coagulants, and a number of cardiovascular drugs. The widespread use of LC-MS and LC-MS-MS for specific drug detection and the emergence of capillary electrophoresis linked to MS and MS-MS provide an exciting possibility for the future to increase the range of drugs detected in any one chromatographic screening system.

  17. Reduction and analysis techniques for infrared imaging data

    Science.gov (United States)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages: acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near infrared imaging.

  18. Space Econometry: new techniques for the regional analysis. An application to the European regions

    Directory of Open Access Journals (Sweden)

    Esther Vayá Valcarcel

    2002-01-01

    The goal of this paper is to help spread the techniques of spatial econometrics in our country, offering an overview of their theoretical and applied aspects. Specifically, we introduce the two types of spatial data analysis: exploratory analysis, focused on univariate study, and the study of spatial autocorrelation in a regression model. Additionally, we apply some of the techniques to the case of the European Union regions.
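
    The usual first statistic in the exploratory spatial analysis mentioned above is Moran's I, which measures spatial autocorrelation as I = (n/W) · Σ_ij w_ij (x_i - x̄)(x_j - x̄) / Σ_i (x_i - x̄)². The sketch below computes it for a toy set of regions; the contiguity weights and regional values are invented.

```python
# Moran's I for spatial autocorrelation on a toy 4-region example.
# The contiguity weights and regional values are invented for illustration.
import numpy as np

x = np.array([2.0, 3.0, 10.0, 11.0])      # regional values (e.g., GDP per capita)
W = np.array([[0, 1, 0, 0],               # W[i, j] = 1 if regions i, j are neighbours
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

n = x.size
z = x - x.mean()
numerator = np.sum(W * np.outer(z, z))    # sum_ij w_ij * z_i * z_j
I = (n / W.sum()) * numerator / np.sum(z ** 2)
# Positive values indicate spatial clustering; the expected value under
# no autocorrelation is -1/(n-1).
print(f"Moran's I = {I:.3f}")
```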

  19. Discrimination and classification techniques applied on Mallotus and Phyllanthus high performance liquid chromatography fingerprints.

    Science.gov (United States)

    Viaene, J; Goodarzi, M; Dejaegher, B; Tistaert, C; Hoang Le Tuan, A; Nguyen Hoai, N; Chau Van, M; Quetin-Leclercq, J; Vander Heyden, Y

    2015-06-02

    Mallotus and Phyllanthus genera, both containing several species commonly used as traditional medicines around the world, are the subjects of this discrimination and classification study. The objective of this study was to compare different discrimination and classification techniques to distinguish the two genera (Mallotus and Phyllanthus) on the one hand, and the six species (Mallotus apelta, Mallotus paniculatus, Phyllanthus emblica, Phyllanthus reticulatus, Phyllanthus urinaria L. and Phyllanthus amarus), on the other. Fingerprints of 36 samples from the 6 species were developed using reversed-phase high-performance liquid chromatography with ultraviolet detection (RP-HPLC-UV). After fingerprint data pretreatment, first an exploratory data analysis was performed using Principal Component Analysis (PCA), revealing two outlying samples, which were excluded from the calibration set used to develop the discrimination and classification models. Models were built by means of Linear Discriminant Analysis (LDA), Quadratic Discriminant Analysis (QDA), Classification and Regression Trees (CART) and Soft Independent Modeling of Class Analogy (SIMCA). Application of the models on the total data set (outliers included) confirmed a possible labeling issue for the outliers. LDA, QDA and CART, independently of the pretreatment, or SIMCA after "normalization and column centering (N_CC)" or after "Standard Normal Variate transformation and column centering (SNV_CC)" were found best to discriminate the two genera, while LDA after column centering (CC), N_CC or SNV_CC; QDA after SNV_CC; and SIMCA after N_CC or after SNV_CC best distinguished between the 6 species. As classification technique, SIMCA after N_CC or after SNV_CC results in the best overall sensitivity and specificity. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. 2D and 3D optical diagnostic techniques applied to Madonna dei Fusi by Leonardo da Vinci

    Science.gov (United States)

    Fontana, R.; Gambino, M. C.; Greco, M.; Marras, L.; Materazzi, M.; Pampaloni, E.; Pelagotti, A.; Pezzati, L.; Poggi, P.; Sanapo, C.

    2005-06-01

    3D measurement and modelling have been traditionally applied to statues, buildings, archeological sites or similar large structures, but rarely to paintings. Recently, however, 3D measurements have been performed successfully also on easel paintings, allowing the painting's surface to be detected and documented. We used 3D models to integrate the results of various 2D imaging techniques on a common reference frame. These applications show how the 3D shape information, complemented with 2D colour maps as well as with other types of sensory data, provides the most interesting information. The 3D data acquisition was carried out by means of two devices: a high-resolution laser micro-profilometer, composed of a commercial distance meter mounted on a scanning device, and a laser-line scanner. The 2D data acquisitions were carried out using a scanning device for simultaneous RGB colour imaging and IR reflectography, and a UV fluorescence multispectral image acquisition system. We present here the results of the techniques described, applied to the analysis of an important painting of the Italian Renaissance: 'Madonna dei Fusi', attributed to Leonardo da Vinci.

  1. X-ray fluorescence applied to the analysis of alloys

    International Nuclear Information System (INIS)

    Gutierrez, D.A.

    1997-01-01

    This work is based on the use of X-ray fluorescence. The purpose of this non-destructive testing technique is to establish a routine method for controlling the composition of industrial samples. The analysis combines the Rasberry-Heinrich and Claisse-Thinh algorithms, together with the numerical implementation of techniques unusual in this type of analysis, such as linear programming applied to the solution of overdetermined systems of equations, and the use of relaxation methods to facilitate convergence to the solutions. (author) [es
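
    As a minimal illustration of solving an overdetermined calibration system of the kind mentioned above (with invented intensities and concentrations, and not the Rasberry-Heinrich or Claisse-Thinh formulations themselves): with more measured standards than unknown coefficients, a least-squares solution is the natural approach.

```python
# Least-squares solution of an overdetermined linear calibration system.
# Measured intensities and known concentrations are invented for illustration.
import numpy as np

# Each row: intensities of 3 analyte lines for one calibration standard.
I_meas = np.array([[1.02, 0.31, 0.11],
                   [0.48, 0.95, 0.22],
                   [0.25, 0.40, 0.88],
                   [0.80, 0.55, 0.33],
                   [0.15, 0.20, 0.95]])
# Known concentration of one element in each standard (wt%).
c_known = np.array([12.1, 6.3, 3.5, 9.8, 2.2])

# Fit coefficients a so that c ~= I_meas @ a; 5 equations, 3 unknowns.
a, residuals, rank, _ = np.linalg.lstsq(I_meas, c_known, rcond=None)
print("calibration coefficients:", np.round(a, 3))

# Predict the concentration for a new, unknown sample's intensities.
I_new = np.array([0.60, 0.50, 0.40])
print(f"predicted concentration: {I_new @ a:.2f} wt%")
```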

  2. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one or two semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical imaging, and heat and wave equations. Fo...

  3. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.

  4. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  5. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  7. Maintenance Audit through Value Analysis Technique: A Case Study

    Science.gov (United States)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes and the increase in the requirements of quality and service have forced a change in the design and application of maintenance, as well as the way in which it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company. As a result, the maintenance functions as a whole have to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibility, but rather leads to the need to control each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. By means of rules, the expert system uses the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, in the area of rule-based expert systems.
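
    The SMART step mentioned above reduces to a normalized weighted sum: each decision function receives a weight, each alternative a score, and the weighted scores aggregate into a single rating. The sketch below is a generic SMART scoring pass with invented criteria and scores, not the paper's audit rules.

```python
# SMART (Simple Multi-Attribute Rating Technique) scoring sketch.
# Criteria, weights, and scores are invented for illustration.

criteria_weights = {           # raw importance weights for maintenance functions
    "preventive_plan": 40,
    "spare_parts": 25,
    "documentation": 15,
    "staff_training": 20,
}
# Scores (0-10) of two maintenance sections against each criterion.
scores = {
    "section_A": {"preventive_plan": 7, "spare_parts": 5, "documentation": 8, "staff_training": 6},
    "section_B": {"preventive_plan": 4, "spare_parts": 9, "documentation": 6, "staff_training": 7},
}

total = sum(criteria_weights.values())
weights = {c: w / total for c, w in criteria_weights.items()}  # normalize to sum 1

for section, s in scores.items():
    rating = sum(weights[c] * s[c] for c in weights)           # weighted aggregate
    print(f"{section}: SMART rating = {rating:.2f} / 10")
```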

  8. Visual Evaluation Techniques for Skill Analysis.

    Science.gov (United States)

    Brown, Eugene W.

    1982-01-01

    Visual evaluation techniques provide the kinesiologist with a method of evaluating physical skill performance. The techniques are divided into five categories: (1) vantage point; (2) movement simplification; (3) balance and stability; (4) movement relationships; and (5) range of movement. (JN)

  9. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    Science.gov (United States)

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists.

  10. Applied research on air pollution using nuclear-related analytical techniques. Report on the second research co-ordination meeting

    International Nuclear Information System (INIS)

    1995-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which started in 1992, and is scheduled to run until early 1997. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in air particulate matter. The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural area). This document reports the discussions held during the second Research Co-ordination Meeting (RCM) for the CRP which took place at ANSTO in Menai, Australia. (author)

  11. Modern structure of methods and techniques of marketing research, applied by the world and Ukrainian research companies

    Directory of Open Access Journals (Sweden)

    Bezkrovnaya Yulia

    2015-08-01

    The article presents the results of an empirical justification of the structure of methods and techniques of marketing research into consumer decisions, as applied by international and Ukrainian research companies.

  12. New analytical techniques for cuticle chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schulten, H.R. [Fachhochschule Fresenius, Dept. of Trace Analysis, Wiesbaden (Germany)

    1994-12-31

    1) The analytical methodologies of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  13. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodologies of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  14. Applying machine-learning techniques to Twitter data for automatic hazard-event classification.

    Science.gov (United States)

    Filgueira, R.; Bee, E. J.; Diaz-Doce, D.; Poole, J., Sr.; Singh, A.

    2017-12-01

    The constant flow of information offered by tweets provides valuable information about all sorts of events at high temporal and spatial resolution. Over the past year we have been analyzing geological hazards/phenomena in real time, such as earthquakes, volcanic eruptions, landslides, floods or the aurora, as part of the GeoSocial project, by geo-locating tweets filtered by keywords in a web-map. However, not all the filtered tweets are related to hazard/phenomenon events. This work explores two classification techniques for automatic hazard-event categorization based on tweets about the "Aurora". First, tweets were filtered using aurora-related keywords, removing stop words and selecting the ones written in English. For classifying the remainder into "aurora-event" or "no-aurora-event" categories, we compared two state-of-the-art techniques: Support Vector Machine (SVM) and Deep Convolutional Neural Network (CNN) algorithms. Both approaches belong to the family of supervised learning algorithms, which make predictions based on a labelled training dataset. Therefore, we created a training dataset by tagging 1200 tweets between the two categories. The general form of SVM is used to separate two classes by a function (kernel). We compared the performance of four different classifiers (Linear Regression, Logistic Regression, Multinomial Naïve Bayes and Stochastic Gradient Descent) provided by the Scikit-Learn library, using our training dataset to build the classifier. The results showed that Logistic Regression (LR) achieves the best accuracy (87%), so we selected the SVM-LR classifier to categorize a large collection of tweets using the "dispel4py" framework. Later, we developed a CNN classifier, where the first layer embeds words into low-dimensional vectors. The next layer performs convolutions over the embedded word vectors. Results from the convolutional layer are max-pooled into a long feature vector, which is classified using a softmax layer. The CNN's accuracy
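
    A minimal sketch of the kind of tweet classifier described above (illustrative only; the GeoSocial training data, features, and hyperparameters are not reproduced here): vectorize the text with TF-IDF and fit a logistic regression, the stock scikit-learn recipe for binary text classification.

```python
# TF-IDF + logistic regression tweet classifier sketch with toy labelled data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set: 1 = aurora event, 0 = unrelated mention.
tweets = [
    "amazing aurora borealis over the bay tonight",
    "green lights dancing in the sky right now",
    "aurora is my favourite princess movie",
    "named my cat Aurora yesterday",
    "strong geomagnetic storm, aurora visible from town",
    "Aurora restaurant has great pasta",
]
labels = [1, 1, 0, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(tweets, labels)

print(clf.predict(["incredible aurora display over the lake",
                   "watching the movie Aurora with the kids"]))
```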

  15. Applying DEA sensitivity analysis to efficiency measurement of Vietnamese universities

    Directory of Open Access Journals (Sweden)

    Thi Thanh Huyen Nguyen

    2015-11-01

    Full Text Available The primary purpose of this study is to measure the technical efficiency of 30 doctorate-granting universities (universities or higher education institutes with PhD training programs) in Vietnam by applying the sensitivity analysis of data envelopment analysis (DEA). The study uses eight input-output specifications, formed by replacing as well as aggregating/disaggregating variables. The measurement results allow us to examine how sensitive the efficiency scores of these universities are to the choice of variables. The findings also show the impact of the variables on efficiency and its "sustainability".
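
    The excerpt does not reproduce the study's DEA formulation; as a hedged illustration, the standard input-oriented CCR envelopment model can be solved for each decision-making unit (DMU) with an ordinary LP solver. The two-input, one-output data below are invented.

      # Input-oriented CCR DEA sketch solved with SciPy's LP solver.
      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[50, 40, 60, 55, 70],        # input 1 (e.g. staff)
                    [12, 10, 15, 11, 20]])       # input 2 (e.g. budget)
      Y = np.array([[300, 280, 310, 330, 350]])  # output (e.g. graduates)
      n = X.shape[1]

      for k in range(n):
          c = np.zeros(n + 1)
          c[0] = 1.0  # decision variables: [theta, lambda_1..lambda_n]
          # Inputs:  sum_j lambda_j * x_ij <= theta * x_ik
          A_in = np.hstack([-X[:, [k]], X])
          # Outputs: sum_j lambda_j * y_rj >= y_rk
          A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
          res = linprog(c,
                        A_ub=np.vstack([A_in, A_out]),
                        b_ub=np.concatenate([np.zeros(X.shape[0]), -Y[:, k]]),
                        bounds=[(None, None)] + [(0, None)] * n)
          print(f"DMU {k}: efficiency = {res.x[0]:.3f}")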

  16. Situational Awareness Applied to Geology Field Mapping using Integration of Semantic Data and Visualization Techniques

    Science.gov (United States)

    Houser, P. I. Q.

    2017-12-01

    21st century earth science is data-intensive, characterized by heterogeneous, sometimes voluminous collections representing phenomena at different scales collected for different purposes and managed in disparate ways. However, much of the earth's surface still requires boots-on-the-ground, in-person fieldwork in order to detect the subtle variations from which humans can infer complex structures and patterns. Nevertheless, field experiences can and should be enabled and enhanced by a variety of emerging technologies. The goal of the proposed research project is to pilot test emerging data integration, semantic and visualization technologies for evaluation of their potential usefulness in the field sciences, particularly in the context of field geology. The proposed project will investigate new techniques for data management and integration enabled by semantic web technologies, along with new techniques for augmented reality that can operate on such integrated data to enable in situ visualization in the field. The research objectives include: Develop new technical infrastructure that applies target technologies to field geology; Test, evaluate, and assess the technical infrastructure in a pilot field site; Evaluate the capabilities of the systems for supporting and augmenting field science; and Assess the generality of the system for implementation in new and different types of field sites. Our hypothesis is that these technologies will enable what we call "field science situational awareness" - a cognitive state formerly attained only through long experience in the field - that is highly desirable but difficult to achieve in time- and resource-limited settings. Expected outcomes include elucidation of how, and in what ways, these technologies are beneficial in the field; enumeration of the steps and requirements to implement these systems; and cost/benefit analyses that evaluate under what conditions the investments of time and resources are advisable to construct

  17. Electrochemical microfluidic chip based on molecular imprinting technique applied for therapeutic drug monitoring.

    Science.gov (United States)

    Liu, Jiang; Zhang, Yu; Jiang, Min; Tian, Liping; Sun, Shiguo; Zhao, Na; Zhao, Feilang; Li, Yingchun

    2017-05-15

    In this work, a novel electrochemical detection platform was established by integrating the molecular imprinting technique with a microfluidic chip, and applied to trace measurement of three therapeutic drugs. The chip foundation is an acrylic panel with designed grooves. In the detection cell of the chip, a Pt wire is used as both the counter and the reference electrode, and a Au-Ag alloy microwire (NPAMW) with a 3D nanoporous surface, modified with an electro-polymerized molecularly imprinted polymer (MIP) film, serves as the working electrode. Detailed characterization of the chip and the working electrode was performed, and the properties were explored by cyclic voltammetry and electrochemical impedance spectroscopy. Two methods, based respectively on electrochemical catalysis and the MIP gate effect, were employed for detecting warfarin sodium with the prepared chip. The linear range of the electrochemical catalysis method was 5×10⁻⁶-4×10⁻⁴ M, which fails to meet clinical testing demands. By contrast, the linear range of the gate-effect method was 2×10⁻¹¹-4×10⁻⁹ M, with a remarkably low detection limit of 8×10⁻¹² M (S/N = 3), which is able to satisfy clinical assay requirements. The system was then applied to 24-h monitoring of drug concentration in plasma after administration of warfarin sodium to a rabbit, and the corresponding pharmacokinetic parameters were obtained. In addition, the microfluidic chip was successfully adopted to analyze cyclophosphamide and carbamazepine, implying good versatility. It is expected that this novel electrochemical microfluidic chip can act as a promising format for point-of-care testing via monitoring different analytes sensitively and conveniently. Copyright © 2017 Elsevier B.V. All rights reserved.
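
    As a generic illustration of the S/N = 3 detection-limit criterion cited above (not the authors' own calculation), a common estimate divides three times the calibration scatter by the calibration slope; the concentration/current pairs below are invented.

      # Detection limit from a linear calibration using LOD = 3*sigma/slope.
      import numpy as np

      conc = np.array([2e-11, 5e-11, 1e-10, 5e-10, 1e-9, 4e-9])   # mol/L
      current = np.array([0.8, 1.9, 3.7, 18.2, 36.5, 146.0])      # nA, synthetic

      slope, intercept = np.polyfit(conc, current, 1)
      sigma = (current - (slope * conc + intercept)).std(ddof=2)
      print(f"LOD ~ {3 * sigma / slope:.2e} M")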

  18. Harmonic and applied analysis from groups to signals

    CERN Document Server

    Mari, Filippo; Grohs, Philipp; Labate, Demetrio

    2015-01-01

    This contributed volume explores the connection between the theoretical aspects of harmonic analysis and the construction of advanced multiscale representations that have emerged in signal and image processing. It highlights some of the most promising mathematical developments in harmonic analysis in the last decade brought about by the interplay among different areas of abstract and applied mathematics. This intertwining of ideas is considered starting from the theory of unitary group representations and leading to the construction of very efficient schemes for the analysis of multidimensional data. After an introductory chapter surveying the scientific significance of classical and more advanced multiscale methods, chapters cover such topics as An overview of Lie theory focused on common applications in signal analysis, including the wavelet representation of the affine group, the Schrödinger representation of the Heisenberg group, and the metaplectic representation of the symplectic group An introduction ...

  19. Applying Squeezing Technique to Clayrocks: Lessons Learned from Experiments at Mont Terri Rock Laboratory

    International Nuclear Information System (INIS)

    Fernandez, A. M.; Sanchez-Ledesma, D. M.; Tournassat, C.; Melon, A.; Gaucher, E.; Astudillo, E.; Vinsot, A.

    2013-01-01

    Knowledge of the pore water chemistry in clay rock formations plays an important role in determining radionuclide migration in the context of nuclear waste disposal. Among the different in situ and ex situ techniques for pore water sampling in clay sediments and soils, the squeezing technique dates back 115 years. Although different studies have been performed on the reliability and representativeness of squeezed pore waters, most of them dealt with high-porosity, high-water-content and unconsolidated clay sediments. Very few tackled the analysis of squeezed pore water from low-porosity, low-water-content and highly consolidated clay rocks. In this work, a specially designed and fabricated one-dimensional compression cell with two-directional fluid flow was used to extract and analyse the pore water composition of Opalinus Clay core samples from Mont Terri (Switzerland). The reproducibility of the technique is good, and no ionic ultrafiltration, chemical fractionation or anion exclusion was found in the range of pressures analysed: 70-200 MPa. Pore waters extracted in this range of pressures do not decrease in concentration, which would otherwise indicate dilution by mixing of the free pore water with the outer layers of double-layer water (Donnan water). A threshold (safety) squeezing pressure of 175 MPa was established to avoid membrane effects (ion filtering, anion exclusion, etc.) from clay particles induced by increasing pressures. Moreover, the pore waters extracted at these pressures are representative of the Opalinus Clay formation, based on a direct comparison against in situ collected borehole waters. (Author)

  20. Applied Protein and Molecular Techniques for Characterization of B Cell Neoplasms in Horses

    Science.gov (United States)

    Badial, Peres R.; Tallmadge, Rebecca L.; Miller, Steven; Stokol, Tracy; Richards, Kristy; Borges, Alexandre S.

    2015-01-01

    Mature B cell neoplasms cover a spectrum of diseases involving lymphoid tissues (lymphoma) or blood (leukemia), with an overlap between these two presentations. Previous studies describing equine lymphoid neoplasias have not included analyses of clonality using molecular techniques. The objective of this study was to use molecular techniques to advance the classification of B cell lymphoproliferative diseases in five adult equine patients with a rare condition of monoclonal gammopathy, B cell leukemia, and concurrent lymphadenopathy (lymphoma/leukemia). The B cell neoplasms were phenotypically characterized by gene and cell surface molecule expression, secreted immunoglobulin (Ig) isotype concentrations, Ig heavy-chain variable (IGHV) region domain sequencing, and spectratyping. All five patients had hyperglobulinemia due to IgG1 or IgG4/7 monoclonal gammopathy. Peripheral blood leukocyte immunophenotyping revealed high proportions of IgG1- or IgG4/7-positive cells and relative T cell lymphopenia. Most leukemic cells lacked the surface B cell markers CD19 and CD21. IGHG1 or IGHG4/7 gene expression was consistent with surface protein expression, and secreted isotype and Ig spectratyping revealed one dominant monoclonal peak. The mRNA expression of the B cell-associated developmental genes EBF1, PAX5, and CD19 was high compared to that of the plasma cell-associated marker CD38. Sequence analysis of the IGHV domain of leukemic cells revealed mutated Igs. In conclusion, the protein and molecular techniques used in this study identified neoplastic cells compatible with a developmental transition between B cell and plasma cell stages, and they can be used for the classification of equine B cell lymphoproliferative disease. PMID:26311245

  1. Applying Squeezing Technique to Clayrocks: Lessons Learned from Experiments at Mont Terri Rock Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, A. M.; Sanchez-Ledesma, D. M.; Tournassat, C.; Melon, A.; Gaucher, E.; Astudillo, E.; Vinsot, A.

    2013-07-01

    Knowledge of the pore water chemistry in clay rock formations plays an important role in determining radionuclide migration in the context of nuclear waste disposal. Among the different in situ and ex situ techniques for pore water sampling in clay sediments and soils, the squeezing technique dates back 115 years. Although different studies have been performed on the reliability and representativeness of squeezed pore waters, most of them dealt with high-porosity, high-water-content and unconsolidated clay sediments. Very few tackled the analysis of squeezed pore water from low-porosity, low-water-content and highly consolidated clay rocks. In this work, a specially designed and fabricated one-dimensional compression cell with two-directional fluid flow was used to extract and analyse the pore water composition of Opalinus Clay core samples from Mont Terri (Switzerland). The reproducibility of the technique is good, and no ionic ultrafiltration, chemical fractionation or anion exclusion was found in the range of pressures analysed: 70-200 MPa. Pore waters extracted in this range of pressures do not decrease in concentration, which would otherwise indicate dilution by mixing of the free pore water with the outer layers of double-layer water (Donnan water). A threshold (safety) squeezing pressure of 175 MPa was established to avoid membrane effects (ion filtering, anion exclusion, etc.) from clay particles induced by increasing pressures. Moreover, the pore waters extracted at these pressures are representative of the Opalinus Clay formation, based on a direct comparison against in situ collected borehole waters. (Author)

  2. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing 'always connected' world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations, such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the influence of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order-picking processes. The third part of the dissertation proposes a method to classify cities

  3. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have in the past relied on mathematical analysis that was capable of phenomenological prediction but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained using established equations of transport phenomena. Then, microscopic- and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory, and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.

  4. The simulation of Typhoon-induced coastal inundation in Busan, South Korea applying the downscaling technique

    Science.gov (United States)

    Jang, Dongmin; Park, Junghyun; Yuk, Jin-Hee; Joh, MinSu

    2017-04-01

    Due to typhoons, the south coastal cities of South Korea, including Busan, are very vulnerable to surge, waves and the corresponding coastal inundation, and are affected every year. In 2016, South Korea suffered tremendous damage from typhoon 'Chaba', which developed to the east-northeast of Guam on Sep. 28 and had a maximum 10-minute sustained wind speed of about 50 m/s, a 1-minute sustained wind speed of 75 m/s and a minimum central pressure of 905 hPa. As 'Chaba', the strongest typhoon since 'Maemi' in 2003, hit South Korea on Oct. 5, it caused massive economic and casualty damage to Ulsan, Gyeongju and Busan. In particular, the damage from typhoon-induced coastal inundation in Busan, where many high-rise buildings and residential areas are concentrated near the coast, was serious. The coastal inundation could be affected more strongly by wind-induced waves than by surge. In fact, it was observed that the surge height was about 1 m on average and the significant wave height was about 8 m in the coastal sea near Busan on Oct. 5 due to 'Chaba'. Even though the typhoon-induced surge elevated the sea level, the typhoon-induced long-period waves, with wave periods of more than 15 s, could play a more important role in the inundation. The present work simulated the coastal inundation induced by 'Chaba' in Busan, South Korea, considering the effects of typhoon-induced surge and waves. For the 'Chaba' hindcast, the high-resolution Weather Research and Forecasting model (WRF) was applied using reanalysis data produced by NCEP (FNL, 0.25 degree) for the boundary and initial conditions, and was validated against observations of wind speed, direction and pressure. The typhoon-induced coastal inundation was simulated by an unstructured grid model, the Finite Volume Community Ocean Model (FVCOM), which is a fully current-wave coupled model. To simulate the wave-induced inundation, a one-way downscaling technique over multiple domains was applied. Firstly, a mother domain including the Korean peninsula was

  5. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents

    International Nuclear Information System (INIS)

    Teichgräber, Ulf K.; Bucourt, Maximilian de

    2012-01-01

    Objectives: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). Materials and methods: The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current status VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. Results: The current state VSM demonstrated that out of 13 processes for the procurement of stents only 2 processes were value-adding. Out of the NVA processes 5 processes were unnecessary NVA activities, which could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. Conclusion: VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System and greatly assists in successfully implementing a Lean system.

  6. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents.

    Science.gov (United States)

    Teichgräber, Ulf K; de Bucourt, Maximilian

    2012-01-01

    OBJECTIVES: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current status VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. The current state VSM demonstrated that out of 13 processes for the procurement of stents only 2 processes were value-adding. Out of the NVA processes 5 processes were unnecessary NVA activities, which could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System and greatly assists in successfully implementing a Lean system. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Advanced examination techniques applied to the qualification of critical welds for the ITER correction coils

    CERN Document Server

    Sgobba, Stefano; Libeyre, Paul; Marcinek, Dawid Jaroslaw; Piguiet, Aline; Cécillon, Alexandre

    2015-01-01

    The ITER correction coils (CCs) consist of three sets of six coils located in between the toroidal (TF) and poloidal field (PF) magnets. The CCs rely on a Cable-in-Conduit Conductor (CICC), whose supercritical cooling at 4.5 K is provided by helium inlets and outlets. The assembly of the nozzles to the stainless steel conductor conduit includes fillet welds requiring full penetration through the thickness of the nozzle. Static and cyclic stresses have to be sustained by the inlet welds during operation. The entire volume of the helium inlet and outlet welds, which are subject to the most stringent quality levels for imperfections according to the standards in force, is virtually uninspectable with sufficient resolution by conventional or computed radiography or by ultrasonic testing. On the other hand, X-ray computed tomography (CT) was successfully applied to inspect the full weld volume of several dozen helium inlet qualification samples. The extensive use of CT techniques allowed significant progress in the ...

  8. The Study of Mining Activities and their Influences in the Almaden Region Applying Remote Sensing Techniques

    International Nuclear Information System (INIS)

    Rico, C.; Schmid, T.; Millan, R.; Gumuzzio, J.

    2010-01-01

    This scientific-technical report is part of ongoing research work carried out by Celia Rico Fraile in order to obtain the Diploma of Advanced Studies as part of her PhD studies. This work has been developed in collaboration with the Faculty of Science at the Universidad Autonoma de Madrid and the Department of Environment at CIEMAT. The main objective of this work was the characterization and classification of land use in Almaden (Ciudad Real) during cinnabar mineral exploitation and after mining activities ceased in 2002, developing a methodology focused on the integration of remote sensing techniques applied to multispectral and hyperspectral satellite data. By means of preprocessing and processing of the satellite images, as well as data obtained from field campaigns, a spectral library was compiled in order to characterize representative land surfaces within the study area. Monitoring results show that the extent of areas affected by mining activities has diminished rapidly in recent years. (Author) 130 refs
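
    The report's processing chain is not detailed in this excerpt; as a small, hypothetical example of the kind of multispectral operation involved in characterizing land cover, the sketch below computes the normalised difference vegetation index (NDVI) from red and near-infrared bands and thresholds it.

      # NDVI from red and near-infrared bands; 3x3 arrays stand in for imagery.
      import numpy as np

      red = np.array([[0.10, 0.12, 0.30],
                      [0.11, 0.40, 0.35],
                      [0.09, 0.10, 0.28]])
      nir = np.array([[0.60, 0.58, 0.32],
                      [0.62, 0.42, 0.36],
                      [0.65, 0.61, 0.30]])

      ndvi = (nir - red) / (nir + red)
      vegetated = ndvi > 0.4   # crude threshold classification
      print(ndvi.round(2))
      print(vegetated)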

  9. Applying Toyota production system techniques for medication delivery: improving hospital safety and efficiency.

    Science.gov (United States)

    Newell, Terry L; Steinmetz-Malato, Laura L; Van Dyke, Deborah L

    2011-01-01

    The inpatient medication delivery system used at a large regional acute care hospital in the Midwest had become antiquated and inefficient. The existing 24-hr medication cart-fill exchange process with delivery to the patients' bedside did not always provide ordered medications to the nursing units when they were needed. In 2007 the principles of the Toyota Production System (TPS) were applied to the system. Project objectives were to improve medication safety and reduce the time needed for nurses to retrieve patient medications. A multidisciplinary team was formed that included representatives from nursing, pharmacy, informatics, quality, and various operational support departments. Team members were educated and trained in the tools and techniques of TPS, and then designed and implemented a new pull system benchmarking the TPS Ideal State model. The newly installed process, providing just-in-time medication availability, has measurably improved delivery processes as well as patient safety and satisfaction. Other positive outcomes have included improved nursing satisfaction, reduced nursing wait time for delivered medications, and improved efficiency in the pharmacy. After a successful pilot on two nursing units, the system is being extended to the rest of the hospital. © 2010 National Association for Healthcare Quality.

  10. Formulation of Indomethacin Colon Targeted Delivery Systems Using Polysaccharides as Carriers by Applying Liquisolid Technique

    Directory of Open Access Journals (Sweden)

    Kadria A. Elkhodairy

    2014-01-01

    Full Text Available The present study aimed at the formulation of matrix tablets for a colon-specific drug delivery (CSDD) system of indomethacin (IDM) by applying the liquisolid (LS) technique. A CSDD system based on time-dependent polymethacrylates and enzyme-degradable polysaccharides was established. Eudragit RL 100 (E-RL 100) was employed as the time-dependent polymer, whereas bacterially degradable polysaccharides were presented as LS systems loaded with the drug. Indomethacin-loaded LS systems were prepared using different polysaccharides, namely guar gum (GG), pectin (PEC), and chitosan (CH), as carriers separately or in mixtures at ratios of 1:3, 1:1, and 3:1. Liquisolid systems that displayed promising results concerning drug release rate at both pH 1.2 and pH 6.8 were compressed into tablets after the addition of the calculated amount of E-RL 100 and lubrication with magnesium stearate and talc in the ratio of 1:9. It was found that E-RL 100 improved the flowability and compressibility of all LS formulations. The release data revealed that all formulations succeeded in sustaining drug release over a period of 24 hours. A stability study indicated that the PEC-based LS system, as well as its matrix tablets, was stable over the period of storage (one year) and could provide a minimum shelf life of two years.

  11. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    Science.gov (United States)

    Noordman, Janneke; van Lee, Inge; Nielen, Mark; Vlek, Hans; van Weijden, Trudy; van Dulmen, Sandra

    2012-12-01

    Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice nurses' application of motivational interviewing in real-life primary care consultations was examined. Furthermore, we explored if (and to what extent) practice nurses adjust their motivational interviewing skills to primary versus secondary prevention. Thirteen Dutch practice nurses trained in motivational interviewing, from four general practices, participated, along with 117 adult patients visiting the practice nurse; 117 practice nurse-patient consultations between June and December 2010 were videotaped. Motivational interviewing skills were rated by two observers using the Behaviour Change Counselling Index (BECCI). Data were analyzed using multilevel regression. Practice nurses use motivational interviewing techniques to some extent. Substantial variation was found between motivational interviewing items. No significant differences in the use of motivational interviewing between primary and secondary prevention were found. Motivational interviewing skills are not easily applicable in routine practice. Health care providers who want to acquire motivational interviewing skills should follow booster sessions after the first training. The training could be strengthened by video feedback and feedback based on participating observation. A possible explanation for the lack of differences between the two types of prevention consultations may be that the gain from helping patients in primary prevention by preventing complications equals the necessity of keeping the disease from aggravating in secondary prevention.

  12. Inverting travel times with a triplication. [spline fitting technique applied to lunar seismic data reduction

    Science.gov (United States)

    Jarosch, H. S.

    1982-01-01

    A method based on the use of constrained spline fits is used to overcome the difficulties arising when body-wave data in the form of T-delta are reduced to the tau-p form in the presence of cusps. In comparison with unconstrained spline fits, the method proposed here tends to produce much smoother models which lie approximately in the middle of the bounds produced by the extremal method. The method is noniterative and, therefore, computationally efficient. The method is applied to the lunar seismic data, where at least one triplication is presumed to occur in the P-wave travel-time curve. It is shown, however, that because of an insufficient number of data points for events close to the antipode of the center of the lunar network, the present analysis is not accurate enough to resolve the problem of a possible lunar core.
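
    The paper's constrained spline formulation is not reproduced here; as a simpler sketch of spline-based travel-time smoothing, the snippet below fits an (unconstrained) smoothing spline to synthetic T-delta data with SciPy and differentiates it to obtain the ray parameter p = dT/d(delta).

      # Smoothing-spline fit to synthetic travel-time data, then p = dT/d(delta).
      import numpy as np
      from scipy.interpolate import UnivariateSpline

      rng = np.random.default_rng(0)
      delta = np.linspace(1, 90, 40)                   # epicentral distance, deg
      t_obs = 10 * np.sqrt(delta) + rng.normal(0, 0.3, delta.size)

      spline = UnivariateSpline(delta, t_obs, k=3, s=delta.size * 0.09)
      p40 = float(spline.derivative()(40.0))           # slowness at 40 deg
      print(f"T(40 deg) ~ {float(spline(40.0)):.2f} s, p ~ {p40:.3f} s/deg")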

  13. Diffusing wave spectroscopy applied to material analysis and process control

    International Nuclear Information System (INIS)

    Lloyd, Christopher James

    1997-01-01

    Diffusing Wave Spectroscopy (DWS) was studied as a method of laboratory analysis of sub-micron particles, and developed as a prospective in-line, industrial, process control sensor, capable of near real-time feedback. No sample pre-treatment was required and measurement was via a non-invasive, flexible, dip-in probe. DWS relies on the concept of the diffusive migration of light, as opposed to the ballistic scatter model used in conventional dynamic light scattering. The specific requirements of the optoelectronic hardware, data analysis methods and light scattering model were studied experimentally and, where practical, theoretically, resulting in a novel technique for the analysis of particle suspensions and emulsions of volume fractions between 0.01 and 0.4. Operation at high concentrations made the technique insensitive to dust and contamination. The pure homodyne (autodyne) experimental arrangement described was resilient to environmental disturbances, unlike many other systems which utilise optical fibres or heterodyne operation. Pilot and subsequent prototype development led to a highly accurate method of size ranking, suitable for analysis of a wide range of suspensions and emulsions. The technique was shown to operate on real industrial samples with statistical variance as low as 0.3% with minimal software processing. Whilst the application studied was the analysis of TiO₂ suspensions, a diverse range of materials including polystyrene beads, cell pastes and industrial cutting fluid emulsions were tested. Results suggest that, whilst all sizing should be comparative to suitable standards, concentration effects may be minimised and even completely modelled out in many applications. Adhesion to the optical probe was initially a significant problem but was minimised after the evaluation and use of suitable non-stick coating materials. Unexpected behaviour in the correlation in the region of short decay times led to consideration of the effects of rotational diffusion

  14. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of the target individual compounds (>50 μgC). Yields of target compounds were high for compounds from C14 n-alkanes up to C40, and approximately 80% for higher molecular weight compounds above C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, like compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in the marine system. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, which showed its potential as a chronology tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where amounts of planktonic foraminifera sufficient for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  15. Quantitative analysis of caffeine applied to pharmaceutical industry

    Science.gov (United States)

    Baucells, M.; Ferrer, N.; Gómez, P.; Lacort, G.; Roura, M.

    1993-03-01

    The direct determination of some compounds like caffeine in pharmaceutical samples, without sample pretreatment and without separation of these compounds from the matrix (acetylsalicylic acid, paracetamol, …), is very worthwhile. It enables analysis to be performed quickly and without the problems associated with sample manipulation. The samples were diluted directly in KBr powder. We used both diffuse reflectance (DRIFT) and transmission techniques in order to measure the intensity of the caffeine peaks in the pharmaceutical matrix. Limits of detection and determination, relative standard deviation and recovery, using caffeine in the same matrix as in the pharmaceutical product, are reported. Two methods for the quantification of caffeine were used: the calibration line and standard addition techniques.
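
    As a sketch of the standard addition technique mentioned above (numbers invented, not the paper's data), the signal is regressed on the added caffeine concentration and the unknown content is recovered from the x-intercept.

      # Standard-addition quantification: extrapolate the calibration line
      # to the x-axis; the unknown equals the magnitude of the x-intercept.
      import numpy as np

      added = np.array([0.0, 5.0, 10.0, 15.0])      # mg/L caffeine added
      signal = np.array([0.42, 0.63, 0.85, 1.06])   # absorbance, synthetic

      slope, intercept = np.polyfit(added, signal, 1)
      print(f"estimated caffeine in sample: {intercept / slope:.2f} mg/L")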

  16. Analysis of Brick Masonry Wall using Applied Element Method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the Finite Element Method (FEM). In AEM, however, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. Brick masonry walls can be effectively analyzed in the framework of AEM. The composite nature of a masonry wall can be easily modelled using springs: the brick springs and mortar springs are assumed to be connected in series. The brick masonry wall is analyzed and the failure load determined for different loading cases. The results were used to find the aspect ratio of brick that best strengthens a brick masonry wall.

  17. Analysis of concrete beams using applied element method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements, similar to FEM; but in AEM, elements are connected by springs instead of nodes. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate the application of AEM, it has been used to analyse a plain concrete beam with fixed support conditions. The analysis is limited to 2-dimensional structures. It was found that the number of springs does not have much influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.
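
    As a hedged sketch of the spring idealization used in both AEM entries above (following the commonly cited AEM stiffness expressions of Meguro and Tagel-Din; the numbers are illustrative, not taken from the papers), each pair of element faces is joined by normal and shear springs whose stiffness depends on the tributary area d*T and the distance a between element centroids.

      # Normal and shear spring stiffnesses for one interface spring in AEM.
      E = 30e9                  # Young's modulus of concrete, Pa (assumed)
      nu = 0.2                  # Poisson's ratio (assumed)
      G = E / (2 * (1 + nu))    # shear modulus, Pa

      a = 0.10                  # distance between element centroids, m
      T = 0.20                  # element thickness, m
      d = 0.10 / 10             # face length / springs per face, m

      k_normal = E * d * T / a
      k_shear = G * d * T / a
      print(f"k_n = {k_normal:.3e} N/m, k_s = {k_shear:.3e} N/m")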

  18. Hurdles run technique analysis in the 400m hurdles

    OpenAIRE

    Drtina, Martin

    2010-01-01

    Hurdles run technique analysis in the 400 m hurdles. Thesis objectives: The main objective is to compare hurdle-clearance technique at race tempo over the 400 m hurdles for the selected probands. The tasks are to determine kinematic parameters separately for each proband and to identify weaknesses in their technique. Method: The analysis of hurdle-clearance technique was done using 3D kinematic analysis. The observed space-time events were recorded with two digital cameras. The records were transferred to a suitable...

  19. Application of Microfluidic Techniques to Pyrochemical Salt Sampling and Analysis

    International Nuclear Information System (INIS)

    Pereira, C.; Launiere, C.; Smith, N.

    2015-01-01

    Microfluidic techniques enable the production of micro-samples of molten salt for analysis by at-line and off-line sensors and detectors. These sampling systems are intended for implementation in an electrochemical used-fuel treatment facility as part of the material balance and control system. Microfluidics may reduce the random statistical error associated with sampling inhomogeneity, because a large number of uniform sub-microlitre droplets may be generated and successively analyzed. The approach combines two immiscible fluids in a microchannel under laminar flow conditions to generate slug flows. Because the slug flow regime is characterized by regularly sized and spaced droplets, it is commonly used in low-volume/high-throughput assays of aqueous and organic phases. This scheme is now being applied to high-temperature molten salts in combination with a second fluid that is stable at elevated temperatures. The microchip systems are being tested to determine the channel geometries and the absolute and relative phase flow rates required to achieve stable slug flow. Because imaging is difficult at the 500 °C process temperatures, the fluorescence of salt ions under ultraviolet illumination is used to discern flow regimes. As molten chloride melts are optically transparent, UV-visible light spectroscopy is also being explored as a spectroscopic technique for integration with at-line microchannel systems, to overcome some of the current challenges to in situ analysis. A second technique that is amenable to droplet analysis is laser-induced breakdown spectroscopy (LIBS). A pneumatic droplet generator is being interfaced with a LIBS system for analysis of molten salts at near-process temperatures. Tests of the pneumatic generator are being run using water and molten salts, and in tandem with off-line analysis of the salt droplets with a LIBS spectrometer. (author)

  20. Evaluation of bitterness in white wine applying descriptive analysis, time-intensity analysis, and temporal dominance of sensations analysis.

    Science.gov (United States)

    Sokolowsky, Martina; Fischer, Ulrich

    2012-06-30

    Bitterness in wine, especially in white wine, is a complex and sensitive topic, as it is a persistent sensation with negative connotations for consumers. However, the molecular basis for bitter taste in white wines is still widely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is static measurement among other attributes during a descriptive analysis. A less frequently applied method, time-intensity analysis, evaluates the temporal gustatory changes, focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from the bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance differentiated the wines significantly with regard to all measured bitterness parameters obtained from the three sensory techniques. Comparing the information from all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine. Copyright © 2011 Elsevier B.V. All rights reserved.
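
    As an illustration of the parameter extraction step described above (the curve below is synthetic, not panel data), typical time-intensity parameters are the maximum intensity, the time to maximum, and the area under the curve.

      # Extracting Imax, Tmax and AUC from a (synthetic) time-intensity curve.
      import numpy as np

      t = np.linspace(0, 60, 121)                     # seconds
      intensity = 8 * np.exp(-((t - 15) / 12) ** 2)   # synthetic bitterness curve

      i_max = intensity.max()
      t_max = t[intensity.argmax()]
      auc = np.trapz(intensity, t)
      print(f"Imax = {i_max:.1f}, Tmax = {t_max:.1f} s, AUC = {auc:.1f}")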

  1. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn which sensitivity analysis techniques are most suitable for models of human behavior, several promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods produce similar results and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
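
    The report's methods are not listed in this excerpt; as a minimal sketch of one global approach in this spirit, the snippet below samples uncertain inputs, runs a stand-in model, and ranks inputs by standardized regression coefficients.

      # Global sensitivity sketch: standardized regression coefficients (SRCs).
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, size=(5000, 3))   # three uncertain inputs

      # Toy nonlinear model with an interaction term (stand-in, not Sandia's).
      Y = np.sin(X[:, 0]) + 2.0 * X[:, 1] ** 2 + X[:, 0] * X[:, 2]

      Xs = (X - X.mean(0)) / X.std(0)
      Ys = (Y - Y.mean()) / Y.std()
      src, *_ = np.linalg.lstsq(Xs, Ys, rcond=None)
      print("standardized regression coefficients:", src.round(3))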

  2. Investigation about the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils

    Directory of Open Access Journals (Sweden)

    Adriano Pinto Mariano

    2009-10-01

    Full Text Available This work investigated the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils collected at three service stations. Batch biodegradation experiments were carried out in Bartha biometer flasks (250 mL used to measure the microbial CO2 production. Biodegradation efficiency was also measured by quantifying the concentration of hydrocarbons. In addition to the biodegradation experiments, the capability of the studied cultures and the native microorganisms to biodegrade the diesel oil purchased from a local service station, was verified using a technique based on the redox indicator 2,6 -dichlorophenol indophenol (DCPIP. Results obtained with this test showed that the inocula used in the biodegradation experiments were able to degrade the diesel oil and the tests carried out with the native microorganisms indicated that these soils had a microbiota adapted to degrade the hydrocarbons. In general, no gain was obtained with the addition of microorganisms or even negative effects were observed in the biodegradation experiments.Este trabalho investigou a eficiência da técnica do bioaumento quando aplicada a solos contaminados com óleo diesel coletados em três postos de combustíveis. Experimentos de biodegradação foram realizados em frascos de Bartha (250 mL, usados para medir a produção microbiana de CO2. A eficiência de biodegradação também foi quantificada pela concentração de hidrocarbonetos. Conjuntamente aos experimentos de biodegradação, a capacidade das culturas estudadas e dos microrganismos nativos em biodegradar óleo diesel comprado de um posto de combustíveis local, foi verificada utilizando-se a técnica baseada no indicador redox 2,6 - diclorofenol indofenol (DCPIP. Resultados obtidos com esse teste mostraram que os inóculos empregados nos experimentos de biodegradação foram capazes de biodegradar óleo diesel e os testes com os microrganismos nativos indicaram que estes solos

  3. A dynamic mechanical analysis technique for porous media.

    Science.gov (United States)

    Pattison, Adam Jeffry; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-02-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied, and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite-element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a nonlinear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate it by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested at 1-14 Hz, with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case and nearly the same at frequency, with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high-quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
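
    For comparison with the FE inversion described above, the classical viscoelastic DMA reduction can be sketched as follows: demodulate sinusoidal stress and strain to get amplitudes and the phase lag, then form storage and loss moduli. The signals below are synthetic.

      # Classical DMA reduction: storage (E') and loss (E'') moduli.
      import numpy as np

      f = 5.0                                             # test frequency, Hz
      t = np.linspace(0, 2, 2000, endpoint=False)         # 10 full periods
      strain = 0.01 * np.sin(2 * np.pi * f * t)
      stress = 800.0 * np.sin(2 * np.pi * f * t + 0.15)   # Pa, 0.15 rad lag

      def amp_phase(sig):
          # Demodulate against sin/cos over an integer number of periods.
          s = 2 * np.mean(sig * np.sin(2 * np.pi * f * t))
          c = 2 * np.mean(sig * np.cos(2 * np.pi * f * t))
          return np.hypot(s, c), np.arctan2(c, s)

      (a_eps, p_eps), (a_sig, p_sig) = amp_phase(strain), amp_phase(stress)
      E_star, delta = a_sig / a_eps, p_sig - p_eps
      print(f"E' = {E_star * np.cos(delta):.0f} Pa, "
            f"E'' = {E_star * np.sin(delta):.0f} Pa")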

  4. A comparison of new, old and future densiometric techniques as applied to volcanologic study.

    Science.gov (United States)

    Pankhurst, Matthew; Moreland, William; Dobson, Kate; Þórðarson, Þorvaldur; Fitton, Godfrey; Lee, Peter

    2015-04-01

    The density of any material imposes a primary control upon its potential or actual physical behaviour in relation to its surrounds. It follows that a thorough understanding of the physical behaviour of dynamic, multi-component systems, such as active volcanoes, requires knowledge of the density of each component. If we are to accurately predict the physical behaviour of synthesized or natural volcanic systems, quantitative densiometric measurements are vital. The theoretical density of melt, crystals and bubble phases may be calculated using composition, structure, temperature and pressure inputs. However, measuring the density of natural, non-ideal, poly-phase materials remains problematic, especially if phase-specific measurement is important. Here we compare three methods, Archimedes' principle, He-displacement pycnometry and X-ray micro computed tomography (XMT), and discuss the utility and drawbacks of each in the context of modern volcanologic study. We have measured tephra, ash and lava from the 934 AD Eldgjá eruption (Iceland) and the 2010 AD Eyjafjallajökull eruption (Iceland) using each technique. These samples exhibit a range of particle sizes, phases and textures. We find that while the Archimedes method remains a useful, low-cost technique to generate whole-rock density data, relative precision is problematic at small particle sizes. Pycnometry offers a more precise whole-rock density value, at a comparable cost per sample. However, this technique is based upon the assumption that pore spaces within the sample are equally available for gas exchange, which may or may not be the case. XMT produces 3D images, at resolutions from nm to tens of µm per voxel, in which X-ray attenuation is a qualitative measure of relative electron density, expressed as greyscale number/brightness (usually 16-bit). Phases and individual particles can be digitally segmented according to their greyscale and other characteristics. This represents a distinct advantage over both
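
    For reference, the Archimedes calculation referred to above reduces to a few lines; the masses are illustrative.

      # Bulk density by Archimedes' principle: dry mass and apparent mass in water.
      rho_water = 998.0         # kg/m^3 near 20 degC

      m_dry = 12.35e-3          # kg, mass in air
      m_submerged = 7.10e-3     # kg, apparent mass in water

      volume = (m_dry - m_submerged) / rho_water
      print(f"bulk density ~ {m_dry / volume:.0f} kg/m^3")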

  5. Applied Hierarchical Cluster Analysis with the Average Linkage Algorithm

    Directory of Open Access Journals (Sweden)

    Cindy Cahyaning Astuti

    2017-11-01

    Full Text Available This research was conducted in Sidoarjo District, where the data used were secondary data contained in the book "Kabupaten Sidoarjo Dalam Angka 2016". In this research the authors chose 12 variables that can represent sub-district characteristics in Sidoarjo. The variables representing the characteristics of the sub-districts cover four sectors, namely geography, education, agriculture and industry. To assess how evenly geographical conditions, education, agriculture and industry are distributed across the sub-districts, an analysis is required to classify sub-districts based on their characteristics. Hierarchical cluster analysis is the analytical technique used to classify or categorize each case into relatively homogeneous groups expressed as clusters. The results are expected to provide information about dominant and non-dominant sub-district characteristics in the four sectors, based on the clusters formed.
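
    As a minimal sketch of the method (the 12-variable sub-district table is replaced here by a small synthetic feature matrix), average-linkage hierarchical clustering is available directly in SciPy.

      # Average-linkage hierarchical clustering with SciPy.
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      rng = np.random.default_rng(1)
      features = np.vstack([rng.normal(0, 1, (6, 4)),    # one group of "sub-districts"
                            rng.normal(4, 1, (6, 4))])   # a second, distinct group

      Z = linkage(features, method="average", metric="euclidean")
      print(fcluster(Z, t=2, criterion="maxclust"))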

  6. LAMQS analysis applied to ancient Egyptian bronze coins

    International Nuclear Information System (INIS)

    Torrisi, L.; Caridi, F.; Giuffrida, L.; Torrisi, A.; Mondio, G.; Serafino, T.; Caltabiano, M.; Castrizio, E.D.; Paniz, E.; Salici, A.

    2010-01-01

    Some Egyptian bronze coins, dated to the 6th-7th century A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and type of manufacture. The investigations have been performed using micro-invasive analyses, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences have been evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  7. Imaging techniques applied to quality control of civil manufactured goods obtained starting from ready-to-use mixtures

    Science.gov (United States)

    Bonifazi, Giuseppe; Castaldi, Federica

    2003-05-01

    Concrete materials obtained from the utilization of pre-mixed, ready-to-use products (central-mix concrete) are used more and more. They represent a big portion of the civil construction market. Such products are used at different scales, ranging from small-scale works, such as those commonly realized inside a house, an apartment, etc., to big civil or industrial works. In both cases the mixtures and the final work are usually controlled through the analysis of properly collected samples. Through appropriate sampling, objective parameters can be derived, such as the size class distribution and composition of the constituent particulate matter, or the mechanical characteristics of the sample itself. An important parameter not considered by the previously mentioned approach is "segregation", that is, the tendency of some particulate materials to migrate preferentially into certain zones of the mixture and/or of the final product. Such behavior dramatically influences the quality of the product and of the final manufactured good. At present this behavior is only studied using a human-based visual approach; no repeatable analytical procedures or quantitative data processing exist. In this paper a procedure fully based on image processing techniques is described and applied. Results are presented and analyzed with reference to industrial products. A comparison is also made between the newly proposed digital-imaging-based techniques and the analyses usually carried out at the industrial laboratory scale for standard quality control.
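
    The paper's procedure is not reproduced here; as a hypothetical sketch of an image-processing step of this kind, the snippet below thresholds a grey-level image with scikit-image, labels connected particles, and reports their areas, from which zone-wise size statistics (and hence segregation) could be quantified.

      # Threshold, label and measure "particles" in a synthetic grey-level image.
      import numpy as np
      from skimage import filters, measure

      rng = np.random.default_rng(2)
      img = rng.normal(0.2, 0.05, (128, 128))   # background
      img[20:40, 20:40] += 0.6                  # two bright "particles"
      img[80:95, 60:90] += 0.6

      labels = measure.label(img > filters.threshold_otsu(img))
      areas = sorted(r.area for r in measure.regionprops(labels))
      print(f"{labels.max()} particles, areas (px): {areas}")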

  8. Second law analysis and simulation techniques for the energy optimization of buildings

    OpenAIRE

    Terlizzese, Tiziano

    2010-01-01

    The research activity described in this thesis focuses mainly on the study of finite-element techniques applied to thermo-fluid dynamic problems of plant components, and on the study of dynamic simulation techniques applied to integrated building design in order to enhance the energy performance of the building. The first part of this doctoral thesis is a broad dissertation on second law analysis of thermodynamic processes, with the purpose of including the issue of the energy efficiency of...

  9. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Baek, Sung Ryel; Kim, Young Gi; Jung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun; Lim, Jong Myoung

    2003-05-01

    The aims of this project are to establish a quality control system for Neutron Activation Analysis (NAA), in response to increasing industrial demand for standard analytical methods, and to prepare and validate standard operating procedures for NAA through practical testing of different analytical items. The R and D implementation of the analytical quality system, using the neutron irradiation facility and gamma-ray measurement system and the automation of the NAA facility in the HANARO research reactor, is as follows: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrial applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  10. Statistical analysis and Kalman filtering applied to nuclear materials accountancy

    International Nuclear Information System (INIS)

    Annibal, P.S.

    1990-08-01

    Much theoretical research has been carried out on the development of statistical methods for nuclear material accountancy. In practice, physical, financial and time constraints mean that the techniques must be adapted to give an optimal performance in plant conditions. This thesis aims to bridge the gap between theory and practice, to show the benefits to be gained from a knowledge of the facility operation. Four different aspects are considered; firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an 'accountancy tank' is investigated. Secondly, an analysis of the calibration data for the same tank is presented, establishing bounds for the error and suggesting a means of reducing them. Thirdly, a plant-specific method of producing an optimal statistic from the input, output and inventory data, to help decide between 'material loss' and 'no loss' hypotheses, is developed and compared with existing general techniques. Finally, an application of the Kalman Filter to materials accountancy is developed, to demonstrate the advantages of state-estimation techniques. The results of the analyses and comparisons illustrate the importance of taking into account a complete and accurate knowledge of the plant operation, measurement system, and calibration methods, to derive meaningful results from statistical tests on materials accountancy data, and to give a better understanding of critical random and systematic error sources. The analyses were carried out on the head-end of the Fast Reactor Reprocessing Plant, where fuel from the prototype fast reactor is cut up and dissolved. However, the techniques described are general in their application. (author)
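
    As a minimal sketch of the state-estimation idea in the final aspect (a scalar filter with invented variances, not the thesis' plant-specific model), the state is the true inventory, propagated by declared transfers and updated with noisy measurements.

      # Scalar Kalman filter over inventory periods.
      import numpy as np

      rng = np.random.default_rng(3)
      true_inv, x_est, p_est = 100.0, 100.0, 4.0
      q, r = 0.5, 2.0                       # process / measurement variances

      for _ in range(20):
          transfer = rng.normal(0.0, 0.5)           # net declared transfer
          true_inv += transfer
          z = true_inv + rng.normal(0, np.sqrt(r))  # measured inventory

          x_pred, p_pred = x_est + transfer, p_est + q   # predict
          gain = p_pred / (p_pred + r)                   # update
          x_est = x_pred + gain * (z - x_pred)
          p_est = (1 - gain) * p_pred

      print(f"estimate {x_est:.2f} vs true {true_inv:.2f} (var {p_est:.2f})")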

  11. Cochlear implant simulator for surgical technique analysis

    Science.gov (United States)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  12. Laser granulometry: A comparative study of the sieving and elutriation techniques applied to pozzolanic materials

    Directory of Open Access Journals (Sweden)

    Frías, M.

    1990-03-01

    Full Text Available Laser granulometry is a rapid method for the determination of particle size distributions in both dry and wet phases. In the present paper, the laser-beam diffraction technique is applied to the granulometric study of pozzolanic materials in suspension. These granulometric analyses are compared with those obtained with the Alpine pneumatic siever and the Bahco elutriator-centrifuge.

  13. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    Science.gov (United States)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models and provided the necessary data to identify where design changes, process improvements and technology development were needed. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities, which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges, which are prone to debonding at elevated temperatures and limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high speed cameras to measure three-dimensional, real-time displacements and strains, was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, hot-fire test data collection, and post-test analysis and results are presented in this paper.
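    At the heart of digital image correlation is the matching of small pixel subsets between a reference image and a deformed image. The sketch below shows the basic idea for one subset using zero-normalised cross-correlation over an integer-pixel search; the function name and window sizes are illustrative assumptions, and production DIC adds sub-pixel interpolation plus stereo triangulation for the three-dimensional measurements described above.

```python
import numpy as np

def track_subset(ref, cur, y, x, half=15, search=10):
    """Integer-pixel displacement of the (2*half+1)^2 subset centred at (y, x).

    ref, cur : 2-D grayscale arrays; (y, x) must sit at least half+search
    pixels away from every image border.
    """
    tpl = ref[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    tpl = (tpl - tpl.mean()) / tpl.std()
    best_score, best_uv = -np.inf, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            win = cur[y + dv - half:y + dv + half + 1,
                      x + du - half:x + du + half + 1].astype(float)
            win = (win - win.mean()) / win.std()
            score = (tpl * win).mean()      # zero-normalised cross-correlation
            if score > best_score:
                best_score, best_uv = score, (dv, du)
    return best_uv                          # (vertical, horizontal) pixels
```

    Differentiating the resulting displacement field then gives the full-field strains that bonded strain gauges could only sample at single points.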

  14. Applying Geospatial Techniques to Investigate Boundary Layer Land-Atmosphere Interactions Involved in Tornadogenesis

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Knupp, K. R.; Molthan, A.; Coleman, T.

    2017-12-01

    Northern Alabama is among the most tornado-prone regions in the United States. This region has a higher degree of spatial variability in both terrain and land cover than the more frequently studied North American Great Plains region, due to its proximity to the southern Appalachian Mountains and Cumberland Plateau. More research is needed to understand North Alabama's high tornado frequency and how land surface heterogeneity influences tornadogenesis in the boundary layer. Several modeling and simulation studies stretching back to the 1970s have found that variations in the land surface induce tornadic-like flow near the surface, illustrating a need for further investigation. This presentation introduces research investigating the hypothesis that horizontal gradients in land surface roughness, normal to the direction of flow in the boundary layer, induce vertically oriented vorticity at the surface that can potentially aid in tornadogenesis. A novel approach was implemented to test this hypothesis using a GIS-based quadrant pattern analysis method, developed to quantify spatial relationships and patterns between horizontal variations in land surface roughness and locations of tornadogenesis. Land surface roughness was modeled using the Noah land surface model parameterization scheme, which was applied to MODIS 500 m and Landsat 30 m data in order to compare the relationship between tornadogenesis locations and roughness gradients at different spatial scales. The analysis found a statistical relationship, supporting the tested hypothesis, between tornadogenesis locations and areas of higher roughness located normal to the surrounding flow. In this presentation, the innovative use of satellite remote sensing data and GIS technologies to address interactions between the land and atmosphere will be highlighted.

  15. Applying a novel combination of techniques to develop a predictive model for diabetes complications.

    Science.gov (United States)

    Sangi, Mohsen; Win, Khin Than; Shirvani, Farid; Namazi-Rad, Mohammad-Reza; Shukla, Nagesh

    2015-01-01

    Among the many related issues of diabetes management, its complications constitute the main part of the heavy burden of this disease. The aim of this paper is to develop a risk advisor model to predict the chances of diabetes complications according to the changes in risk factors. As the starting point, an inclusive list of (k) diabetes complications and (n) their correlated predisposing factors are derived from the existing endocrinology textbooks. A type of data meta-analysis has been done to extract and combine the numeric value of the relationships between these two. The whole n (risk factors) - k (complications) model was broken down into k different (n-1) relationships and these (n-1) dependencies were broken into n (1-1) models. Applying regression analysis (seven patterns) and artificial neural networks (ANN), we created models to show the (1-1) correspondence between factors and complications. Then all 1-1 models related to an individual complication were integrated using the naïve Bayes theorem. Finally, a Bayesian belief network was developed to show the influence of all risk factors and complications on each other. We assessed the predictive power of the 1-1 models by R2, F-ratio and adjusted R2 equations; sensitivity, specificity and positive predictive value were calculated to evaluate the final model using real patient data. The results suggest that the best fitted regression models outperform the predictive ability of an ANN model, as well as six other regression patterns for all 1-1 models.
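    The integration step, fusing all 1-1 models for one complication via the naive Bayes theorem, has a compact odds-form reading. The sketch below is schematic, written under a conditional-independence assumption; the function and its inputs are illustrative, not the authors' implementation.

```python
import numpy as np

def combine_naive_bayes(p_individual, prior):
    """Fuse outputs of independent 1-1 risk models.

    p_individual : list of P(complication | factor_i) from each 1-1 model
    prior        : baseline P(complication) in the reference population
    Assumes the risk factors are conditionally independent given the outcome.
    """
    prior_odds = prior / (1.0 - prior)
    # each model contributes a likelihood ratio relative to the prior
    lr = np.prod([(p / (1.0 - p)) / prior_odds for p in p_individual])
    posterior_odds = prior_odds * lr
    return posterior_odds / (1.0 + posterior_odds)
```

    For example, combine_naive_bayes([0.30, 0.45], prior=0.20) raises the baseline 20% risk to roughly 58%, showing how two moderately informative factors compound.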

  16. Applying a novel combination of techniques to develop a predictive model for diabetes complications.

    Directory of Open Access Journals (Sweden)

    Mohsen Sangi

    Full Text Available Among the many related issues of diabetes management, its complications constitute the main part of the heavy burden of this disease. The aim of this paper is to develop a risk advisor model to predict the chances of diabetes complications according to the changes in risk factors. As the starting point, an inclusive list of (k) diabetes complications and (n) their correlated predisposing factors are derived from the existing endocrinology textbooks. A type of data meta-analysis has been done to extract and combine the numeric value of the relationships between these two. The whole n (risk factors) - k (complications) model was broken down into k different (n-1) relationships and these (n-1) dependencies were broken into n (1-1) models. Applying regression analysis (seven patterns) and artificial neural networks (ANN), we created models to show the (1-1) correspondence between factors and complications. Then all 1-1 models related to an individual complication were integrated using the naïve Bayes theorem. Finally, a Bayesian belief network was developed to show the influence of all risk factors and complications on each other. We assessed the predictive power of the 1-1 models by R2, F-ratio and adjusted R2 equations; sensitivity, specificity and positive predictive value were calculated to evaluate the final model using real patient data. The results suggest that the best fitted regression models outperform the predictive ability of an ANN model, as well as six other regression patterns for all 1-1 models.

  17. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)

    Abstract. Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.

  18. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take certain requirements into account. The complexity of these ancient materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, given the precious nature of artworks, it is also necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composites or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportions of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr

  19. Single Particle Tracking: Analysis Techniques for Live Cell Nanoscopy

    Science.gov (United States)

    Relich, Peter Kristopher, II

    Single molecule experiments are a set of experiments designed specifically to study the properties of individual molecules. It has only been in the last three decades that single molecule experiments have been applied to the life sciences, where they have been successfully implemented in systems biology for probing the behaviors of sub-cellular mechanisms. The advent and growth of super-resolution techniques in single molecule experiments has made the fundamental behaviors of light and the associated nano-probes a necessary concern amongst life scientists wishing to advance the state of human knowledge in biology. This dissertation disseminates some of the practices learned in experimental live cell microscopy. The topic of single particle tracking is addressed here in a format that is designed for the physicist who embarks upon single molecule studies. Specifically, the focus is on the procedures necessary to develop single particle tracking analysis techniques that can be implemented to answer biological questions. These analysis techniques range from designing and testing a particle tracking algorithm to inferring model parameters once an image has been processed. The intellectual contributions of the author include the techniques in diffusion estimation, localization filtering, and trajectory association for tracking, which will all be discussed in detail in later chapters. The author of this thesis has also contributed to the software development of automated gain calibration, live cell particle simulations, and various single particle tracking packages. Future work includes further evaluation of this laboratory's single particle tracking software, entropy-based approaches towards hypothesis validation, and the uncertainty quantification of gain calibration.
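    Diffusion estimation, one of the techniques named above, is often introduced through the mean squared displacement (MSD) of a trajectory. The sketch below fits the free-diffusion relation MSD(n·dt) = 2·dim·D·n·dt over the first few time lags; it is a minimal illustration under a pure Brownian motion assumption, not the estimator developed in this dissertation.

```python
import numpy as np

def diffusion_coefficient(track, dt, dim=2, max_lag=4):
    """Estimate D from one particle track via the mean squared displacement.

    track : (N, dim) array of positions; dt : frame interval in seconds
    """
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean(np.sum((track[n:] - track[:-n]) ** 2, axis=1))
                    for n in lags])
    t = lags * dt
    slope = np.dot(t, msd) / np.dot(t, t)   # least-squares line through the origin
    return slope / (2 * dim)                # MSD = 2*dim*D*t  =>  D = slope/(2*dim)
```

    Real single particle tracking data additionally carry localization error and motion blur, which bias this naive fit and motivate the more careful estimators discussed in the text.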

  20. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  1. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  2. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods; EDXRF spectrometry is therefore well suited to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods served as a cross-check of the analysis results and as a way to overcome the limitations of the three methods. The results showed that Ca concentrations found in food using EDXRF and AAS were not significantly different (p-value 0.9687), and the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
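    The cross-checking described above comes down to a correlation plus a paired significance test on results for the same samples. A minimal sketch with scipy follows; the concentration values are hypothetical stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

# hypothetical paired Ca results (mg/100 g) for the same food samples
ca_edxrf = np.array([48.2, 120.5, 96.7, 60.1, 210.4])
ca_aas   = np.array([47.9, 122.0, 95.8, 61.0, 208.8])

r, _ = stats.pearsonr(ca_edxrf, ca_aas)    # agreement between the two methods
t, p = stats.ttest_rel(ca_edxrf, ca_aas)   # paired test of the mean difference
print(f"Pearson r = {r:.4f}, paired t-test p = {p:.4f}")
# a large p (> 0.05) is read as no significant difference between methods
```

    The same pattern applies to the K comparison between EDXRF and NAA.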

  3. Machine Learning Techniques Applied to Profile Mobile Banking Users in India

    OpenAIRE

    M. Carr; V. Ravi; G. Sridharan Reddy; D. Veranna

    2013-01-01

    This paper profiles mobile banking users using machine learning techniques, viz. Decision Tree, Logistic Regression, Multilayer Perceptron, and SVM, to test a research model with fourteen independent variables and a dependent variable (adoption). A survey was conducted and the results were analysed using these techniques. Using Decision Trees, the profile of the mobile banking adopter was identified. Comparing different machine learning techniques it was found that Decision Trees out...

  4. Applying data mining techniques to medical time series: an empirical case study in electroencephalography and stabilometry

    Directory of Open Access Journals (Sweden)

    A. Anguera

    2016-01-01

    This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.

  5. Biomechanical study of the funnel technique applied in thoracic pedicle screw placement.

    Science.gov (United States)

    Huang, Yi-Jiang; Peng, Mao-Xiu; He, Shao-Qi; Liu, Liang-Le; Dai, Ming-Hai; Tang, Chenxuan

    2014-09-01

    The Funnel technique is a method used for the insertion of screws into the thoracic pedicle. The aim was to evaluate the biomechanical characteristics of thoracic pedicle screw placement using the Funnel technique, in order to provide a biomechanical basis for the clinical application of this technology. 14 functional spinal units (T6 to T10) were selected from thoracic spine specimens of 14 fresh adult cadavers, and randomly divided into two groups: a Funnel technique group (n = 7) and a Magerl technique group (n = 7). The displacement-stiffness and pull-out strength in various positions were tested and compared. The stiffness of the two fixed groups was significantly higher than that of the intact state (P < 0.05). The mean pull-out strength in the Funnel technique group (789.09 ± 27.33 N) was lower than that in the Magerl technique group (P < 0.05). The Funnel technique, which locates the insertion point from the posterior bone, is a safe and accurate technique for pedicle screw placement. It exhibited no effect on the stiffness of the spinal column, but decreased the pull-out strength of the pedicle screw. Therefore, the Funnel technique in the thoracic spine affords an alternative to standard screw placement.

  6. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  7. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Full Text Available Data Mining is the procedure which includes evaluating and examining large pre-existing databases in order to generate new information which may be essential to the organization. The extraction of new information is predicted using the existing datasets. Many approaches to analysis and prediction have been developed in data mining, but few efforts have been made in the field of criminology, and fewer still have compared the information all these approaches produce. Police stations and other similar criminal justice agencies hold many large databases of information which can be used to analyze or predict criminal movements and involvement in criminal activity in society. Likely offenders can also be identified from the crime data. The main aim of this work is to perform a survey of the supervised learning and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey of crime analysis and crime prediction using several data mining techniques.
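    As a concrete instance of the unsupervised side of this survey, the sketch below groups incident coordinates into candidate hot spots with k-means clustering; the coordinates are randomly generated stand-ins and the cluster count is an arbitrary assumption, so the snippet only illustrates the workflow, not any real crime dataset.

```python
import numpy as np
from sklearn.cluster import KMeans

# hypothetical (latitude, longitude) pairs of reported incidents
rng = np.random.default_rng(0)
incidents = rng.uniform([12.90, 77.50], [13.10, 77.70], (500, 2))

# unsupervised learning: partition incidents into 5 candidate hot spots
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(incidents)
for centre, count in zip(km.cluster_centers_, np.bincount(km.labels_)):
    print(f"hot spot at ({centre[0]:.3f}, {centre[1]:.3f}): {count} incidents")
```

    A supervised counterpart would instead train a classifier on labelled historical records to predict the category or likelihood of future incidents.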

  8. Recent trends in particle size analysis techniques

    Science.gov (United States)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed according to three operating principles, including descriptions of particle size and shape. Significant trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data-processing systems have been widely adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  9. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C [University of Washington, Seattle; Abgrall, N. [Lawrence Berkeley National Laboratory (LBNL); Arnquist, I. J. [Pacific Northwest National Laboratory (PNNL); Avignone, III, F. T. [University of South Carolina/Oak Ridge National Laboratory (ORNL); Baldenegro-Barrera, C. X. [Oak Ridge National Laboratory (ORNL); Barabash, A.S. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Bertrand, F. E. [Oak Ridge National Laboratory (ORNL); Bradley, A. W. [Lawrence Berkeley National Laboratory (LBNL); Brudanin, V. [Joint Institute for Nuclear Research, Dubna, Russia; Busch, M. [Duke University/TUNL; Buuck, M. [University of Washington, Seattle; Byram, D. [University of South Dakota; Caldwell, A. S. [South Dakota School of Mines and Technology; Chan, Y-D [Lawrence Berkeley National Laboratory (LBNL); Christofferson, C. D. [South Dakota School of Mines and Technology; Detwiler, J. A. [University of Washington, Seattle; Efremenko, Yu. [University of Tennessee, Knoxville (UTK); Ejiri, H. [Osaka University, Japan; Elliott, S. R. [Los Alamos National Laboratory (LANL); Galindo-Uribarri, A. [Oak Ridge National Laboratory (ORNL); Gilliss, T. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Giovanetti, G. K. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Goett, J [Los Alamos National Laboratory (LANL); Green, M. P. [Oak Ridge National Laboratory (ORNL); Gruszko, J [University of Washington, Seattle; Guinn, I S [University of Washington, Seattle; Guiseppe, V E [University of South Carolina, Columbia; Henning, R. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Hoppe, E.W. [Pacific Northwest National Laboratory (PNNL); Howard, S. [South Dakota School of Mines and Technology; Howe, M. A. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Jasinski, B R [University of South Dakota; Keeter, K.J. [Black Hills State University, Spearfish, South Dakota; Kidd, M. F. [Tennessee Technological University (TTU); Konovalov, S.I. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Kouzes, R. T. [Pacific Northwest National Laboratory (PNNL); LaFerriere, B. D. [Pacific Northwest National Laboratory (PNNL); Leon, J. [University of Washington, Seattle; MacMullin, J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Martin, R. D. [University of South Dakota; Meijer, S. J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Mertens, S. [Lawrence Berkeley National Laboratory (LBNL); Orrell, J. L. [Pacific Northwest National Laboratory (PNNL); O' Shaughnessy, C. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Poon, A.W.P. [Lawrence Berkeley National Laboratory (LBNL); Radford, D. C. [Oak Ridge National Laboratory (ORNL); Rager, J. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Rielage, K. [Los Alamos National Laboratory (LANL); Robertson, R.G.H. [University of Washington, Seattle; Romero-Romero, E. [University of Tennessee, Knoxville, (UTK)/Oak Ridge National Lab (ORNL); Shanks, B. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Shirchenko, M. [Joint Institute for Nuclear Research, Dubna, Russia; Snyder, N [University of South Dakota; Suriano, A. M. [South Dakota School of Mines and Technology; Tedeschi, D [University of South Carolina, Columbia; Trimble, J. E. [Univ. North Carolina-Chapel Hill/Triangle Univ. 
Nucl. Lab., Durham, NC; Varner, R. L. [Oak Ridge National Laboratory (ORNL); Vasilyev, S. [Joint Institute for Nuclear Research, Dubna, Russia; Vetter, K. [University of California/Lawrence Berkeley National Laboratory (LBNL); et al.

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in Ge-76. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  10. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to theoretical aspects while the others present the practical aspects and the...

  11. Applying Authentic Data Analysis in Learning Earth Atmosphere

    Science.gov (United States)

    Johan, H.; Suhandi, A.; Samsudin, A.; Wulan, A. R.

    2017-09-01

    The aim of this research was to develop earth science learning material, especially on the earth's atmosphere, supported by science research with authentic data analysis to enhance students' reasoning. Various earth and space science phenomena require reasoning. This research used an experimental design with one group and a pre-test/post-test. 23 pre-service physics teachers participated in this research. An essay test was conducted to obtain data about reasoning ability and was analyzed quantitatively. An observation sheet was used to capture phenomena during the learning process. The results showed that students' reasoning ability improved from unidentified and no reasoning to evidence-based reasoning and inductive/deductive rule-based reasoning. Authentic data were examined using the Grid Analysis Display System (GrADS). Visualization from GrADS facilitated students to correlate the concepts and bring the real conditions of nature into classroom activity. It also helped students reason about phenomena related to earth and space science concepts. It can be concluded that applying authentic data analysis in the learning process can help enhance students' reasoning. This study is expected to help lecturers bring the results of geoscience research into the learning process and facilitate students' understanding of concepts.

  12. Neutron activation analysis applied to nutritional and foodstuff studies

    Energy Technology Data Exchange (ETDEWEB)

    Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de, E-mail: vmaihara@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Avegliano, Roseane P., E-mail: pagliaro@usp.b [Universidade de Sao Paulo (USP), SP (Brazil). Coordenadoria de Assistencia Social. Div. de Alimentacao

    2009-07-01

    Neutron Activation Analysis, NAA, has been successfully used on a regular basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages. These include high accuracy, small sample quantities and no chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition which have been carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP will be presented: a Brazilian total diet study: nutritional element dietary intakes of the Sao Paulo state population; a study of trace elements in maternal milk; and the determination of essential trace elements in some edible mushrooms. (author)

  13. Neutron activation analysis applied to nutritional and foodstuff studies

    International Nuclear Information System (INIS)

    Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de; Avegliano, Roseane P.

    2009-01-01

    Neutron Activation Analysis, NAA, has been successfully used on a regular basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages. These include high accuracy, small sample quantities and no chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition which have been carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP will be presented: a Brazilian total diet study: nutritional element dietary intakes of the Sao Paulo state population; a study of trace elements in maternal milk; and the determination of essential trace elements in some edible mushrooms. (author)

  14. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    Science.gov (United States)

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  15. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others]

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  16. Photothermal techniques applied to the study of thermal properties in biodegradable films

    Science.gov (United States)

    San Martín-Martínez, E.; Aguilar-Méndez, M. A.; Cruz-Orea, A.; García-Quiroz, A.

    2008-01-01

    The objective of the present work was to determine the thermal diffusivity and effusivity of biodegradable films by using photothermal techniques. The thermal diffusivity was studied by using the open photoacoustic cell technique. On the other hand, the thermal effusivity was obtained by the photopyroelectric technique in a front detection configuration. The films were elaborated from mixtures of low density polyethylene (LDPE) and corn starch. The results showed that at high moisture values the thermal diffusivity increased as the starch concentration in the film became higher; this trend was not observed at low extrusion moisture conditions (6.55%). As the moisture and starch concentration in the films were increased, the thermal effusivity diminished.
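    Both photothermal measurements hinge on the thermal diffusion length of the modulated heat wave, a standard relation quoted here for orientation rather than derived in this record:

$$\mu = \sqrt{\frac{\alpha}{\pi f}}$$

    A film of thickness L therefore crosses from thermally thin to thermally thick at the characteristic frequency f_c = α/(πL²), and fitting the open photoacoustic cell signal around this crossover is one common route to the diffusivity α.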

  17. Quantitative thoracic CT techniques in adults: can they be applied in the pediatric population?

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Soon Ho [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Goo, Jin Mo [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University College of Medicine, Cancer Research Institute, Jongno-gu, Seoul (Korea, Republic of); Goo, Hyun Woo [University of Ulsan College of Medicine, Department of Radiology and Research Institute of Radiology, Asan Medical Center, Seoul (Korea, Republic of)

    2013-03-15

    With the rapid evolution of the multidetector row CT technique, quantitative CT has started to be used in clinical studies for revealing a heterogeneous entity of airflow limitation in chronic obstructive pulmonary disease that is caused by a combination of lung parenchymal destruction and remodeling of the small airways in adults. There is growing evidence of a good correlation between quantitative CT findings and pathological findings, pulmonary function test results and other clinical parameters. This article provides an overview of current quantitative thoracic CT techniques used in adults, and how to translate these CT techniques to the pediatric population. (orig.)

  18. Neutron Filter Technique and its use for Fundamental and applied Investigations

    International Nuclear Information System (INIS)

    Gritzay, V.; Kolotyi, V.

    2008-01-01

    At the Kyiv Research Reactor (KRR) the filtered neutron beam technique has been used for more than 30 years and its development continues; the new and updated facilities for neutron cross-section measurements provide neutron cross sections with rather high accuracy: total neutron cross sections with an accuracy of 1% or better, and neutron scattering cross sections with 3-6% accuracy. The main purpose of this paper is to present the neutron measurement techniques developed at KRR and to demonstrate some experimental results obtained using these techniques.

  19. Calorimetric techniques applied to the thermodynamic study of interactions between proteins and polysaccharides

    Directory of Open Access Journals (Sweden)

    Monique Barreto Santos

    2016-08-01

    Full Text Available ABSTRACT: The interactions between biological macromolecules have been important for biotechnology, but further understanding is needed to maximize the utility of these interactions. Calorimetric techniques provide information regarding these interactions through the thermal energy that is produced or consumed during the interaction. Notable techniques include differential scanning calorimetry, which generates a thermodynamic profile from temperature scanning, and isothermal titration calorimetry, which provides the thermodynamic parameters directly related to the interaction. This review describes how calorimetric techniques can be used to study interactions between proteins and polysaccharides, and provides valuable insight into the thermodynamics of their interaction.
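    A single isothermal titration yields the association constant K and the enthalpy change ΔH directly, and the remaining parameters of the thermodynamic profile follow from standard relations (a textbook result, stated here for orientation rather than taken from this review):

$$\Delta G = -RT\ln K, \qquad \Delta S = \frac{\Delta H - \Delta G}{T}$$

    A negative ΔH together with a positive TΔS, for instance, is commonly read as an interaction driven by both electrostatics/hydrogen bonding and hydrophobic effects, which is how such protein-polysaccharide profiles are usually interpreted.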

  20. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
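    For clusters of a few thousand atoms, the total scattering named above can be simulated directly from an atomistic model with the Debye scattering equation, which needs only the pairwise interatomic distances. The sketch below is a minimal monatomic implementation with a constant form factor, an assumption made for brevity; Fourier transforming I(Q) then yields the PDF that is compared with experiment.

```python
import numpy as np
from scipy.spatial.distance import pdist

def debye_intensity(positions, q, f=1.0):
    """Debye scattering equation for a monatomic cluster such as a gold
    nanoparticle: I(Q) = f^2 * [N + 2 * sum_{i<j} sin(Q r_ij) / (Q r_ij)].

    positions : (N, 3) array of atomic coordinates (angstrom)
    q         : 1-D array of momentum-transfer values (1/angstrom)
    f         : atomic form factor, held constant here for simplicity
    """
    r = pdist(positions)              # all pairwise interatomic distances
    n = len(positions)
    qr = np.outer(q, r)
    # np.sinc(x) = sin(pi*x)/(pi*x), so sinc(qr/pi) = sin(qr)/qr
    i_q = n + 2.0 * np.sinc(qr / np.pi).sum(axis=1)
    return f ** 2 * i_q
```

    Comparing such simulated patterns for the candidate decahedron variants against the measured total scattering is exactly the model-discrimination exercise described above.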

  1. Pair distribution function analysis applied to decahedral gold nanoparticles

    Science.gov (United States)

    Nakotte, H.; Silkwood, C.; Page, K.; Wang, H.-W.; Olds, D.; Kiefer, B.; Manna, S.; Karpov, D.; Fohtung, E.; Fullerton, E. E.

    2017-11-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally

  2. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  3. Nuclear analytical techniques applied to the research on biokinetics of incorporated radionuclides for internal dosimetry

    International Nuclear Information System (INIS)

    Cantone, M.C.

    2005-01-01

    Full text: The presentation discusses the contribution that techniques of analysis based on activation analysis or mass spectrometry can give to a very specific aspect of protection against ionizing radiation: the biokinetics of relevant elements. The assessment of the radiation dose to body tissues, following intakes of radionuclides in occupational and accidental exposures and environmental exposure in case of dispersion of radio contaminants of potential concern into the environment, is essential to evaluate and manage the related radiological risk, including the decisions and actions to be undertaken. Internal dose is not directly measurable, and the International Commission on Radiological Protection, ICRP, has developed models which describe the behavior of substances in the human body following their entry by inhalation or ingestion. Generally, all the available sources of information contribute to the modeling process, including studies on animals, use of chemical analogues and, obviously, direct information on humans, which is definitely the preferred source on which a biokinetic model can be based. Biokinetic data on humans are available for most of the biologically essential elements (Fe, Zn, Cu, Se), and for some elements the metabolic behavior is well known due to their use in clinical applications (I, Sr, Tc); moreover, research is in progress for non-essential alpha emitters. However, for a number of elements, including elements with radionuclides of radiological significance in case of environmental contamination (Ru, Zr, Ce, Te and Mo), human data are poor or missing and biokinetic parameters are essentially extrapolated from data on animals. The use of stable isotopes is a publicly acceptable and ethically justifiable method, compared to the use of radioisotopes, when volunteer subjects are considered in the investigations. The design of the investigation is based on the double tracer approach: one isotope is given orally and a second

  4. A strategy to apply quantitative epistasis analysis on developmental traits.

    Science.gov (United States)

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are keys to understand complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from phenotypic measurement. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those methods in single cell growth studies. Comparing with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
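    A common quantitative definition of an epistasis score, given here as a schematic stand-in for the statistical methods the paper develops, compares the double-perturbation phenotype against a multiplicative null model:

```python
def epistasis_score(w_xy, w_x, w_y, w_wt=1.0):
    """Multiplicative-model epistasis for quantitative phenotypes.

    w_x, w_y : phenotype (e.g. body length) of each single perturbation
    w_xy     : phenotype of the double perturbation
    w_wt     : wild-type reference used for normalisation
    Returns eps = W_xy - W_x * W_y; eps near 0 suggests independence,
    while a clear deviation flags a genetic interaction.
    """
    w_x, w_y, w_xy = w_x / w_wt, w_y / w_wt, w_xy / w_wt
    return w_xy - w_x * w_y
```

    In practice the score is computed over many replicate measurements, so that a confidence interval rather than a point value decides whether eps differs from zero.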

  5. An analysis of induction motor testing techniques

    International Nuclear Information System (INIS)

    Soergel, S.

    1996-01-01

    There are two main failure mechanisms in induction motors: bearing related and stator related. The Electric Power Research Institute (EPRI) conducted a study, completed in 1985, which found that nearly 37% of all failures were attributed to stator problems. Another data source for motor failures is the Nuclear Plant Reliability Data System (NPRDS). This database reveals that approximately 55% of all motors were identified as being degraded before failure occurred. Of these, approximately 35% were due to electrical faults. These are the faults which this paper will attempt to identify through testing techniques. This paper is a discussion of the current techniques used to predict incipient failure of induction motors. In the past, the main tests were those to assess the integrity of the ground insulation. However, most insulation failures are believed to involve turn or strand insulation, which makes traditional tests alone inadequate for condition assessment. Furthermore, these tests have several limitations which need consideration when interpreting the results. This paper will concentrate on predictive maintenance techniques which detect electrical problems. It will present appropriate methods and tests, and discuss the strengths and weaknesses of each

  6. Applying of Reliability Techniques and Expert Systems in Management of Radioactive Accidents

    International Nuclear Information System (INIS)

    Aldaihan, S.; Alhbaib, A.; Alrushudi, S.; Karazaitri, C.

    1998-01-01

    Accidents involving radioactive exposure vary in nature and size, which makes them complex situations to be handled by radiation protection agencies or any responsible authority. The situation becomes worse with the introduction of advanced technology of high complexity that provides the operator with huge amounts of information about the system being operated. This paper discusses the application of reliability techniques in radioactive risk management. The event tree technique from the nuclear field is described, as well as two other techniques from non-nuclear fields: Hazard and Operability analysis and Quality Function Deployment. The objective is to show the importance and applicability of these techniques in radiation risk management. Finally, Expert Systems in the field of accident management are explored and classified according to their applications

  7. Applying lean techniques in the delivery of transportation infrastructure construction projects.

    Science.gov (United States)

    2011-07-01

    It is well documented that construction productivity has been declining since the 1960s. Additionally, studies have shown that only 40% of construction workers' time is considered to be value-added work. Interest in the use of Lean techniques ...

  8. System reliability analysis techniques and their relationship to mechanical/structural reliability problems

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1980-01-01

    The paper gives a brief review of the techniques used in the reliability analysis of functional systems. It also considers some of the aspects arising in applying similar techniques to the reliability analysis of mechanical and structural components. The paper concludes that further data acquisition is of prime importance. Additionally, it is suggested that it may be worthwhile to pay increased attention to the on-line monitoring of deterioration in mechanical and structural elements. (orig.)
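    The building blocks of such functional-system analyses are the series and parallel (redundant) combination rules for component reliabilities, a standard result included here for orientation:

$$R_{\text{series}} = \prod_{i=1}^{n} R_i, \qquad R_{\text{parallel}} = 1 - \prod_{i=1}^{n} \left(1 - R_i\right)$$

    Much of the difficulty the paper points to for mechanical and structural elements lies not in these formulas but in obtaining trustworthy values of the individual R_i, hence the emphasis on further data acquisition.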

  9. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...
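    For the transient category, the optimal filtering mentioned above reduces to correlating the data against a known template, weighted by the inverse noise spectrum. The following is a minimal frequency-domain sketch under the usual stationary Gaussian noise assumption; the function name and conventions are illustrative, not taken from any particular search pipeline.

```python
import numpy as np

def matched_filter_snr(data, template, noise_psd, dt):
    """Signal-to-noise ratio of `template` in `data`.

    data, template : real time series of equal length N
    noise_psd      : one-sided noise PSD sampled at the N//2+1 rfft frequencies
    dt             : sample spacing in seconds
    """
    n = len(data)
    df = 1.0 / (n * dt)
    d_f = np.fft.rfft(data) * dt       # approximate continuous Fourier transform
    h_f = np.fft.rfft(template) * dt
    # noise-weighted inner products <d|h> and <h|h>
    d_dot_h = 4.0 * np.real(np.sum(d_f * np.conj(h_f) / noise_psd)) * df
    h_norm = np.sqrt(4.0 * np.real(np.sum(np.abs(h_f) ** 2 / noise_psd)) * df)
    return d_dot_h / h_norm
```

    Maximising this statistic over arrival time and template parameters is what a burst or inspiral search does in practice.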

  10. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    Directory of Open Access Journals (Sweden)

    Árpád Gyéresi

    2013-02-01

    Full Text Available Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; the micelles that consequently form undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.
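    The partitioning picture above leads to the classic retention model for a neutral analyte in MEKC, quoted here as the standard textbook relation rather than from this paper. With t_0 the migration time of the electroosmotic flow marker, t_mc that of the micelle marker, and k the retention factor,

$$t_R = \frac{(1+k)\,t_0}{1 + (t_0/t_{mc})\,k}$$

    so every neutral solute elutes inside the window t_0 ≤ t_R ≤ t_mc, and resolution is tuned through the surfactant concentration via k.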

  11. Towards Corrected and Completed Atomic Site Occupancy Analysis of Superalloys Using Atom Probe Tomography Techniques

    Science.gov (United States)

    2012-08-17

    Advanced Atom Probe Tomography (APT) techniques have been developed and applied to the atomic-scale characterization of multi-component... analysis approaches for solute distribution/segregation analysis, atom probe crystallography, and lattice rectification, and has demonstrated potential... materials design, where Integrated Computational Materials Engineering (ICME) can be enabled by real-world 3D atomic-resolution data via atom probe microscopy.

  12. Applying machine learning and image feature extraction techniques to the problem of cerebral aneurysm rupture

    Directory of Open Access Journals (Sweden)

    Steren Chabert

    2017-01-01

    Full Text Available Cerebral aneurysm is a cerebrovascular disorder characterized by a bulging in a weak area of the wall of an artery that supplies blood to the brain. It is important to understand the mechanisms leading to the formation of aneurysms, their growth and, more importantly, their rupture. The purpose of this study is to examine the impact on aneurysm rupture of the combination of different parameters, instead of focusing on only one factor at a time as is frequently found in the literature, using machine learning and feature extraction techniques. This discussion is relevant in the context of the complex decision that physicians have to take when deciding which therapy to apply, as each intervention carries its own risks and implies the use of a complex ensemble of resources (human resources, operating rooms, etc.) in hospitals that are always under a very high workload. This project was raised within our current working team, composed of an interventional neuroradiologist, radiologic technologists, informatics engineers and biomedical engineers, from the Valparaiso public hospital, Hospital Carlos van Buren, and from Universidad de Valparaíso – Facultad de Ingeniería and Facultad de Medicina. This team has been working together over the last few years and is now participating in the implementation of an "interdisciplinary platform for innovation in health", as part of a bigger project led by Universidad de Valparaiso (PMI UVA1402). It is worth emphasizing that this project is made feasible by the existence of this network between physicians and engineers, and by the existence of data already registered in an orderly manner, structured and recorded in digital format. The present proposal arises from descriptions in the current literature showing that the existing indicators, whether based on morphological descriptions of the aneurysm, on the characterization of biomechanical factors, or on others, do not provide sufficient information in order

  13. Improving building energy modelling by applying advanced 3D surveying techniques on agri-food facilities

    Directory of Open Access Journals (Sweden)

    Francesco Barreca

    2017-09-01

    Full Text Available The food industry is the production sector with the highest energy consumption. In Europe, the energy used to produce food accounts for 26% of total energy consumption. Over 28% is used in industrial processes. Recently, European food companies have increased their efforts to make their production processes more sustainable, partly by giving preference to the use of renewable energy sources. In Italy, total energy consumption in the agriculture and food sectors decreased between 2013 and 2014, passing from 16.79 to 13.3 Mtep. Since energy consumption in the food industry is nearly twice that of agriculture (8.57 and 4.73 Mtep, respectively), it is very important to improve energy efficiency and to use green technologies in all phases of food processing and conservation. In Italy, a recent law (Legislative Decree 102, 04/07/2014) has made energy-use diagnosis compulsory for all industrial concerns, particularly those showing high consumption levels. In food industry buildings, energy is mainly used for indoor microclimate control, which is needed to ensure workers' wellbeing and the most favourable conditions for food processing and conservation. To this end, it is important to have tools and methods allowing easy, rapid and precise energy performance assessment of agri-food buildings. The accuracy of the results obtainable from the currently available computational models depends on the degree of detail and information used in constructional and geometric modelling; moreover, this phase is probably the most critical and time-consuming part of the energy diagnosis. In this context, fine surveying and advanced 3D geometric modelling procedures can facilitate building modelling and allow technicians and professionals in the agri-food sector to use highly efficient and accurate energy analysis and evaluation models. This paper proposes a dedicated model for energy performance assessment in agri-food buildings. It also shows that by using

  14. Continuous Wavelet and Hilbert-Huang Transforms Applied for Analysis of Active and Reactive Power Consumption

    Directory of Open Access Journals (Sweden)

    Avdakovic Samir

    2014-08-01

    Full Text Available The analysis of power consumption is a very important issue for power distribution system operators. Power system processes such as planning, demand forecasting and development require a complete understanding of the behaviour of power consumption in the observed area, which in turn requires appropriate techniques for the analysis of the available data. In this paper, two different time-frequency techniques are applied to the analysis of hourly values of active and reactive power consumption from a real power distribution transformer substation in an urban part of the city of Sarajevo. Using the continuous wavelet transform (CWT) with the wavelet power spectrum and the global wavelet spectrum, some properties of the analysed time series are determined. Then, empirical mode decomposition (EMD) and the Hilbert-Huang Transform (HHT) are applied to the analysis of the same time series. The results show that both approaches can provide very useful information about the behaviour of power consumption over the observed time interval and in different period (frequency) bands. It can also be noticed that the results obtained by the global wavelet spectrum and the marginal Hilbert spectrum are very similar, confirming that both approaches could be used to identify the main properties of active and reactive power consumption time series.
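
    As a hedged illustration of the wavelet side of such an analysis (the series and scale choices below are synthetic, not the substation data), the sketch computes a CWT with PyWavelets and derives a global wavelet spectrum by time-averaging the squared coefficients:

```python
# Sketch: CWT and global wavelet spectrum of a synthetic hourly load series.
# Requires: numpy, PyWavelets (pip install PyWavelets).
import numpy as np
import pywt

hours = np.arange(24 * 60)                                 # 60 days of hourly samples
daily = np.sin(2 * np.pi * hours / 24)                     # daily cycle
weekly = 0.5 * np.sin(2 * np.pi * hours / (24 * 7))        # weekly cycle
noise = 0.2 * np.random.default_rng(0).standard_normal(hours.size)
load = 10 + daily + weekly + noise

scales = np.arange(1, 256)
coefs, freqs = pywt.cwt(load, scales, "morl", sampling_period=1.0)  # freqs: cycles/hour

# Global wavelet spectrum: average wavelet power over time at each scale.
global_spectrum = (np.abs(coefs) ** 2).mean(axis=1)
periods = 1.0 / freqs                                      # hours
print("dominant period: %.1f hours" % periods[np.argmax(global_spectrum)])
```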

  15. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587

  16. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis.

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-03-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches.
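
    The core idea, cutting each series into fixed-length segments and replacing each segment with the index of its nearest codeword from a learned codebook, can be sketched in a few lines. The implementation below is an illustrative reconstruction, not the authors' code: it learns the codebook with k-means, which is one standard way to build a vector-quantization codebook.

```python
# Sketch: Piecewise Vector Quantized Approximation (PVQA)-style encoding.
# Illustrative reconstruction using k-means as the codebook learner.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal((100, 128)), axis=1)  # 100 toy time series

seg_len, n_codewords = 16, 8
segments = series.reshape(-1, seg_len)           # every series cut into segments

codebook = KMeans(n_clusters=n_codewords, n_init=10, random_state=0).fit(segments)

def encode(ts: np.ndarray) -> np.ndarray:
    """Symbolic representation: one codeword index per segment."""
    return codebook.predict(ts.reshape(-1, seg_len))

symbols = encode(series[0])
print("symbolic form of series 0:", symbols)     # here: an array of 8 codeword ids
```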

  17. Water spray cooling technique applied on a photovoltaic panel: The performance response

    International Nuclear Information System (INIS)

    Nižetić, S.; Čoko, D.; Yadav, A.; Grubišić-Čabo, F.

    2016-01-01

    Highlights: • An experimental study was conducted on a monocrystalline photovoltaic panel (PV). • A water spray cooling technique was implemented to determine PV panel response. • The experimental results showed a favorable cooling effect on the panel performance. • A feasibility aspect of the water spray cooling technique was also proven. - Abstract: This paper presents an alternative cooling technique for photovoltaic (PV) panels, in which a water spray is applied over the panel surfaces. The technique is alternative in the sense that both sides of the PV panel were cooled simultaneously, in order to investigate the total effect of water spray cooling on PV panel performance at peak solar irradiation levels. A specific experimental setup was elaborated in detail, and the developed cooling system for the PV panel was tested in a geographical location with a typical Mediterranean climate. The experimental results show that it is possible to achieve a maximal total increase of 16.3% (effective 7.7%) in electric power output and a total increase of 14.1% (effective 5.9%) in PV panel electrical efficiency by using the proposed cooling technique at peak solar irradiation. Furthermore, it was possible to decrease the panel temperature from an average of 54 °C (non-cooled PV panel) to 24 °C in the case of simultaneous front- and backside PV panel cooling. Economic feasibility was also determined for the proposed water spray cooling technique, whose additional advantage is the self-cleaning effect on the PV panel's surface, which acts as a booster to the average delivered electricity.

  18. Trend Filtering Techniques for Time Series Analysis

    OpenAIRE

    López Arias, Daniel

    2016-01-01

    Time series can be found almost everywhere in our lives, and being able to analyse them is therefore an important task. Most of the time series we can think of are quite noisy, and this is one of the main obstacles to extracting information from them. In this work we use trend filtering techniques to try to remove this noise from a series and to understand the underlying trend of the series, which gives us information about the behaviour of the series aside from the particular...
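
    One classical trend filter in this family is the Hodrick-Prescott filter, which estimates a smooth trend x from a noisy series y by minimizing ||y - x||^2 + lambda * ||D2 x||^2, with D2 the second-difference operator. The sketch below is a minimal generic illustration (not code from this work), solved as a sparse linear system:

```python
# Sketch: Hodrick-Prescott trend filtering via a sparse linear solve.
# Solves (I + lam * D2' D2) x = y, with D2 the second-difference operator.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def hp_trend(y: np.ndarray, lam: float = 1600.0) -> np.ndarray:
    n = y.size
    d2 = sp.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))  # second differences
    a = (sp.identity(n) + lam * (d2.T @ d2)).tocsc()
    return spsolve(a, y)

rng = np.random.default_rng(1)
t = np.linspace(0, 4 * np.pi, 400)
noisy = np.sin(t) + 0.4 * rng.standard_normal(t.size)
trend = hp_trend(noisy, lam=1e4)
print("residual std: %.3f" % np.std(noisy - trend))
```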

  19. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  20. Dynamic speckle analysis using multivariate techniques

    International Nuclear Information System (INIS)

    López-Alonso, José M; Alda, Javier; Rabal, Héctor; Grumel, Eduardo; Trivi, Marcelo

    2015-01-01

    In this work we use principal component analysis to characterize dynamic speckle patterns. This analysis quantitatively identifies distinct dynamics that can be associated with physical phenomena occurring in the sample, and it quantifies the contribution explained by each principal component, or by a group of them. The method is demonstrated on the paint drying process over a hidden topography. It can be used for fast screening and identification of different dynamics in biological or industrial samples by means of dynamic speckle interferometry. (paper)
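
    As a hedged sketch of this kind of analysis (not the authors' implementation), one can stack the speckle frames as rows of a matrix, run PCA across time, and inspect the variance explained by each component to separate distinct dynamics:

```python
# Sketch: principal component analysis of a dynamic speckle image stack.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
T, H, W = 200, 32, 32                       # synthetic stack: 200 frames of 32x32
slow = np.outer(np.linspace(0, 1, T), rng.standard_normal(H * W))  # slow drift
fast = rng.standard_normal((T, H * W))                             # fast "boiling"
frames = slow + 0.3 * fast

pca = PCA(n_components=10)
scores = pca.fit_transform(frames)          # temporal activity of each component
print("variance explained:", np.round(pca.explained_variance_ratio_, 3))

# The leading component's spatial map (one value per pixel):
map0 = pca.components_[0].reshape(H, W)
```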

  1. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

    Full Text Available Background: When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results: The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions: The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.

  2. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression.

    Science.gov (United States)

    Heine, John J; Land, Walker H; Egan, Kathleen M

    2011-01-27

    When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
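
    For readers unfamiliar with the epidemiologic interpretation at stake, the sketch below shows, on a small simulated case-control set, how odds ratios fall out of a fitted logistic regression (OR = exp(coefficient)). The kernel/perceptron hybrid itself is not reproduced here:

```python
# Sketch: odds ratios from logistic regression on a simulated case-control set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
exposure = rng.binomial(1, 0.3, n)              # binary risk factor
age = rng.normal(0, 1, n)                       # standardized covariate
logit = -1.0 + 0.8 * exposure + 0.3 * age       # true log-odds model
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # case (1) / control (0)

X = np.column_stack([exposure, age])
lr = LogisticRegression().fit(X, y)
odds_ratios = np.exp(lr.coef_[0])               # OR per unit change of each covariate
print("OR(exposure) ~ %.2f (true %.2f)" % (odds_ratios[0], np.exp(0.8)))
```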

  3. Synchrotron and Simulations Techniques Applied to Problems in Materials Science: Catalysts and Azul Maya Pigments

    International Nuclear Information System (INIS)

    Chianelli, R.

    2005-01-01

    Development of synchrotron techniques for the determination of the structure of disordered, amorphous and surface materials has exploded over the past twenty years, due to the increasing availability of high flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists, through interaction with the effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory (SSRL). In this article we review the application of multiple synchrotron characterization techniques to two classes of materials defined as "surface compounds." One class comprises materials like MoS2-xCx, widely used petroleum catalysts that improve the environmental properties of transportation fuels; these compounds may be viewed as "sulfide supported carbides" in their catalytically active states. The second class is the "Maya Blue" pigments, which are based on technology created by the ancient Maya. These compounds are organic/inorganic "surface complexes" consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques as described in this report

  4. Synchroton and Simulations Techniques Applied to Problems in Materials Science: Catalysts and Azul Maya Pigments

    Energy Technology Data Exchange (ETDEWEB)

    Chianelli, R.

    2005-01-12

    Development of synchrotron techniques for the determination of the structure of disordered, amorphous and surface materials has exploded over the past twenty years, due to the increasing availability of high flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists, through interaction with the effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory (SSRL). In this article we review the application of multiple synchrotron characterization techniques to two classes of materials defined as ''surface compounds.'' One class comprises materials like MoS{sub 2-x}C{sub x}, widely used petroleum catalysts that improve the environmental properties of transportation fuels; these compounds may be viewed as ''sulfide supported carbides'' in their catalytically active states. The second class is the ''Maya Blue'' pigments, which are based on technology created by the ancient Maya. These compounds are organic/inorganic ''surface complexes'' consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques as described in this report.

  5. Feasibility to apply the steam assisted gravity drainage (SAGD) technique in the country's heavy crude-oil fields

    International Nuclear Information System (INIS)

    Rodriguez, Edwin; Orjuela, Jaime

    2004-01-01

    The steam assisted gravity drainage (SAGD) process is one of the most efficient and profitable technologies for the production of heavy crude oils and oil sands. It involves drilling a pair of parallel horizontal wells, separated by a vertical distance and located near the base of the oil field. The upper well is used to continuously inject steam into the zone of interest, while the lower well collects all resulting fluids (oil, condensate and formation water) and takes them to the surface (Butler, 1994). This technology has been successfully implemented in countries such as Canada, Venezuela and the United States, reaching recovery factors in excess of 50%. This article provides an overview of the technique's operating mechanism and the most relevant characteristics of the process, as well as the categories this technology is divided into, including their advantages and limitations. Furthermore, the article sets out the minimal oil-field conditions under which the SAGD process is efficient; integrated with a series of mathematical models, these conditions allow forecasts of production, thermal efficiency (ODR) and recoverable oil, provided it is technically feasible to apply the technique to a given oil field. The information and concepts compiled during this research prompted the development of software, which may be used as an information, analysis and interpretation tool to predict and quantify this technology's performance. Based on this work, preliminary studies were started for the country's heavy crude-oil fields, identifying those that provide the minimum conditions for the successful development of a pilot project

  6. The Evidence-Based Practice of Applied Behavior Analysis.

    Science.gov (United States)

    Slocum, Timothy A; Detrich, Ronnie; Wilczynski, Susan M; Spencer, Trina D; Lewis, Teri; Wolfe, Katie

    2014-05-01

    Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7-33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith's arguments and extend the discussion of the relevant issues. Although we support many of Smith's (The Behavior Analyst, 36, 7-33, 2013) points, we contend that Smith's definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.

  7. Applied techniques for high bandwidth data transfers across wide area networks

    International Nuclear Information System (INIS)

    Lee, Jason; Gunter, Dan; Tierney, Brian; Allcock, Bill; Bester, Joe; Bresnahan, John; Tuecke, Steve

    2001-01-01

    Large distributed systems such as Computational/Data Grids require large amounts of data to be co-located with the computing facilities for processing. Ensuring that the data is there in time for the computation in today's Internet is a massive problem. From our work developing a scalable distributed network cache, we have gained experience with techniques necessary to achieve high data throughput over high bandwidth Wide Area Networks (WAN). In this paper, we discuss several hardware and software design techniques and issues, and then describe their application to an implementation of an enhanced FTP protocol called GridFTP. We also describe results from two applications using these techniques, which were obtained at the Supercomputing 2000 conference
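
    One recurring ingredient of such tuning, treated here as a generic illustration since the abstract does not spell out specifics, is sizing TCP buffers to the bandwidth-delay product (BDP) so that a single stream can keep the pipe full, and falling back to parallel streams when operating-system buffer limits are lower:

```python
# Sketch: bandwidth-delay product arithmetic for WAN transfer tuning.
# The link and RTT values are illustrative, not from the paper.
import math

link_gbps = 1.0       # path bandwidth, gigabits per second
rtt_ms = 80.0         # round-trip time, milliseconds

# Bytes that must be "in flight" to keep the pipe full with one stream.
bdp_bytes = (link_gbps * 1e9 / 8) * (rtt_ms / 1e3)
print(f"BDP = {bdp_bytes / 2**20:.1f} MiB")

# If the OS caps socket buffers below the BDP, parallel streams close the gap.
max_sock_buf = 4 * 2**20          # e.g. a 4 MiB per-socket buffer cap
streams = math.ceil(bdp_bytes / max_sock_buf)
print(f"suggested parallel streams: {streams}")
```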

  8. A Survey on Data Mining Techniques Applied to Electricity-Related Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Francisco Martínez-Álvarez

    2015-11-01

    Full Text Available Data mining has become an essential tool during the last decade for analyzing large sets of data. The variety of techniques it includes and the successful results obtained in many application fields make this family of approaches powerful and widely used. In particular, this work explores the application of these techniques to time series forecasting. Although classical statistics-based methods provide reasonably good results, the application of data mining outperforms them. Hence, this work faces two main challenges: (i) to provide a compact mathematical formulation of the most used techniques; (ii) to review the latest works on time series forecasting and, as a case study, those related to electricity price and demand markets.
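
    As a generic illustration of the data-mining style of forecasting surveyed here (not a method taken from the survey itself), the sketch below turns an hourly demand series into lagged feature vectors and fits an off-the-shelf regressor to predict the next value:

```python
# Sketch: one-step-ahead electricity-demand forecasting with lag features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
t = np.arange(24 * 90)                                   # 90 days, hourly
demand = 100 + 20 * np.sin(2 * np.pi * t / 24) + 5 * rng.standard_normal(t.size)

n_lags = 48                                              # two days of history
X = np.lib.stride_tricks.sliding_window_view(demand[:-1], n_lags)
y = demand[n_lags:]                                      # value right after each window

split = len(y) - 24 * 7                                  # hold out the last week
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("MAE on held-out week: %.2f" % np.abs(pred - y[split:]).mean())
```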

  9. Spherical harmonics based intrasubject 3-D kidney modeling/registration technique applied on partial information

    Science.gov (United States)

    Dillenseger, Jean-Louis; Guillaume, Hélène; Patard, Jean-Jacques

    2006-01-01

    This paper presents a 3D shape reconstruction/intra-patient rigid registration technique used to support preoperative planning for nephron-sparing surgery. The usual preoperative imaging modality is spiral CT urography, which provides successive 3D acquisitions of complementary information on kidney anatomy. Because the kidney is difficult to demarcate from the liver or from the spleen, only limited information on its volume or surface is available. In our paper we propose a methodology allowing a global spatial representation of the kidney on a spherical harmonics basis. The spherical harmonics are exploited to recover the kidney's 3D shape and also to perform intra-patient 3D rigid registration. An evaluation performed on synthetic data showed that this technique presented lower performance than expected for 3D shape recovery, but exhibited registration results slightly more accurate than the ICP technique, with faster computation time. PMID:17073323
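
    Since the registration results are compared against the ICP technique, a compact reminder of what a vanilla rigid ICP loop does may help. The sketch below is a generic textbook variant, not the authors' code: it alternates nearest-neighbour matching with a closed-form SVD alignment step.

```python
# Sketch: rigid iterative closest point (ICP) with SVD-based alignment.
import numpy as np
from scipy.spatial import cKDTree

def icp(src: np.ndarray, dst: np.ndarray, iters: int = 30):
    """Align point cloud src (Nx3) to dst (Mx3); returns rotation R and shift t."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(dst)
    cur = src.copy()
    for _ in range(iters):
        _, idx = tree.query(cur)                 # nearest neighbours in dst
        matched = dst[idx]
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_d)    # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:            # avoid reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_d - R_step @ mu_s
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t

rng = np.random.default_rng(0)
cloud = rng.standard_normal((200, 3))
angle = 0.3
Rot = np.array([[np.cos(angle), -np.sin(angle), 0],
                [np.sin(angle),  np.cos(angle), 0],
                [0, 0, 1]])
moved = cloud @ Rot.T + np.array([0.5, -0.2, 0.1])
R, t = icp(cloud, moved)
print("rotation error: %.2e" % np.linalg.norm(R - Rot))
```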

  10. Reformulation linearization technique based branch-and-reduce approach applied to regional water supply system planning

    Science.gov (United States)

    Lan, Fujun; Bayraksan, Güzin; Lansey, Kevin

    2016-03-01

    A regional water supply system design problem that determines pipe and pump design parameters and water flows over a multi-year planning horizon is considered. A non-convex nonlinear model is formulated and solved by a branch-and-reduce global optimization approach. The lower bounding problem is constructed via a three-pronged effort that involves transforming the space of certain decision variables, polyhedral outer approximations, and the Reformulation Linearization Technique (RLT). Range reduction techniques are employed systematically to speed up convergence. Computational results demonstrate the efficiency of the proposed algorithm; in particular, the critical role range reduction techniques could play in RLT based branch-and-bound methods. Results also indicate using reclaimed water not only saves freshwater sources but is also a cost-effective non-potable water source in arid regions. Supplemental data for this article can be accessed at http://dx.doi.org/10.1080/0305215X.2015.1016508.
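
    For a bilinear term w = x*y over box bounds, the first-level RLT constraints coincide with the familiar McCormick envelopes, obtained by multiplying pairs of bound constraints such as (x - xL)(y - yL) >= 0 and replacing each product x*y by w. The sketch below is a generic numerical check of those envelopes, not the paper's water-network model:

```python
# Sketch: McCormick (first-level RLT) relaxation of w = x * y on a box.
import numpy as np

xL, xU, yL, yU = 0.0, 2.0, 1.0, 3.0

def mccormick_bounds(x: float, y: float):
    """Lower/upper envelope values for w = x*y at a point (x, y) in the box."""
    lo = max(xL * y + x * yL - xL * yL,      # from (x - xL)(y - yL) >= 0
             xU * y + x * yU - xU * yU)      # from (xU - x)(yU - y) >= 0
    hi = min(xU * y + x * yL - xU * yL,      # from (xU - x)(y - yL) >= 0
             xL * y + x * yU - xL * yU)      # from (x - xL)(yU - y) >= 0
    return lo, hi

# Verify the envelopes bracket the true product everywhere on the box.
rng = np.random.default_rng(0)
pts = rng.uniform([xL, yL], [xU, yU], size=(10000, 2))
ok = all(lo - 1e-9 <= x * y <= hi + 1e-9
         for x, y in pts
         for lo, hi in [mccormick_bounds(x, y)])
print("envelopes valid on all samples:", ok)
```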

  11. Effect of the reinforcement bar arrangement on the efficiency of electrochemical chloride removal technique applied to reinforced concrete structures

    International Nuclear Information System (INIS)

    Garces, P.; Sanchez de Rojas, M.J.; Climent, M.A.

    2006-01-01

    This paper reports on research done to determine the effect that different bar arrangements may have on the efficiency of the electrochemical chloride removal (ECR) technique when applied to a reinforced concrete structural member. Five different types of bar arrangement were considered, corresponding to typical structural members such as columns (with single and double bar reinforcement), slabs, beams and footings. ECR was applied in several steps. We observe that the extraction efficiency depends on the reinforcing bar arrangement: a uniform layer set-up favours chloride extraction. Electrochemical techniques were also used to estimate the corrosion state of the reinforcing bars, measuring the corrosion potential and the instantaneous corrosion rate based on the polarization resistance technique. After ECR treatment, a reduction in the corrosion levels is observed, falling short of the depassivation threshold

  12. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, predicting grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with spectral technology. Remote sensing has been applied to monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be extracted from remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can monitor grasshoppers more exactly, has advantages in measuring the degree of damage and classifying areas damaged by grasshoppers, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyperspectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near infrared reflectance spectroscopy has been employed to judge grasshopper species, examine species occurrence and monitor hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. According to this paper, spectral analysis techniques can be used as a quick and exact tool for monitoring and forecasting grasshopper infestations, and will become an important means for such research thanks to their advantages in spatial orientation and in information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
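
    Since NDVI is the workhorse index mentioned above, a short reminder of its computation may be useful. The band arrays below are synthetic stand-ins for red and near-infrared reflectance rasters:

```python
# Sketch: NDVI = (NIR - Red) / (NIR + Red) over synthetic reflectance rasters.
import numpy as np

rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.3, size=(100, 100))    # red-band reflectance
nir = rng.uniform(0.2, 0.6, size=(100, 100))     # near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-12)         # epsilon guards against 0/0
print("mean NDVI: %.2f" % ndvi.mean())           # dense vegetation tends toward 1
```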

  13. 48 CFR 215.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for...

  14. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... ensure a fair and reasonable price. Examples of such techniques include, but are not limited to, the... to the cost or price analysis of the service or product being proposed should also be included in the... techniques. (a) General. The objective of proposal analysis is to ensure that the final agreed-to price is...

  15. Digital filtering techniques applied to electric power systems protection; Tecnicas de filtragem digital aplicadas a protecao de sistemas eletricos de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Brito, Helio Glauco Ferreira

    1996-12-31

    This work introduces an analysis and comparative study of some of the techniques for digital filtering of the voltage and current waveforms from faulted transmission lines. This study is of fundamental importance for the development of algorithms applied to the digital protection of electric power systems. The techniques studied are based on discrete Fourier transform theory, the Walsh functions and Kalman filter theory. Two aspects were emphasized in this study: firstly, non-recursive techniques were analysed, with the implementation of filters based on Fourier theory and the Walsh functions; secondly, recursive techniques were analysed, with the implementation of filters based on Kalman theory and, once more, on Fourier theory. (author) 56 refs., 25 figs., 16 tabs.
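
    As a hedged illustration of the non-recursive Fourier branch of such algorithms, the sketch below implements the standard full-cycle DFT phasor estimator used in digital relaying: with N samples per fundamental cycle, the fundamental phasor over a one-cycle window is (2/N) * sum over n of x[n] * exp(-j*2*pi*n/N). Signal parameters are illustrative:

```python
# Sketch: full-cycle DFT phasor estimation on a relay-style sampled waveform.
import numpy as np

f0, fs = 60.0, 1920.0                 # fundamental and sampling rate (32 samples/cycle)
N = int(fs / f0)                      # samples per cycle
t = np.arange(0, 0.1, 1 / fs)         # 100 ms record
noise = 5 * np.random.default_rng(0).standard_normal(t.size)
x = 100 * np.cos(2 * np.pi * f0 * t - np.pi / 6) + noise

n = np.arange(N)
kernel = (2.0 / N) * np.exp(-1j * 2 * np.pi * n / N)

def phasor(window: np.ndarray) -> complex:
    """Fundamental-frequency phasor of one cycle of samples."""
    return np.dot(kernel, window)

p = phasor(x[:N])                     # expect magnitude ~100, angle ~ -30 degrees
print("magnitude %.1f, angle %.1f deg" % (np.abs(p), np.degrees(np.angle(p))))
```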

  16. Applied predictive analytics principles and techniques for the professional data analyst

    CERN Document Server

    Abbott, Dean

    2014-01-01

    Learn the art and science of predictive analytics - techniques that get results. Predictive analytics is what translates big data into meaningful, usable business information. Written by a leading expert in the field, this guide examines the science of the underlying algorithms as well as the principles and best practices that govern the art of predictive analytics. It clearly explains the theory behind predictive analytics, teaches the methods, principles, and techniques for conducting predictive analytics projects, and offers tips and tricks that are essential for successful predictive mode

  17. Applied techniques for high bandwidth data transfers across wide area networks

    International Nuclear Information System (INIS)

    Lee, J.; Gunter, D.; Tierney, B.; Allcock, B.; Bester, J.; Bresnahan, J.; Tuecke, S.

    2001-01-01

    Large distributed systems such as Computational/Data Grids require large amounts of data to be co-located with the computing facilities for processing. From their work developing a scalable distributed network cache, the authors have gained experience with techniques necessary to achieve high data throughput over high bandwidth Wide Area Networks (WAN). The authors discuss several hardware and software design techniques, and then describe their application to an implementation of an enhanced FTP protocol called GridFTP. The authors describe results from the Supercomputing 2000 conference

  18. Monitoring gypsy moth defoliation by applying change detection techniques to Landsat imagery

    Science.gov (United States)

    Williams, D. L.; Stauffer, M. L.

    1978-01-01

    The overall objective of a research effort at NASA's Goddard Space Flight Center is to develop and evaluate digital image processing techniques that will facilitate the assessment of the intensity and spatial distribution of forest insect damage in Northeastern U.S. forests using remotely sensed data from Landsats 1, 2 and C. Automated change detection techniques are presently being investigated as a method of isolating the areas of change in the forest canopy resulting from pest outbreaks. In order to follow the change detection approach, Landsat scene correction and overlay capabilities are utilized to provide multispectral/multitemporal image files of 'defoliation' and 'nondefoliation' forest stand conditions.

  19. People Recognition for Loja ECU911 applying artificial vision techniques

    Directory of Open Access Journals (Sweden)

    Diego Cale

    2016-05-01

    Full Text Available This article presents a technological proposal based on artificial vision, which aims to search for people intelligently by using IP video cameras. The current manual search process demands time and resources that an automated process could save. In order to obtain optimal results, three different artificial vision techniques were analyzed (Eigenfaces, Fisherfaces, Local Binary Pattern Histograms). The selection process considered factors such as lighting changes, image quality and changes in the camera's angle of focus. In addition, a literature review was conducted to evaluate several points of view regarding artificial vision techniques.
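
    Of the three techniques compared, Local Binary Pattern Histograms is the one most readily available off the shelf. The sketch below shows the OpenCV contrib interface for it, trained on synthetic grayscale stand-ins for face crops; it assumes the opencv-contrib-python package, which provides the cv2.face module:

```python
# Sketch: LBPH face recognition via OpenCV's contrib module.
# Requires: pip install opencv-contrib-python numpy
import numpy as np
import cv2

rng = np.random.default_rng(0)
# Synthetic stand-ins for aligned 100x100 grayscale face crops, two "people".
person0 = [np.clip(rng.normal(100, 20, (100, 100)), 0, 255).astype(np.uint8)
           for _ in range(5)]
person1 = [np.clip(rng.normal(160, 20, (100, 100)), 0, 255).astype(np.uint8)
           for _ in range(5)]

images = person0 + person1
labels = np.array([0] * 5 + [1] * 5, dtype=np.int32)

recognizer = cv2.face.LBPHFaceRecognizer_create()   # from opencv-contrib
recognizer.train(images, labels)

label, confidence = recognizer.predict(person1[0])  # lower confidence = closer match
print("predicted label:", label, "confidence: %.1f" % confidence)
```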

  20. UP1, an example of advanced techniques applied to high-level activity dismantling

    International Nuclear Information System (INIS)

    Michel-Noel, M.; Calixte, O.; Blanchard, S.; Bani, J.; Girones, P.; Moitrier, C.; Terry, G.; Bourdy, R.

    2014-01-01

    The UP1 plant on the CEA Marcoule site was dedicated to the processing of spent fuels from the G1, G2 and G3 plutonium-producing reactors. This plant represents 20,000 m2 of workshops housing about 1000 hot cells. In 1998, a huge program for the dismantling and clean-up of the UP1 plant was launched. CEA has developed new techniques to face the complexity of the dismantling operations. These techniques include immersive virtual reality, laser cutting, a specific manipulator arm called MAESTRO, and remote handling. (A.C.)

  1. Assessment of maceration techniques used to remove soft tissue from bone in cut mark analysis.

    Science.gov (United States)

    King, Christine; Birch, Wendy

    2015-01-01

    Maceration techniques employed in forensics must be effective without compromising the bone's integrity and morphology, and prevent destruction of evidence. Techniques must also be fast, safe, easily obtainable and inexpensive; not all techniques currently employed are appropriate for forensic use. To evaluate the most suitable approach, seven techniques including current and new methodologies were applied to fresh, fleshed porcine ribs exhibiting cut marks. A sample size of 30 specimens per technique was examined under scanning electron microscopy at the cut mark and the surrounding uncompromised regions; a scoring system of effectiveness was applied. The previously unpublished microwave method fared best for bone and cut mark preservation. Sodium hypochlorite destroyed cut marks, and was deemed unsuitable for forensic analysis. No single technique fulfilled all criteria; however, this study provides a benchmark for forensic anthropologists to select the most appropriate method for their situation, while maintaining the high standards required by forensic science. © 2015 American Academy of Forensic Sciences.

  2. 3D-QSPR Method of Computational Technique Applied on Red Reactive Dyes by Using CoMFA Strategy

    Directory of Open Access Journals (Sweden)

    Shahnaz Perveen

    2011-12-01

    Full Text Available Cellulose fiber is a tremendous natural resource that has broad application in various productions, including the textile industry. The dyes commonly used for cellulose printing are “reactive dyes”, because of their high wet fastness and brilliant colors. The interaction of various dyes with the cellulose fiber depends upon the physiochemical properties that are governed by specific features of the dye molecule. The binding pattern of the reactive dye with cellulose fiber is called the ligand-receptor concept. In the current study, the three-dimensional quantitative structure property relationship (3D-QSPR) technique was applied to understand the interactions of red reactive dyes with cellulose by the Comparative Molecular Field Analysis (CoMFA) method. This method was successfully utilized to predict a reliable model. The predicted model gives satisfactory statistical results, and in the light of these it was further analyzed. Additionally, the graphical outcomes (contour maps) help us to understand the modification pattern and to correlate the structural changes with respect to the absorptivity. Furthermore, the final selected model has the potential to assist in understanding the characteristics of the external test set. The study could be helpful to design new reactive dyes with better affinity and selectivity for the cellulose fiber.

  3. 3D-QSPR method of computational technique applied on red reactive dyes by using CoMFA strategy.

    Science.gov (United States)

    Mahmood, Uzma; Rashid, Sitara; Ali, S Ishrat; Parveen, Rasheeda; Zaheer-Ul-Haq; Ambreen, Nida; Khan, Khalid Mohammed; Perveen, Shahnaz; Voelter, Wolfgang

    2011-01-01

    Cellulose fiber is a tremendous natural resource that has broad application in various productions including the textile industry. The dyes, which are commonly used for cellulose printing, are "reactive dyes" because of their high wet fastness and brilliant colors. The interaction of various dyes with the cellulose fiber depends upon the physiochemical properties that are governed by specific features of the dye molecule. The binding pattern of the reactive dye with cellulose fiber is called the ligand-receptor concept. In the current study, the three dimensional quantitative structure property relationship (3D-QSPR) technique was applied to understand the red reactive dyes interactions with the cellulose by the Comparative Molecular Field Analysis (CoMFA) method. This method was successfully utilized to predict a reliable model. The predicted model gives satisfactory statistical results and in the light of these, it was further analyzed. Additionally, the graphical outcomes (contour maps) help us to understand the modification pattern and to correlate the structural changes with respect to the absorptivity. Furthermore, the final selected model has potential to assist in understanding the characteristics of the external test set. The study could be helpful to design new reactive dyes with better affinity and selectivity for the cellulose fiber.

  4. 3D-QSPR Method of Computational Technique Applied on Red Reactive Dyes by Using CoMFA Strategy

    Science.gov (United States)

    Mahmood, Uzma; Rashid, Sitara; Ali, S. Ishrat; Parveen, Rasheeda; Zaheer-ul-Haq; Ambreen, Nida; Khan, Khalid Mohammed; Perveen, Shahnaz; Voelter, Wolfgang

    2011-01-01

    Cellulose fiber is a tremendous natural resource that has broad application in various productions including the textile industry. The dyes, which are commonly used for cellulose printing, are “reactive dyes” because of their high wet fastness and brilliant colors. The interaction of various dyes with the cellulose fiber depends upon the physiochemical properties that are governed by specific features of the dye molecule. The binding pattern of the reactive dye with cellulose fiber is called the ligand-receptor concept. In the current study, the three dimensional quantitative structure property relationship (3D-QSPR) technique was applied to understand the red reactive dyes interactions with the cellulose by the Comparative Molecular Field Analysis (CoMFA) method. This method was successfully utilized to predict a reliable model. The predicted model gives satisfactory statistical results and in the light of these, it was further analyzed. Additionally, the graphical outcomes (contour maps) help us to understand the modification pattern and to correlate the structural changes with respect to the absorptivity. Furthermore, the final selected model has potential to assist in understanding the characteristics of the external test set. The study could be helpful to design new reactive dyes with better affinity and selectivity for the cellulose fiber. PMID:22272108
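
    CoMFA models are conventionally fitted with partial least squares (PLS) regression on grids of steric and electrostatic field values. The sketch below shows only that generic fitting step, on synthetic stand-in field descriptors and activities; nothing here reproduces the paper's dye data set:

```python
# Sketch: PLS regression, the fitting step behind CoMFA-style 3D-QSPR models.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_dyes, n_grid = 40, 500                 # 40 molecules, 500 field probe points
fields = rng.standard_normal((n_dyes, n_grid))   # steric/electrostatic stand-ins
true_w = np.zeros(n_grid)
true_w[:10] = 1.0                                # a few informative grid points
activity = fields @ true_w + 0.5 * rng.standard_normal(n_dyes)

pls = PLSRegression(n_components=5)
q2 = cross_val_score(pls, fields, activity, cv=5, scoring="r2")
print("cross-validated r2 per fold:", np.round(q2, 2))

pls.fit(fields, activity)   # coefficients map back to grid points (contour maps)
```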

  5. UQ and V&V techniques applied to experiments and simulations of heated pipes pressurized to failure

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dempsey, J. Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Antoun, Bonnie R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.

  6. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    Science.gov (United States)

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  7. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray fluorescence (XRF) techniques. These cigarettes were found to contain the elements Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements, with concentrations of more than 1% by weight, were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements, with concentrations below 0.1% by weight, were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  8. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications; positioning, navigation and timing; missile warning; and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry should be valuable for achieving Space Situational Awareness (SSA), these new consumers of satellite telemetry data will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure that processes satellite telemetry into higher-level symbolic space situational awareness, and to populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of which control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design that has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach combining the symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current machine learning approaches. BTNs are used to represent the process and associated formulas for checking telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances
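
    To make the BTN idea concrete, here is a deliberately small, hypothetical sketch (state names, telemetry fields and thresholds are invented, not from this work) of a behavior-transition-style check that walks telemetry samples through states and flags an anticipated problem when a condition over telemetry values trips:

```python
# Sketch: a tiny behavior-transition-network-style telemetry check.
# States, thresholds and telemetry fields are hypothetical illustrations.
from typing import Callable, Dict, List, Tuple

# Each state maps to a list of (condition, next_state) transition rules.
BTN: Dict[str, List[Tuple[Callable[[dict], bool], str]]] = {
    "NOMINAL":   [(lambda t: t["battery_v"] < 11.5, "LOW_POWER")],
    "LOW_POWER": [(lambda t: t["battery_v"] >= 12.0, "NOMINAL"),
                  (lambda t: t["temp_c"] > 45.0, "ALERT")],
    "ALERT":     [],                   # terminal: hand off to an operator
}

def run_btn(samples: List[dict], state: str = "NOMINAL") -> str:
    for t in samples:
        for cond, nxt in BTN[state]:
            if cond(t):
                print(f"{state} -> {nxt} at {t}")
                state = nxt
                break
    return state

telemetry = [
    {"battery_v": 12.4, "temp_c": 20.0},
    {"battery_v": 11.2, "temp_c": 30.0},   # voltage sag triggers LOW_POWER
    {"battery_v": 11.3, "temp_c": 47.0},   # overheat while low -> ALERT
]
print("final state:", run_btn(telemetry))
```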

  9. A Technical Review of Electrochemical Techniques Applied to Microbiologically Influenced Corrosion

    Science.gov (United States)

    1991-01-01

    in the literature for the study of MIC phenomena. Videla [65] has used this technique in a study of the action of Cladosporium resinae growth on the ... ROSALES, Corrosion 44, 638 (1988). 65. H. A. Videla, The action of Cladosporium resinae growth on the electrochemical behavior of aluminum. Proc. Int. Conf.

  10. Reduced order modelling techniques for mesh movement strategies as applied to fluid structure interactions

    CSIR Research Space (South Africa)

    Bogaers, Alfred EJ

    2010-01-01

    Full Text Available In this paper, we implement the method of Proper Orthogonal Decomposition (POD) to generate a reduced order model (ROM) of an optimization-based mesh movement technique. In the study it is shown that POD can be used effectively to generate a ROM...
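
    For readers new to POD, the snapshot form of the method is essentially a truncated SVD: collect solution snapshots as columns of a matrix, take the leading left singular vectors as the reduced basis, and project. The sketch below uses synthetic snapshots, not the mesh-movement data of the paper:

```python
# Sketch: proper orthogonal decomposition (POD) of a snapshot matrix via SVD.
import numpy as np

rng = np.random.default_rng(0)
n_dof, n_snap = 2000, 50
# Synthetic snapshots with low-dimensional structure plus noise.
modes = rng.standard_normal((n_dof, 3))
coeffs = rng.standard_normal((3, n_snap))
snapshots = modes @ coeffs + 0.01 * rng.standard_normal((n_dof, n_snap))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1      # modes for 99.9% of the energy
basis = U[:, :r]                                 # the POD / ROM basis

reduced = basis.T @ snapshots                    # r x n_snap reduced coordinates
recon = basis @ reduced
err = np.linalg.norm(snapshots - recon) / np.linalg.norm(snapshots)
print(f"kept {r} modes, relative reconstruction error {err:.2e}")
```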

  11. MSC/NASTRAN "expert" techniques developed and applied to the TFTR poloidal field coils

    International Nuclear Information System (INIS)

    O'Toole, J.A.

    1986-01-01

    The TFTR poloidal field (PF) coils are being analyzed by PPPL and Grumman using MSC/NASTRAN as part of an overall effort to establish the absolute limiting conditions of operation for TFTR. Each of the PF coils will be analyzed in depth, using a detailed set of finite element models. Several of the models developed are quite large, because each copper turn, as well as its surrounding insulation, was modeled using solid elements. Several of the finite element models proved large enough to tax the capabilities of the National Magnetic Fusion Energy Computer Center (NMFECC), specifically its disk storage space. To allow the use of substructuring techniques, with their associated databases, for the larger models, it became necessary to employ certain infrequently used MSC/NASTRAN "expert" techniques. The techniques developed used multiple databases and database sets to divide each problem into a series of computer runs. For each run, only the required data were kept on active disk space, the remainder being placed in inactive "FILEM" storage, thus minimizing the active disk space required at any time and permitting problem solution using the NMFECC. A representative problem using the TFTR OH-1 coil global model provides an example of the techniques developed. The special considerations necessary to obtain proper results are discussed

  12. Urban field guide: applying social forestry observation techniques to the east coast megalopolis

    Science.gov (United States)

    E. Svendsen; V. Marshall; M.F. Ufer

    2006-01-01

    A changing economy and different lifestyles have altered the meaning of the forest in the northeastern United States, prompting scientists to reconsider the spatial form, stewardship and function of the urban forest. The authors describe how social observation techniques and the employment of a novel, locally based, participatory hand-held monitoring system could aid...

  13. Practising What We Teach: Vocational Teachers Learn to Research through Applying Action Learning Techniques

    Science.gov (United States)

    Lasky, Barbara; Tempone, Irene

    2004-01-01

    Action learning techniques are well suited to the teaching of organisation behaviour students because of their flexibility, inclusiveness, openness, and respect for individuals. They are no less useful as a tool for change for vocational teachers, learning, of necessity, to become researchers. Whereas traditional universities have always had a…

  14. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  15. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    features in the speech process: (i) the resonant structure of the vocal-tract transfer function, i.e., formant analysis; (ii) the glottal wave; (iii) the fundamental frequency or pitch of the sound. During the production of speech, the configuration of the articulators (the vocal tract, tongue, teeth, lips, etc.) changes from one sound to.

  16. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    NARCIS (Netherlands)

    Noordman, J.; Lee, I. van der; Nielen, M.; Vlek, H.; Weijden, T. van der; Dulmen, S. van

    2012-01-01

    Background: Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice

  17. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    NARCIS (Netherlands)

    Noordman, J.; van Lee, I.; Nielen, M.; Vlek, H.; van Weijden, T.; Dulmen, A.M. van

    2012-01-01

    BACKGROUND: Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice

  18. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Science.gov (United States)

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  19. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Directory of Open Access Journals (Sweden)

    Ted von Hippel

    Full Text Available We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.
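
    The attrition claim can be made concrete with a small probability calculation. Under the simplifying assumptions of one proposal per year and independent outcomes with success probability p, the chance of remaining unfunded after n years is (1 - p)^n; at p = 20%, roughly half of investigators would still be unfunded after three years, consistent with the authors' point:

```python
# Sketch: years of consecutive rejection under an independent success rate p.
# Simplified model: one proposal per year, independent outcomes.
for p in (0.10, 0.20, 0.30):
    for n in (3, 5, 7):
        still_unfunded = (1 - p) ** n
        print(f"p={p:.0%}, {n} years: P(no grant yet) = {still_unfunded:.0%}")
```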

  20. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  1. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  2. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Full Text Available Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  3. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  4. Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.

    Science.gov (United States)

    Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic

    2017-05-15

    Recent developments in applied mathematics are bringing new tools that are capable of synthesizing knowledge across disciplines and of finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques, such as principal component analysis (PCA), with a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sport teams to cancer treatments. Here, for the first time, this technique has been applied in soil science, to show the interaction between physical and chemical soil attributes and the main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database consisting of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters, including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes. Copyright © 2017 Elsevier B.V. All rights reserved.
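
    To make the method concrete, here is a minimal Mapper-style TDA sketch in Python (an illustration of the workflow, not the authors' LUCAS analysis); the PCA filter, the cover parameters, the choice of KMeans for within-interval clustering, and the synthetic 12-parameter data are all assumptions.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    def mapper_graph(X, n_intervals=8, overlap=0.25):
        """Simple Mapper graph: PCA filter, overlapping cover, local clustering."""
        f = PCA(n_components=1).fit_transform(X).ravel()   # filter (lens) values
        lo, hi = f.min(), f.max()
        width = (hi - lo) / n_intervals
        nodes = []
        for i in range(n_intervals):
            a = lo + (i - overlap) * width
            b = lo + (i + 1 + overlap) * width
            idx = np.where((f >= a) & (f <= b))[0]
            if len(idx) < 4:                               # too few points to cluster
                continue
            labels = KMeans(n_clusters=2, n_init=10).fit_predict(X[idx])
            nodes += [set(idx[labels == k]) for k in (0, 1)]
        # connect clusters that share samples (the overlap creates the edges)
        edges = {(i, j) for i in range(len(nodes))
                 for j in range(i + 1, len(nodes)) if nodes[i] & nodes[j]}
        return nodes, edges

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 12))     # stand-in for 12 physicochemical parameters
    nodes, edges = mapper_graph(X)
    print(len(nodes), "nodes,", len(edges), "edges")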

  5. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state of the art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers.

  6. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...
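
    As a flavour of the approach, the sketch below (an assumption-laden toy, not taken from the book) trains a small ANN to flag insecure operating states from synthetic per-unit bus loadings; the "voltage-stability margin" rule generating the labels is invented for illustration.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    loads = rng.uniform(0.5, 1.5, size=(1000, 4))   # per-unit bus loadings
    margin = 1.6 - loads.sum(axis=1) / 4            # invented stability margin
    y = (margin < 0.4).astype(int)                  # 1 = insecure operating state

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(loads, y)
    print("training accuracy:", clf.score(loads, y))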

  7. Parameters and definitions in applied technique quality test for nuclear magnetic resonance imaging system (NMRI)

    International Nuclear Information System (INIS)

    Lin Zhikai; Zhao Lancai

    1999-08-01

    During the past two decades, medical diagnostic imaging techniques such as CT, MRI, PET and DSA have achieved dramatic development. The most striking examples are the application of X-ray computerized tomography (CT) and magnetic resonance imaging in the field of medical diagnosis. Looking forward to the development of diagnostic imaging techniques in the 21st century, it can be predicted that magnetic resonance imaging (MRI) will have ever more widespread applications and play a more and more important role in clinical diagnosis. The authors also present the measuring methods for some parameters. The parameters described can be used for reference by clinical diagnosticians, MRI operators and medical physicists who engage in image quality assurance (QA) and quality control (QC) when performing MRI acceptance tests and routine tests.

  8. Magnetic resonance techniques applied to the diagnosis and treatment of Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Benito eDe Celis Alonso

    2015-07-01

    Full Text Available Parkinson’s disease affects at least 10 million people worldwide. It is a neurodegenerative disease which is currently diagnosed by neurological examination. No neuroimaging investigation or blood biomarker is available to aid diagnosis and prognosis. Most effort toward diagnosis using magnetic resonance has been focused on the use of structural/anatomical neuroimaging and diffusion tensor imaging. However, deep brain stimulation, a current strategy for treating Parkinson’s disease, is guided by magnetic resonance imaging. For clinical prognosis, diagnosis and follow-up investigations, blood oxygen level–dependent magnetic resonance imaging, diffusion tensor imaging, spectroscopy and transcranial magnetic stimulation have been used. These techniques represent the state of the art in the last five years. Here, we focus on magnetic resonance techniques for the diagnosis and treatment of Parkinson’s disease.

  9. The photoluminescence technique applied to the investigation of structural imperfections in quantum wells of semiconducting material

    Directory of Open Access Journals (Sweden)

    Eliermes Arraes Meneses

    2005-02-01

    Full Text Available Photoluminescence is one of the most widely used spectroscopy techniques for the study of the optical properties of semiconducting materials and heterostructures. In this work the potential of this technique is explored through the investigation and characterization of structural imperfections originating from fluctuations in the chemical composition of ternary and quaternary alloys, from interface roughness, and from unintentional compounds formed by chemical-element intermixing at the interfaces. Samples of GaAs/AlGaAs, GaAsSb/GaAs, GaAsSbN/GaAs and GaAs/GaInP quantum well structures are analyzed to verify the influence of the structural imperfections on the PL spectra.

  10. Artificial intelligence techniques applied to hourly global irradiance estimation from satellite-derived cloud index

    Energy Technology Data Exchange (ETDEWEB)

    Zarzalejo, L.F.; Ramirez, L.; Polo, J. [DER-CIEMAT, Madrid (Spain). Renewable Energy Dept.

    2005-07-01

    Artificial intelligence techniques, such as fuzzy logic and neural networks, have been used for estimating hourly global radiation from satellite images. The models have been fitted to measured global irradiance data from 15 Spanish terrestrial stations. Both satellite imaging data and terrestrial information from the years 1994, 1995 and 1996 were used. The results of these artificial intelligence models were compared to a multivariate regression based upon the Heliosat I model. Generally better behaviour was observed for the artificial intelligence models. (author)
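
    A hedged sketch of the general idea (not the authors' fitted model): learn a mapping from satellite cloud index and solar geometry to hourly global irradiance with a small neural network and compare it against a linear regression. The toy attenuation law and the noise level are assumptions, not the Heliosat relation.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n = rng.uniform(0, 1, 2000)                  # cloud index from satellite image
    cos_zen = rng.uniform(0.1, 1.0, 2000)        # cosine of solar zenith angle
    ghi = 1000 * cos_zen * (1 - 0.75 * n**3.4)   # toy cloud attenuation law
    ghi += rng.normal(0, 25, 2000)               # measurement noise
    X = np.column_stack([n, cos_zen])

    ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000).fit(X, ghi)
    lin = LinearRegression().fit(X, ghi)
    print("ANN R^2:", ann.score(X, ghi), " linear R^2:", lin.score(X, ghi))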

  11. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically utilized sub-mm bead immersion techniques extensively, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
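
    The porosity calculation itself is simple once both densities are known; a minimal sketch with illustrative numbers, not actual Apollo data:

    def porosity(bulk_density: float, grain_density: float) -> float:
        """Fractional porosity = 1 - (bulk density / grain density)."""
        return 1.0 - bulk_density / grain_density

    # Illustrative values only (g/cm^3), not a measured sample
    print(f"porosity = {porosity(2.7, 3.3):.1%}")   # ~18%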

  12. Artificial intelligence techniques applied to hourly global irradiance estimation from satellite-derived cloud index

    International Nuclear Information System (INIS)

    Zarzalejo, Luis F.; Ramirez, Lourdes; Polo, Jesus

    2005-01-01

    Artificial intelligence techniques, such as fuzzy logic and neural networks, have been used for estimating hourly global radiation from satellite images. The models have been fitted to measured global irradiance data from 15 Spanish terrestrial stations. Both satellite imaging data and terrestrial information from the years 1994, 1995 and 1996 were used. The results of these artificial intelligence models were compared to a multivariate regression based upon the Heliosat I model. Generally better behaviour was observed for the artificial intelligence models.

  13. The Ecological Profiles Technique applied to data from Lichtenburg, South Africa

    Directory of Open Access Journals (Sweden)

    J. W. Morris

    1974-12-01

    Full Text Available The method of ecological profiles and information shared between species and ecological variables, developed in France, is described for the first time in English. Preliminary results, using the technique on Bankenveld quadrat data from Lichtenburg, Western Transvaal, are given. It is concluded that the method has great potential value for the understanding of the autecology of South African species provided that the sampling method is appropriate.

  14. Improving throughput and user experience for information intensive websites by applying HTTP compression technique.

    Science.gov (United States)

    Malla, Ratnakar

    2008-11-06

    HTTP compression is a technique specified as part of the W3C HTTP 1.0 standard. It allows HTTP servers to take advantage of GZIP compression technology that is built into the latest browsers. A brief survey of medical informatics websites shows that compression is not enabled. With compression enabled, downloaded file sizes are reduced by more than 50% and typical transaction time is also reduced from 20 to 8 minutes, thus providing a better user experience.
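
    The mechanics are easy to reproduce; the sketch below gzip-compresses a repetitive HTML payload, as an HTTP server would for a client sending Accept-Encoding: gzip (the payload and the resulting sizes are illustrative, not measurements from the surveyed sites).

    import gzip

    html = ("<html><body>" + "<p>Patient record summary row</p>" * 500 +
            "</body></html>").encode("utf-8")
    compressed = gzip.compress(html)            # body the server would send instead
    print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
          f"({1 - len(compressed) / len(html):.0%} smaller)")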

  15. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper.

    Science.gov (United States)

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-06-10

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of a borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and the remote reflecting interface detection capability of the system. To reduce the blind zone length and detect the near reflecting interface, a full bridge acoustic emission technique based on a bootstrap gate driver (BGD) and metal-oxide-semiconductor field-effect transistors (MOSFETs) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and the propagation distance of ultrasonic waves are determined, and a signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time, is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved.
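
    A minimal sketch of the time-varying-gain idea (illustrative constants, not the instrument's circuit): amplify later echoes more, so reflectors at different distances produce comparable amplitudes and the receiver's dynamic range requirement shrinks.

    import numpy as np

    fs = 1_000_000                                   # sample rate, Hz
    c = 1500.0                                       # assumed sound speed in mud, m/s
    alpha = 4.0                                      # illustrative attenuation, 1/m
    t = np.arange(0, 0.003, 1 / fs)
    echo = np.zeros_like(t)
    for dist in (0.4, 1.2):                          # two reflecting interfaces
        t0 = 2 * dist / c                            # round-trip arrival time
        mask = (t >= t0) & (t < t0 + 5e-5)
        echo[mask] = np.exp(-alpha * dist) * np.sin(2e5 * 2 * np.pi * (t[mask] - t0))

    gain = np.exp(alpha * c * t / 2)                 # gain tracks round-trip range
    compensated = echo * gain
    for dist in (0.4, 1.2):
        w = (t >= 2 * dist / c) & (t < 2 * dist / c + 5e-5)
        print(f"{dist} m: raw peak {abs(echo[w]).max():.3f} -> "
              f"compensated {abs(compensated[w]).max():.3f}")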

  16. An efficient permeability scaling-up technique applied to the discretized flow equations

    Energy Technology Data Exchange (ETDEWEB)

    Urgelli, D.; Ding, Yu [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    Grid-block permeability scaling-up for numerical reservoir simulations has been discussed for a long time in the literature. It is now recognized that a full permeability tensor is needed to get an accurate reservoir description at large scale. However, two major difficulties are encountered: (1) grid-block permeability cannot be properly defined because it depends on boundary conditions; (2) discretization of flow equations with a full permeability tensor is not straightforward and little work has been done on this subject. In this paper, we propose a new method, which allows us to get around both difficulties. As the two major problems are closely related, a global approach will preserve the accuracy. So, in the proposed method, the permeability up-scaling technique is integrated in the discretized numerical scheme for flow simulation. The permeability is scaled-up via the transmissibility term, in accordance with the fluid flow calculation in the numerical scheme. A finite-volume scheme is particularly studied, and the transmissibility scaling-up technique for this scheme is presented. Some numerical examples are tested for flow simulation. This new method is compared with some published numerical schemes for full permeability tensor discretization where the full permeability tensor is scaled-up through various techniques. Comparing the results with fine grid simulations shows that the new method is more accurate and more efficient.
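
    The underlying finite-volume idea can be sketched briefly (a simplified scalar illustration, not the authors' full-tensor scheme): the interface transmissibility between two grid blocks uses a harmonic average of their permeabilities, and for series flow the coarse-block effective permeability is the harmonic mean of the fine cells.

    import numpy as np

    def interface_transmissibility(k1, k2, dx, area):
        """Two-point flux transmissibility with harmonic permeability averaging."""
        k_h = 2.0 * k1 * k2 / (k1 + k2)
        return k_h * area / dx

    fine_k = np.array([100.0, 10.0, 50.0, 5.0])    # fine-cell permeabilities, mD
    coarse_k = len(fine_k) / np.sum(1.0 / fine_k)  # series flow: harmonic mean
    print(f"coarse-block effective permeability: {coarse_k:.1f} mD")
    print(f"interface transmissibility: "
          f"{interface_transmissibility(100.0, 10.0, dx=10.0, area=100.0):.1f}")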

  17. Applying the sterile insect technique to the control of insect pests

    International Nuclear Information System (INIS)

    LaChance, L.E.; Klassen, W.

    1991-01-01

    The sterile insect technique involves the mass-rearing of insects, which are sterilized by gamma rays from a 60Co source before being released in a controlled fashion into nature. Matings between the released sterile insects and native insects produce no progeny, and so if enough of these matings occur the pest population can be controlled or even eradicated. A modification of the technique, especially suitable for the suppression of moths and butterflies, is called the F1, or inherited sterility, method. In this, lower radiation doses are used such that the released males are only partially sterile (30-60%) and the females are fully sterile. When released males mate with native females some progeny are produced, but they are completely sterile. Thus, full expression of the sterility is delayed by one generation. This article describes the use of the sterile insect technique in controlling the screwworm fly, the tsetse fly, the medfly, the pink bollworm and the melon fly, and of the F1 sterility method in the eradication of local gypsy moth infestations. 18 refs, 5 figs, 1 tab
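
    The ratio argument behind the technique can be sketched numerically (a minimal Knipling-style model with invented numbers, not figures from the article): with S sterile males released against N fertile insects, only a fraction N/(N+S) of matings is fertile, so sustained releases drive the population down generation by generation.

    def next_generation(n_females: float, steriles: float, rate: float = 5.0) -> float:
        """One generation under a fixed sterile-male release; 'rate' is an
        assumed natural per-generation rate of increase."""
        fertile_fraction = n_females / (n_females + steriles)
        return n_females * rate * fertile_fraction

    n = 1_000_000.0                        # starting wild females (illustrative)
    for gen in range(1, 6):
        n = next_generation(n, steriles=9_000_000.0)
        print(f"generation {gen}: {n:,.0f} females")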

  18. Applying Data-mining techniques to study drought periods in Spain

    Science.gov (United States)

    Belda, F.; Penades, M. C.

    2010-09-01

    Data-mining is a technique that can be used to interact with large databases and to help discover relations between parameters by extracting information from massive and multiple data archives. Drought affects many economic and social sectors, from agriculture to transportation, urban water deficits and the development of modern industries. Given these problems and the geographical and temporal distribution of drought, it is difficult to find a single definition of drought. Improving knowledge of climatic indices is necessary to reduce the impacts of drought and to facilitate quick decisions regarding this problem. The main objective is to analyze drought periods from 1950 to 2009 in Spain. We use several kinds of information in different formats, from different sources and transmission modes. We use satellite-based vegetation indices and dryness indices for several temporal periods. We use daily and monthly precipitation and temperature data and soil moisture data from a numerical weather model. We mainly calculate the Standardized Precipitation Index (SPI), which has been widely used in the literature. We use OLAP-Mining techniques to discover association rules between remote-sensing, numerical weather model and climatic indices. Time-series data-mining techniques organize data as a sequence of events, with each event having a time of recurrence, to cluster the data into groups of records or clusters with similar characteristics. Prior climatological classification is necessary if we want to study drought periods over all of Spain.
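
    The SPI computation follows a standard recipe; a minimal sketch with synthetic rainfall (omitting the zero-precipitation correction and multiple aggregation scales used operationally): fit a gamma distribution to the aggregated precipitation series and map its CDF onto a standard normal.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    precip = rng.gamma(shape=2.0, scale=30.0, size=720)  # synthetic monthly precip, mm

    shape, loc, scale = stats.gamma.fit(precip, floc=0)  # gamma fit, location fixed at 0
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    spi = stats.norm.ppf(cdf)                            # equiprobability transform
    print(f"driest month SPI: {spi.min():.2f}, wettest: {spi.max():.2f}")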

  19. Applying the Wizard-of-Oz Technique to Multimodal Human-Robot Dialogue

    OpenAIRE

    Marge, Matthew; Bonial, Claire; Byrne, Brendan; Cassidy, Taylor; Evans, A. William; Hill, Susan G.; Voss, Clare

    2017-01-01

    Our overall program objective is to provide more natural ways for soldiers to interact and communicate with robots, much like how soldiers communicate with other soldiers today. We describe how the Wizard-of-Oz (WOz) method can be applied to multimodal human-robot dialogue in a collaborative exploration task. While the WOz method can help design robot behaviors, traditional approaches place the burden of decisions on a single wizard. In this work, we consider two wizards to stand in for robot...

  20. Assessment of ground-based monitoring techniques applied to landslide investigations

    Science.gov (United States)

    Uhlemann, S.; Smith, A.; Chambers, J.; Dixon, N.; Dijkstra, T.; Haslam, E.; Meldrum, P.; Merritt, A.; Gunn, D.; Mackay, J.

    2016-01-01

    A landslide complex in the Whitby Mudstone Formation at Hollin Hill, North Yorkshire, UK is periodically re-activated in response to rainfall-induced pore-water pressure fluctuations. This paper compares long-term measurements (i.e., 2009-2014) obtained from a combination of monitoring techniques that have been employed together for the first time on an active landslide. The results highlight the relative performance of the different techniques, and can provide guidance for researchers and practitioners for selecting and installing appropriate monitoring techniques to assess unstable slopes. Particular attention is given to the spatial and temporal resolutions offered by the different approaches that include: Real Time Kinematic-GPS (RTK-GPS) monitoring of a ground surface marker array, conventional inclinometers, Shape Acceleration Arrays (SAA), tilt meters, active waveguides with Acoustic Emission (AE) monitoring, and piezometers. High spatial resolution information has allowed locating areas of stability and instability across a large slope. This has enabled identification of areas where further monitoring efforts should be focused. High temporal resolution information allowed the capture of 'S'-shaped slope displacement-time behaviour (i.e. phases of slope acceleration, deceleration and stability) in response to elevations in pore-water pressures. This study shows that a well-balanced suite of monitoring techniques that provides high temporal and spatial resolutions on both measurement and slope scale is necessary to fully understand failure and movement mechanisms of slopes. In the case of the Hollin Hill landslide it enabled detailed interpretation of the geomorphological processes governing landslide activity. It highlights the benefit of regularly surveying a network of GPS markers to determine areas for installation of movement monitoring techniques that offer higher resolution both temporally and spatially. The small sensitivity of tilt meter measurements

  1. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…

  2. Cloud Computing and Internet of Things Concepts Applied on Buildings Data Analysis

    Directory of Open Access Journals (Sweden)

    Hebean Florin-Adrian

    2017-12-01

    Full Text Available Initially used and developed for the IT industry, the Cloud computing and Internet of Things concepts are now found in many sectors of activity, the building industry being one of them. They are defined as a global computing, monitoring and analysis network, composed of hardware and software resources, with the capability of allocating and dynamically relocating shared resources in accordance with user requirements. Data analysis and process optimization techniques based on these new concepts are used increasingly in the building industry, especially for the optimal operation of building installations and for increasing occupant comfort. The multitude of building data taken from HVAC sensors, from automation and control systems, and from the other systems connected to the network is optimally managed by these new analysis techniques. Analysis techniques can identify and manage issues that arise in the operation of building installations, such as critical alarms, nonfunctional equipment and occupant comfort problems, for example upper and lower temperature deviations from the set point, as well as issues related to equipment maintenance. In this study, a new approach to building control is presented and a generalized methodology for applying data analysis to building services data is described. This methodology is then demonstrated using two case studies.
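
    A toy example of the kind of rule described (set point, thresholds and readings are invented, not taken from the case studies): flag rooms whose temperature deviates from the set point beyond upper or lower comfort bands.

    SETPOINT_C = 22.0                     # assumed set point and comfort bands
    UPPER_DEV = LOWER_DEV = 1.5

    readings = {"room_101": 21.4, "room_102": 25.2, "room_103": 19.8}

    for room, temp in readings.items():
        if temp > SETPOINT_C + UPPER_DEV:
            print(f"{room}: upper temperature deviation alarm ({temp} C)")
        elif temp < SETPOINT_C - LOWER_DEV:
            print(f"{room}: lower temperature deviation alarm ({temp} C)")
        else:
            print(f"{room}: within comfort band")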

  3. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  4. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  5. Wavelet transform techniques and signal analysis

    International Nuclear Information System (INIS)

    Perez, R.B.; Mattingly, J.K.; Tennessee Univ., Knoxville, TN; Perez, J.S.

    1993-01-01

    Traditionally, the most widely used signal analysis tool is the Fourier transform which, by producing power spectral densities (PSDs), allows time dependent signals to be studied in the frequency domain. However, the Fourier transform is global -- it extends over the entire time domain -- which makes it ill-suited to study nonstationary signals which exhibit local temporal changes in the signal's frequency content. To analyze nonstationary signals, the family of transforms commonly designated as short-time Fourier transforms (STFTs), capable of identifying temporally localized changes in the signal's frequency content, was developed by employing window functions to isolate temporal regions of the signal. For example, the Gabor STFT uses a Gaussian window. However, the applicability of STFTs is limited by various inadequacies. The wavelet transform (WT), recently developed by Grossman and Morlet and explored in depth by Daubechies (2) and Mallat, remedies the inadequacies of STFTs. Like the Fourier transform, the WT can be implemented as a discrete transform (DWT) or as a continuous (integral) transform (CWT). This paper briefly illustrates some of the potential applications of the wavelet transform algorithms to signal analysis.
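
    A single level of the Haar transform, the simplest DWT, shows why the WT localizes temporal changes: the detail coefficients are nonzero only where the signal changes abruptly. A minimal sketch (illustrative, not from the paper):

    import numpy as np

    def haar_dwt(signal):
        """Single-level Haar DWT; signal length must be even."""
        x = np.asarray(signal, dtype=float).reshape(-1, 2)
        approx = (x[:, 0] + x[:, 1]) / np.sqrt(2)   # low-pass: coarse trend
        detail = (x[:, 0] - x[:, 1]) / np.sqrt(2)   # high-pass: local change
        return approx, detail

    sig = np.concatenate([np.ones(7), 5 * np.ones(9)])  # abrupt jump in the signal
    approx, detail = haar_dwt(sig)
    print("detail coefficients:", detail)               # nonzero only at the jump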

  6. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    2010-11-01

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.
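
    A toy version of the workflow described (synthetic images and generic features, not any of the cited tools): compute simple image features and let a trained classifier, rather than a hand-tuned algorithm, separate the classes.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(7)

    def features(img):
        """Generic features: intensity statistics plus mean gradient magnitude."""
        gy, gx = np.gradient(img)
        return [img.mean(), img.std(), np.abs(gx).mean() + np.abs(gy).mean()]

    smooth = [rng.normal(0.5, 0.05, (32, 32)) for _ in range(50)]    # "class A"
    textured = [rng.normal(0.5, 0.25, (32, 32)) for _ in range(50)]  # "class B"
    X = np.array([features(i) for i in smooth + textured])
    y = np.array([0] * 50 + [1] * 50)

    clf = RandomForestClassifier(n_estimators=50).fit(X, y)
    print("training accuracy:", clf.score(X, y))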

  7. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    peer-reviewed research will be used to highlight the main points. Historically, medical physicists have leveraged many areas of applied physics, engineering and biology to improve radiotherapy. Research on quality and safety is another area where physicists can have an impact. The key to further progress is to clearly define what constitutes quality and safety research for those interested in doing such research and for the reviewers of that research. Learning Objectives: List several tools of quality and safety with references to peer-reviewed literature. Describe effects of mental workload on performance. Outline research in quality and safety indicators and technique analysis. Understand what quality and safety research is needed going forward. Understand the links between cooperative group trials and quality and safety research.

  8. Time-of-arrival analysis applied to ELF/VLF wave generation experiments at HAARP

    Science.gov (United States)

    Moore, R. C.; Fujimaru, S.

    2012-12-01

    Time-of-arrival (TOA) analysis is applied to observations performed during ELF/VLF wave generation experiments at the High-frequency Active Auroral Research Program (HAARP) HF transmitter in Gakona, Alaska. In 2012, a variety of ELF/VLF wave generation techniques were employed to identify the dominant source altitude for each case. Observations were performed for beat-wave modulation, AM modulation, STF modulation, ICD modulation, and cubic frequency modulation, among others. For each of these cases, we identify the dominant ELF/VLF source altitude and compare the experimental results with theoretical HF heating predictions.
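
    The generic TOA principle can be sketched by cross-correlation (an illustration of the principle, not the HAARP analysis pipeline; the waveform, delay and noise level are invented): the lag that maximizes the correlation between the modulation waveform and the received signal gives the propagation delay, which in turn constrains the source altitude.

    import numpy as np

    fs = 10_000                                # sample rate, Hz (illustrative)
    t = np.arange(0, 0.1, 1 / fs)
    # reference: a 2125 Hz tone burst standing in for the transmitted modulation
    ref = np.sin(2 * np.pi * 2125 * t) * np.exp(-((t - 0.02) / 0.005) ** 2)
    true_delay = 0.0043                        # invented propagation delay, s
    rng = np.random.default_rng(3)
    rx = np.interp(t - true_delay, t, ref) + 0.05 * rng.normal(size=t.size)

    corr = np.correlate(rx, ref, mode="full")
    lag = (np.argmax(corr) - (len(ref) - 1)) / fs
    print(f"estimated delay: {lag * 1e3:.2f} ms (true {true_delay * 1e3:.2f} ms)")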

  9. The differential dieaway technique applied to the measurement of the fissile content of drums of cement encapsulated waste

    International Nuclear Information System (INIS)

    Swinhoe, M.T.

    1986-01-01

    This report describes calculations of the differential dieaway technique as applied to cement encapsulated waste. The main difference from previous applications of the technique is that only one detector position is used (diametrically opposite the neutron source) and the chamber walls are made of concrete. The results show that by rotating the drum the response to fissile material across the central plane of the drum can be made relatively uniform. The absolute size of the response is about 0.4 counts per minute per gram of fissile material for a neutron source of 10^8 neutrons per second. Problems of neutron and gamma background and water content are considered. (author)

  10. Enhanced nonlinear iterative techniques applied to a non-equilibrium plasma flow

    Energy Technology Data Exchange (ETDEWEB)

    Knoll, D.A.; McHugh, P.R. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1996-12-31

    We study the application of enhanced nonlinear iterative methods to the steady-state solution of a system of two-dimensional convection-diffusion-reaction partial differential equations that describe the partially-ionized plasma flow in the boundary layer of a tokamak fusion reactor. This system of equations is characterized by multiple time and spatial scales, and contains highly anisotropic transport coefficients due to a strong imposed magnetic field. We use Newton's method to linearize the nonlinear system of equations resulting from an implicit, finite volume discretization of the governing partial differential equations, on a staggered Cartesian mesh. The resulting linear systems are neither symmetric nor positive definite, and are poorly conditioned. Preconditioned Krylov iterative techniques are employed to solve these linear systems. We investigate both a modified and a matrix-free Newton-Krylov implementation, with the goal of reducing CPU cost associated with the numerical formation of the Jacobian. A combination of a damped iteration, one-way multigrid and a pseudo-transient continuation technique is used to enhance global nonlinear convergence and CPU efficiency. GMRES is employed as the Krylov method with Incomplete Lower-Upper (ILU) factorization preconditioning. The goal is to construct a combination of nonlinear and linear iterative techniques for this complex physical problem that optimizes trade-offs between robustness, CPU time, memory requirements, and code complexity. It is shown that a one-way multigrid implementation provides significant CPU savings for fine grid calculations. Performance comparisons of the modified Newton-Krylov and matrix-free Newton-Krylov algorithms will be presented.
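
    A skeleton of the matrix-free Newton-Krylov idea on a tiny toy system (not the plasma edge equations; no multigrid, damping or ILU preconditioning shown): GMRES sees the Jacobian only through finite-difference directional derivatives, so the Jacobian matrix is never formed.

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def F(u):
        # Toy nonlinear residual; stands in for the discretized plasma equations.
        return np.array([u[0]**2 + u[1] - 3.0,
                         u[0] + u[1]**2 - 5.0])

    u = np.array([1.5, 1.5])
    for it in range(20):
        r = F(u)
        if np.linalg.norm(r) < 1e-10:
            break
        eps = 1e-7
        Jv = lambda v: (F(u + eps * v) - r) / eps    # finite-difference J*v product
        J = LinearOperator((2, 2), matvec=Jv)
        du, info = gmres(J, -r)                      # Krylov solve of J du = -r
        u = u + du                                   # (a damped update in practice)
    print("solution:", u, "residual norm:", np.linalg.norm(F(u)))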

  11. X-ray Computed Microtomography technique applied for cementitious materials: A review.

    Science.gov (United States)

    da Silva, Ítalo Batista

    2018-04-01

    The main objective of this article is to present a bibliographical review of the use of the X-ray microtomography method in the 3D image processing of the microstructure of cementitious materials, analyzing the pore microstructure and connectivity network and enabling the possibility of building a relationship between permeability and porosity. The use of this technique furthers the understanding of the physical, chemical and mechanical properties of cementitious materials; published results have been good, and the quality and quantity of accessible information are significant and may contribute to the study of the development of cementitious materials. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Applying Behavioural Science Techniques to the Test and Evaluation of Complex Systems

    National Research Council Canada - National Science Library

    Sproles, Noel

    2000-01-01

    .... A minor case study illustrating the application of qualitative data analysis methods to the evaluation of the Australian Army's Battlefield Command Support System (BCSS) is discussed in order to illustrate the merit of behavioural science methods for T&E.

  13. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  14. A review of post-modern management techniques as currently applied to Turkish forestry.

    Science.gov (United States)

    Dölarslan, Emre Sahin

    2009-01-01

    This paper reviews the effects of six post-modern management concepts as applied to Turkish forestry. Up to now, Turkish forestry has been constrained, both in terms of its operations and internal organization, by a highly bureaucratic system. The application of new thinking in forestry management, however, has recently resulted in new organizational and production concepts that promise to address problems specific to this Turkish industry and bring about positive changes. This paper will elucidate these specific issues and demonstrate how post-modern management thinking is influencing the administration and operational capacity of Turkish forestry within its current structure.

  15. Applying stakeholder Delphi techniques for planning sustainable use of aquatic resources

    DEFF Research Database (Denmark)

    Lund, Søren; Banta, Gary Thomas; Bunting, Stuart W

    2015-01-01

    The HighARCS (Highland Aquatic Resources Conservation and Sustainable Development) project was a participatory research effort to map and better understand the patterns of resource use and livelihoods of communities who utilize highland aquatic resources in five sites across China, India and Vietnam. The purpose of this paper is to give an account of how the stakeholder Delphi method was adapted and applied to support the participatory integrated action planning for sustainable use of aquatic resources facilitated within the HighARCS project. An account of the steps taken and results recorded...

  16. Microbiological evaluation of sludge during an improvement process applying the washing technique (selective pressure)

    International Nuclear Information System (INIS)

    Molina P, Francisco; Gonzalez, Maria Elena; Gonzalez, Luz Catalina

    2001-01-01

    In this investigation, the microbial consortia were evaluated by characterization into trophic groups and into groups related by their sensitivity to oxygen, as well as by the specific methanogenic activity (SMA) of a sludge acclimated from an aerobic sludge coming from a wastewater treatment plant. Later, the washing improvement technique (selective pressure) was applied to this sludge, providing inoculum for the start-up of a UASB-type anaerobic reactor (treatment reactor). At the same time, a control reactor inoculated with the acclimated sludge was operated. Both reactors were operated for 120 days, using brown sugar as substrate; the experimental phase covered up to 70 days of operation, with the sludge characterized at the end of this period. The SMA was analysed using acetic and formic acids as substrates. The results showed activities between 0.45 and 1.39 g COD-CH4/(g VSS.d) for both substrates. At the end of the experimental phase of the UASB reactor, sulphate-reducing bacteria growing on acetate and lactate were observed as the predominant group, followed by hydrogenotrophic methanogenic bacteria. It is important to note that, with the application of the sludge washing technique, all the trophic groups increased, with the exception of the lactate-fermenting bacteria.

  17. Gamma-radiography techniques applied to quality control of welds in water pipe lines

    International Nuclear Information System (INIS)

    Sanchez, W.; Oki, H.

    1974-01-01

    Non-destructive testing of welds may be done by the gamma-radiography technique, in order to detect the presence or absence of discontinuities and defects in the bulk of deposited metal and near the base metal. Gamma-radiography allows the documentation of the test with a complete inspection record, which is not common in other non-destructive testing methods. In the quality control of longitudinal or transversal welds in water pipe lines, two exposure techniques are used: double-wall and panoramic exposure. Three different water pipe line systems have been analysed for weld defects, giving a total of 16,000 gamma-radiographies. The tests were made according to the criteria established by the ASME standard. The principal metallic discontinuities found in the welds were: porosity (32%), lack of penetration (29%), lack of fusion (20%), and slag inclusion (19%). The percentage of gamma-radiographies showing welds without defects was 39% (6168 gamma-radiographies). On the other hand, 53% (8502 gamma-radiographies) showed the presence of acceptable discontinuities and 8% (1330 gamma-radiographies) were rejected according to the ASME standards [pt]

  18. Fragrance composition of Dendrophylax lindenii (Orchidaceae using a novel technique applied in situ

    Directory of Open Access Journals (Sweden)

    James J. Sadler

    2012-02-01

    Full Text Available The ghost orchid, Dendrophylax lindenii (Lindley) Bentham ex Rolfe (Orchidaceae), is one of North America's rarest and best-known orchids. Native to Cuba and SW Florida, where it frequents shaded swamps as an epiphyte, the species has experienced steady decline. Little information exists on D. lindenii's biology in situ, raising conservation concerns. During the summer of 2009 at an undisclosed population in Collier County, FL, a substantial number (ca. 13) of plants initiated anthesis, offering a unique opportunity to study this species in situ. We report a new technique aimed at capturing the floral headspace of D. lindenii in situ, and identified volatile compounds using gas chromatography mass spectrometry (GC/MS). All components of the floral scent were identified as terpenoids with the exception of methyl salicylate. The most abundant compound was the sesquiterpene (E,E)-α-farnesene (71%), followed by (E)-β-ocimene (9%) and methyl salicylate (8%). Other compounds were: linalool (5%), sabinene (4%), (E)-α-bergamotene (2%), α-pinene (1%), and 3-carene (1%). Interestingly, (E,E)-α-farnesene has previously been associated with pestiferous insects (e.g., Hemiptera). The other compounds are common floral scent constituents in other angiosperms, suggesting that our in situ technique was effective. Volatile capture was, therefore, possible without imposing physical harm (e.g., inflorescence detachment) to this rare orchid.

  19. Applying machine learning techniques to the identification of late-onset hypogonadism in elderly men.

    Science.gov (United States)

    Lu, Ti; Hu, Ya-Han; Tsai, Chih-Fong; Liu, Shih-Ping; Chen, Pei-Ling

    2016-01-01

    In the diagnosis of late-onset hypogonadism (LOH), the Androgen Deficiency in the Aging Male (ADAM) questionnaire or Aging Males' Symptoms (AMS) scale can be used to assess related symptoms. Subsequently, blood tests are used to measure serum testosterone levels. However, results obtained using ADAM and AMS have revealed no significant correlations between ADAM and AMS scores and LOH, and the rate of misclassification is high. Recently, many studies have reported significant associations between clinical conditions such as the metabolic syndrome, obesity, lower urinary tract symptoms, and LOH. In this study, we sampled 772 clinical cases of men who completed both a health checkup and two questionnaires (ADAM and AMS). The data were obtained from the largest medical center in Taiwan. Two well-known classification techniques, the decision tree (DT) and logistic regression, were used to construct LOH prediction models on the basis of the aforementioned features. The results indicate that although the sensitivity of ADAM is the highest (0.878), it has the lowest specificity (0.099), which implies that ADAM overestimates LOH occurrence. In addition, DT combined with the AdaBoost technique (AdaBoost DT) has the second highest sensitivity (0.861) and a specificity of 0.842, resulting in the best accuracy (0.851) among all classifiers. AdaBoost DT can provide robust predictions that will aid clinical decisions and can help medical staff in accurately assessing the possibilities of LOH occurrence.
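
    A sketch of the best-performing configuration reported, AdaBoost over decision trees, using synthetic stand-ins for the clinical variables (not the Taiwan cohort); note that the base-learner argument is named base_estimator in scikit-learn releases before 1.2.

    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(772, 10))            # synthetic stand-ins for clinical features
    y = (X[:, 0] + 0.5 * X[:, 1]              # invented label rule, illustration only
         + rng.normal(scale=0.8, size=772) > 1.0).astype(int)

    model = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=3),
                               n_estimators=100)
    print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())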

  20. New tools, technology and techniques applied in geological sciences: current situation and future perspectives

    International Nuclear Information System (INIS)

    Ulloa, Andres

    2014-01-01

    The technological tools and working methodologies most used in the geological sciences are reviewed and described. Electronic devices such as laptops, palmtops or PDAs (personal digital assistants), tablets and smartphones have made it possible to take geological field data and store them efficiently. Tablets and smartphones have proved convenient for collecting scientific data because of the diversity of sensors they carry, their portability and autonomy, and the possibility of installing specific applications. High-precision GPS, in conjunction with LIDAR and sonar technology, has become more accessible and is used for geological research, generating high-resolution three-dimensional models that complement geological studies. Remote sensing techniques such as high-penetration radar are used to build models of ice thickness and topography in Antarctica. Modern three-dimensional scanning and printing techniques are used in geological science research and teaching. Currently, advances in computer technology allow three-dimensional models to be handled efficiently on personal computers, with different display options. Some recently emerged areas of geology are mentioned to give a broad panorama of where geological research may be directed in the coming years [es]