WorldWideScience

Sample records for analysis techniques applied

  1. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine-aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine-particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle-diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched-Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.
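
    As a rough illustration of how such filter analyses turn raw spectra into element loadings, the sketch below converts PIXE peak areas into areal concentrations using pre-measured sensitivity factors. The sensitivity values, elements and charge are hypothetical placeholders, not ANSTO calibration data.

```python
# Illustrative sketch (not ANSTO's actual pipeline): converting PIXE peak
# areas into areal concentrations using pre-measured sensitivity factors.
# Sensitivity values below are placeholders, not real calibration data.

sensitivity = {"S": 12.4, "Fe": 30.1, "Pb": 8.7}  # counts per (ng/cm^2 * uC), hypothetical

def areal_concentration(peak_counts, element, charge_uC):
    """Return elemental areal density in ng/cm^2 for one filter."""
    return peak_counts / (sensitivity[element] * charge_uC)

# Example: a filter irradiated with 10 uC of accumulated charge
for element, counts in {"S": 15200, "Fe": 48000, "Pb": 2100}.items():
    print(element, round(areal_concentration(counts, element, 10.0), 1), "ng/cm^2")
```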

  2. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3- and 6-month) and the long-term (12- and 24-month) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
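
    The ITA technique lends itself to a compact illustration: the series is split into two halves, each half is sorted, and the halves are plotted against each other, with departures from the 1:1 line indicating trends in the low, medium, or high values. A minimal Python sketch on synthetic SPI-like data (not the New Zealand series) might look like this:

```python
import numpy as np
import matplotlib.pyplot as plt

def innovative_trend_analysis(series):
    """Sen's Innovative Trend Analysis: sort the two halves of a series
    and compare them against the 1:1 (no-trend) line."""
    x = np.asarray(series, dtype=float)
    half = len(x) // 2
    return np.sort(x[:half]), np.sort(x[half:2 * half])

# Synthetic SPI-like series with a mild drying trend (illustrative only)
rng = np.random.default_rng(0)
spi = rng.normal(0, 1, 600) - 0.002 * np.arange(600)

first, second = innovative_trend_analysis(spi)
plt.scatter(first, second, s=8)
lim = [spi.min(), spi.max()]
plt.plot(lim, lim, "k--", label="1:1 no-trend line")  # points below: decreasing values
plt.xlabel("first half (sorted)"); plt.ylabel("second half (sorted)")
plt.legend(); plt.show()
```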

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
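
    The core computation of such an analysis is the propagation of individual standard uncertainties into the final result. The sketch below shows the standard root-sum-square combination for a PV efficiency measurement; the instrument values are invented for illustration and are not from the presentation.

```python
import numpy as np

# Hedged example of the root-sum-square propagation that a measurement
# uncertainty analysis rests on; all values are invented.
# Efficiency eta = P / (G * A): for a pure product/quotient model, the
# relative standard uncertainties add in quadrature.

P, u_P = 52.0, 0.4      # module power, W, and its standard uncertainty
G, u_G = 1000.0, 15.0   # irradiance, W/m^2
A, u_A = 0.35, 0.001    # module area, m^2

eta = P / (G * A)
u_eta = eta * np.sqrt((u_P / P) ** 2 + (u_G / G) ** 2 + (u_A / A) ** 2)

print(f"eta = {eta:.4f} +/- {2 * u_eta:.4f} (95% interval, coverage factor k=2)")
```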

  4. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. The book assumes no previous training in statistics, explains how and why modern statistical methods provide more accurate results than conventional methods, covers the latest developments on multiple comparisons, and includes recent advances in the field.

  5. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made.

  6. Applied ALARA techniques

    International Nuclear Information System (INIS)

    Waggoner, L.O.

    1998-01-01

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission to put the facilities in a safe condition, decontaminate, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes that are located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early that, in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  7. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
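
    A minimal sketch of the two-stage idea (k-means partitioning refined or validated by a hierarchical pass over the cluster centroids) on synthetic three-variable radiometric data might look like the following; the cluster centers and sizes are invented, not the Copper Mountain data.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic 3-variable data: three background groups plus a small outlier group
rng = np.random.default_rng(1)
centers = np.array([[1.5, 10, 2], [2.0, 12, 3], [1.0, 8, 2], [6.0, 30, 9]])
X = np.vstack([rng.normal(c, 0.3, size=(500, 3)) for c in centers[:3]]
              + [rng.normal(centers[3], 1.0, size=(30, 3))])

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# A hierarchical pass over the k-means centroids can merge or validate groups,
# here separating the outlier group from the background ones
centroids = np.array([X[labels == k].mean(axis=0) for k in range(4)])
merged = fcluster(linkage(centroids, method="ward"), t=2, criterion="maxclust")
print("k-means cluster sizes:", np.bincount(labels), "hierarchical grouping:", merged)
```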

  8. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of time series analysis techniques to nuclear material accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach, which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart control chart, the cumulative summation of inventory differences (CUSUM) statistic, and the Kalman filter and linear smoother.
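
    Of the techniques listed, the CUSUM is the simplest to sketch: standardized inventory differences are accumulated with an allowance k, and an alarm is raised when either one-sided sum crosses a decision threshold h. The following toy example (invented data, standard textbook parameterization) illustrates the idea:

```python
import numpy as np

def cusum(inventory_diffs, sigma, k=0.5, h=5.0):
    """Tabular one-sided CUSUMs of standardized inventory differences.
    k is the allowance (in sigma units), h the decision threshold."""
    s_hi = s_lo = 0.0
    alarms = []
    for i, d in enumerate(np.asarray(inventory_diffs) / sigma):
        s_hi = max(0.0, s_hi + d - k)
        s_lo = min(0.0, s_lo + d + k)
        if s_hi > h or s_lo < -h:
            alarms.append(i)
    return alarms

# Illustrative only: a small protracted loss starting at period 20
rng = np.random.default_rng(2)
ids = rng.normal(0, 1.0, 40)
ids[20:] += 1.2          # loss signal in units of measurement sigma
print("first alarm at period:", cusum(ids, sigma=1.0)[:1])
```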

  9. Spatial analysis techniques applied to uranium prospecting in Chihuahua State, Mexico

    Science.gov (United States)

    Hinojosa de la Garza, Octavio R.; Montero Cabrera, María Elena; Sanín, Luz H.; Reyes Cortés, Manuel; Martínez Meyer, Enrique

    2014-07-01

    To estimate the distribution of uranium minerals in Chihuahua, the advanced statistical model "Maximum Entropy Method" (MaxEnt) was applied. A distinguishing feature of this method is that it can fit more complex models in the case of small datasets (x and y data), as is the case for the locations of uranium ores in the State of Chihuahua. For georeferencing uranium ores, a database from the United States Geological Survey and a workgroup of experts in Mexico was used. The main contribution of this paper is the proposal of maximum entropy techniques to obtain the mineral's potential distribution. The model used 24 environmental layers, such as topography, gravimetry, climate (WorldClim) and soil properties, to project the uranium distribution across the study area. For validation of the locations predicted by the model, comparisons were made with other research of the Mexican Geological Survey, with direct exploration of specific areas, and through interviews with former exploration workers of the enterprise "Uranio de Mexico". Results: new uranium areas predicted by the model were validated, revealing some relationship between the model predictions and geological faults. Conclusions: modeling by spatial analysis provides additional information to the energy and mineral resources sectors.

  10. Advanced gamma spectrum processing technique applied to the analysis of scattering spectra for determining material thickness

    International Nuclear Information System (INIS)

    Hoang Duc Tam; VNUHCM-University of Science, Ho Chi Minh City; Huynh Dinh Chuong; Tran Thien Thanh; Vo Hoang Nguyen; Hoang Thi Kieu Trang; Chau Van Tao

    2015-01-01

    In this work, an advanced gamma spectrum processing technique is applied to analyze experimental scattering spectra for determining the thickness of C45 heat-resistant steel plates. The single-scattering peak of the scattering spectra is exploited to measure the intensity of singly scattered photons. Based on these results, the thickness of the steel plates is determined with a maximum deviation between real and measured thickness of about 4%. Monte Carlo simulation using the MCNP5 code is also performed to cross-check the results, yielding a maximum deviation of 2%. These results strongly confirm the capability of this technique for analyzing gamma scattering spectra, providing a simple, effective and convenient method for determining material thickness. (author)
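
    The underlying inversion can be illustrated with a toy saturation model: the single-scattering intensity grows toward a saturation value as plate thickness increases, so a measured intensity can be inverted for thickness. The functional form and constants below are illustrative assumptions, not the authors' calibration.

```python
import numpy as np

# Hedged sketch of the thickness-determination idea. The saturation model
# and the constants are assumptions for illustration only.

I_sat, mu_eff = 1000.0, 0.45   # saturation counts and effective attenuation (1/cm), hypothetical

def intensity(t_cm):
    """Single-scattering intensity vs. plate thickness (toy model)."""
    return I_sat * (1.0 - np.exp(-mu_eff * t_cm))

def thickness_from_intensity(I_meas):
    """Invert the saturation curve for thickness."""
    return -np.log(1.0 - I_meas / I_sat) / mu_eff

t_true = 1.2                                         # cm
print(thickness_from_intensity(intensity(t_true)))   # recovers 1.2
```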

  11. Sensitivity analysis techniques applied to a system of hyperbolic conservation laws

    International Nuclear Information System (INIS)

    Weirs, V. Gregory; Kamm, James R.; Swiler, Laura P.; Tarantola, Stefano; Ratto, Marco; Adams, Brian M.; Rider, William J.; Eldred, Michael S.

    2012-01-01

    Sensitivity analysis comprises techniques to quantify the effects of the input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique by which to understand the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol' method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations are available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing sample sizes and under an increasing number of meta-model evaluations. Highlights: sensitivity analysis techniques for a model shock physics problem are compared; the model problem and the sensitivity analysis problem have exact solutions; subtle details of the method for computing sensitivity indices can affect the results.
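
    A sampling-based Sobol' analysis of the kind described can be sketched with the SALib package on a standard test function (the Ishigami function, used here instead of the paper's Riemann problem):

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Sampling-based Sobol' indices for a toy 3-input model (a standard test
# case, not the shock-physics problem of the paper).
problem = {
    "num_vars": 3,
    "names": ["x1", "x2", "x3"],
    "bounds": [[-np.pi, np.pi]] * 3,
}

X = saltelli.sample(problem, 1024)   # N*(2D+2) model evaluations
Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1]) ** 2 + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])

Si = sobol.analyze(problem, Y)
print("first-order indices:", Si["S1"])   # per-input variance shares
print("total-order indices:", Si["ST"])   # includes interaction effects
```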

  12. Tensometry technique for X-ray diffraction in applied analysis of welding

    International Nuclear Information System (INIS)

    Turibus, S.N.; Caldas, F.C.M.; Miranda, D.M.; Monine, V.I.; Assis, J.T.

    2010-01-01

    This paper presents the analysis of residual stress introduced by the welding process. As stress in a material can induce damage, it is necessary to have a method to identify this residual stress state. For this purpose, the non-destructive X-ray diffraction technique was used to analyze two plates of A36 steel joined by metal inert gas (MIG) welding. The stress measurements were made by the sin²ψ method in the weld region of the steel plates, including analysis of longitudinal and transverse residual stresses in the fusion zone, heat affected zone (HAZ) and base metal. To determine the stress distribution along the depth of the welded material, successive surface layers were removed by electropolishing. (author)
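
    The sin²ψ method itself reduces to a linear fit: the measured lattice strain is linear in sin²ψ, and the slope scaled by the elastic constants gives the in-plane residual stress. A minimal sketch with synthetic strain data (invented values, not the A36 measurements):

```python
import numpy as np

# Minimal sketch of the sin^2(psi) method: lattice strain measured at several
# tilt angles psi is linear in sin^2(psi); the slope times E/(1+nu) gives the
# in-plane residual stress. The strain values below are synthetic.

E, nu = 200e9, 0.29                        # elastic constants for steel, Pa
psi = np.radians([0, 15, 25, 35, 45])
true_sigma = 150e6                         # Pa, used only to fabricate data
strain = (1 + nu) / E * true_sigma * np.sin(psi) ** 2

slope, _ = np.polyfit(np.sin(psi) ** 2, strain, 1)
sigma = slope * E / (1 + nu)
print(f"residual stress ~ {sigma / 1e6:.0f} MPa")
```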

  13. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    International Nuclear Information System (INIS)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves

    2017-01-01

    In this study, optimization of procedures and standardization of the Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations in a pneumatic system. 2^k experimental designs were applied for evaluating the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and the best irradiation conditions. (author)
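
    The main-effect computation in a 2^k design is straightforward: with factors coded -1/+1, each main effect is the mean response at the high level minus the mean at the low level. A sketch for a 2^3 design with invented responses:

```python
import itertools
import numpy as np

# Sketch of a 2^3 full factorial analysis. The response values are invented
# stand-ins for measured mass fractions, not the study's data.

factors = ["decay_time", "counting_time", "detector_distance"]
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 coded runs

rng = np.random.default_rng(3)
response = 100 + 4 * design[:, 0] - 1 * design[:, 1] + 0.2 * design[:, 2] \
           + rng.normal(0, 0.3, 8)

for j, name in enumerate(factors):
    # Main effect: mean response at +1 minus mean response at -1
    effect = response[design[:, j] == 1].mean() - response[design[:, j] == -1].mean()
    print(f"main effect of {name}: {effect:+.2f}")
```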

  14. Sensitivity Analysis Techniques Applied in Video Streaming Service on Eucalyptus Cloud Environments

    Directory of Open Access Journals (Sweden)

    Rosangela Melo

    2018-01-01

    Nowadays, several streaming servers are available to provide a variety of multimedia applications such as Video on Demand in cloud computing environments. These environments have business potential because of the pay-per-use model, as well as the advantages of easy scalability and up-to-date packages and programs. This paper uses hierarchical modeling and different sensitivity analysis techniques to determine the parameters that cause the greatest impact on the availability of a Video on Demand service. The results show that distinct approaches provide similar results regarding the sensitivity ranking, with specific exceptions. A combined evaluation indicates that system availability may be improved effectively by focusing on a reduced set of factors that produce large variations in the measure of interest.
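
    One way to reproduce the flavour of such a study is a one-at-a-time sensitivity pass over a simple hierarchical availability model: perturb each component's parameters and rank the components by the resulting change in system availability. The components and values below are hypothetical, not those of the paper's Eucalyptus testbed.

```python
# Toy sensitivity study on steady-state availability A = MTTF / (MTTF + MTTR)
# for components in series; all parameter values are invented.

components = {            # (MTTF, MTTR) in hours
    "hypervisor": (2880.0, 1.0),
    "streaming_server": (1440.0, 2.0),
    "storage_volume": (8760.0, 8.0),
}

def system_availability(params):
    a = 1.0
    for mttf, mttr in params.values():
        a *= mttf / (mttf + mttr)   # series system: availabilities multiply
    return a

base = system_availability(components)
for name, (mttf, mttr) in components.items():
    bumped = dict(components)
    bumped[name] = (mttf * 1.10, mttr)            # +10% MTTF for one component
    delta = system_availability(bumped) - base
    print(f"{name}: dA = {delta:.2e}")            # largest delta = most sensitive
```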

  15. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Featuring newly developed topics and applications of the analysis of longitudinal data, Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of longitudinal data.

  1. Analysis and simulation of wireless signal propagation applying geostatistical interpolation techniques

    Science.gov (United States)

    Kolyaie, S.; Yaghooti, M.; Majidi, G.

    2011-12-01

    This paper is part of ongoing research to examine the capability of geostatistical analysis for mobile network coverage prediction, simulation and tuning. Mobile network coverage predictions are used to find network coverage gaps and areas with poor serviceability. They are essential data for engineering and management in order to make better decisions regarding rollout, planning and optimisation of mobile networks. The objective of this research is to evaluate different interpolation techniques for coverage prediction. In the method presented here, raw data collected from drive testing a sample of roads in the study area are analysed and various continuous surfaces are created using different interpolation methods. Two general interpolation methods are used in this paper with different variables: first, Inverse Distance Weighting (IDW) with various powers and numbers of neighbours, and second, ordinary kriging with Gaussian, spherical, circular and exponential semivariogram models with different numbers of neighbours. For comparison of the results, we used check points coming from the same drive test data. Prediction values for the check points are extracted from each surface and the differences from the actual values are computed. The output of this research helps in finding an optimised and accurate model for coverage prediction.
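
    Of the two families compared, IDW is compact enough to sketch directly: each prediction is a distance-weighted average of the nearest drive-test samples, with the power and neighbour count as tuning variables. The sample data below are synthetic.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, n_neighbors=8):
    """Inverse Distance Weighting: each prediction is a distance-weighted
    mean of the n nearest samples."""
    z = np.empty(len(xy_query))
    for i, q in enumerate(xy_query):
        d = np.linalg.norm(xy_known - q, axis=1)
        idx = np.argsort(d)[:n_neighbors]
        w = 1.0 / np.maximum(d[idx], 1e-9) ** power   # guard exact hits
        z[i] = np.sum(w * z_known[idx]) / np.sum(w)
    return z

# Synthetic signal-strength samples (dBm) along a road, then a grid prediction
rng = np.random.default_rng(4)
pts = rng.uniform(0, 1000, size=(200, 2))
rssi = -60 - 0.02 * pts[:, 0] + rng.normal(0, 2, 200)
grid = np.array([[x, 500.0] for x in range(0, 1001, 100)])
print(np.round(idw(pts, rssi, grid), 1))
```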

  2. Error analysis of the phase-shifting technique when applied to shadow moiré

    International Nuclear Information System (INIS)

    Han, Changwoon; Han, Bongtae

    2006-01-01

    An exact solution for the intensity distribution of shadow moiré fringes produced by a broad-spectrum light source is presented. A mathematical study quantifies errors in fractional fringe orders determined by the phase-shifting technique, and its validity is corroborated experimentally. The errors vary cyclically as the distance between the reference grating and the specimen increases. The amplitude of the maximum error is approximately 0.017 fringe, which defines the theoretical limit of resolution enhancement offered by the phase-shifting technique
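
    For context, the standard four-step phase-shifting computation the paper analyses can be sketched as follows; the fringe amplitudes are invented, and the final comment notes where the broad-spectrum error enters.

```python
import numpy as np

# Standard four-step phase-shifting formula (not the paper's exact error
# model): with intensities I1..I4 recorded at 90-degree phase shifts, the
# fractional fringe order follows from a four-quadrant arctangent.

def four_step_phase(I1, I2, I3, I4):
    return np.arctan2(I4 - I2, I1 - I3)         # wrapped phase in (-pi, pi]

# Simulate one pixel of an ideal sinusoidal fringe pattern with known phase
phi_true = 1.0
shifts = np.array([0, np.pi / 2, np.pi, 3 * np.pi / 2])
I = 100 + 40 * np.cos(phi_true + shifts)
print(four_step_phase(*I))                      # ~1.0 rad

# In practice a broad-spectrum source makes the fringe profile non-sinusoidal,
# which is what produces the cyclic ~0.017-fringe error the paper quantifies.
```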

  3. Nuclear and conventional techniques applied to the analysis of Purhepecha metals of the Pareyon collection

    International Nuclear Information System (INIS)

    Mendez, U.; Tenorio C, D.; Ruvalcaba, J.L.; Lopez, J.A.

    2005-01-01

    The main objective of this investigation was to determine the composition and microstructure of 13 metallic objects by means of the nuclear techniques PIXE and RBS together with conventional techniques. The objects, made from copper and gold, belonged to the offering of a Tarascan personage found in the 'Matamoros' porch in Uruapan, Michoacan, Mexico. (Author)

  4. Condition monitoring and signature analysis techniques as applied to Madras Atomic Power Station (MAPS) [Paper No.: VIA - 1

    International Nuclear Information System (INIS)

    Rangarajan, V.; Suryanarayana, L.

    1981-01-01

    The technique of vibration signature analysis for identifying machine troubles in their early stages is explained. The advantage is that timely corrective action can be planned to avoid breakdowns and unplanned shutdowns. At the Madras Atomic Power Station (MAPS), this technique is applied to regularly monitor the vibrations of equipment and thus serves as a tool for corrective maintenance of equipment. Case studies of the application of this technique to main boiler feed pumps, moderator pump motors, the centrifugal chiller, ventilation system fans, thermal shield ventilation fans, filtered water pumps, emergency process sea water pumps, and antifriction bearings at MAPS are presented. Condition monitoring during commissioning and subsequent operation could indicate defects. Corrective actions which were taken are described. (M.G.B.)

  5. Applying computational geometry techniques for advanced feature analysis in atom probe data

    International Nuclear Information System (INIS)

    Felfer, Peter; Ceguerra, Anna; Ringer, Simon; Cairney, Julie

    2013-01-01

    In this paper we present new methods for feature analysis in atom probe tomography data that have useful applications in materials characterisation. The analysis works on the principle of Voronoi subvolumes and piecewise linear approximations, with feature delineation based on the distance to the centre of mass of a subvolume (DCOM). Based on the coordinate systems defined by these approximations, two examples are shown of the new types of analyses that can be performed. The first is the analysis of line-like objects (i.e. dislocations) using both proxigrams and line-excess plots. The second is interfacial excess mapping of an InGaAs quantum dot. Highlights: computational geometry is used to detect and analyse features within atom probe data; limitations of conventional feature detection are overcome by using atomic density gradients; 0D, 1D, 2D and 3D features can be analysed by using Voronoi tessellation for spatial binning; new, robust analysis methods are demonstrated, including line and interfacial excess mapping.

  6. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual

    International Nuclear Information System (INIS)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained
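
    A bare-bones PCA of three-variate radiometric records can be written directly with an SVD, as sketched below on synthetic data; the NURE PCA program's histogram and mapping outputs are not reproduced.

```python
import numpy as np

# Bare-bones principal components analysis of multivariate radiometric data
# via SVD; variates are standardized first, as is usual for mixed units.
rng = np.random.default_rng(5)
latent = rng.normal(size=(1000, 1))
X = np.hstack([3 * latent + rng.normal(0, 0.5, (1000, 1)),   # channel 1
               2 * latent + rng.normal(0, 0.5, (1000, 1)),   # channel 2
               rng.normal(0, 1.0, (1000, 1))])               # channel 3

Xs = (X - X.mean(0)) / X.std(0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)
scores = Xs @ Vt.T                   # principal component scores per record
print("variance explained:", np.round(explained, 3))
```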

  7. ANALYSIS DATA SETS USING HYBRID TECHNIQUES APPLIED ARTIFICIAL INTELLIGENCE BASED PRODUCTION SYSTEMS INTEGRATED DESIGN

    Directory of Open Access Journals (Sweden)

    Daniel-Petru GHENCEA

    2017-06-01

    The paper proposes a model for predicting spindle behavior from the point of view of thermal deformations and vibration levels by highlighting and processing the characteristic equations. A model analysis for shafts with similar electro-mechanical characteristics can be achieved using a hybrid analysis based on artificial intelligence (genetic algorithms, artificial neural networks, fuzzy logic). The paper presents a prediction mode that obtains a valid range of values for spindles with similar characteristics based on data sets measured from a few spindle tests, without additional measurements being required. Extracting polynomial functions from the graphs resulting from simultaneous measurements and predicting the dynamics of the two features with a multi-objective criterion are the main advantages of this method.

  8. Delamination of plasters applied to historical masonry walls: analysis by acoustic emission technique and numerical model

    Science.gov (United States)

    Grazzini, A.; Lacidogna, G.; Valente, S.; Accornero, F.

    2018-06-01

    Masonry walls of historical buildings are subject to rising damp effects due to capillary or rain infiltration, which in time produce decay and delamination of historical plasters. In the restoration of masonry buildings, plaster detachment frequently occurs because of mechanical incompatibility of the repair mortar. An innovative laboratory procedure is described for testing the mechanical adhesion of new repair mortars. Static compression tests were carried out on composite stone block-repair mortar specimens, whose specific geometry makes it possible to test the de-bonding process of mortar in adherence with a stone masonry structure. The acoustic emission (AE) technique was employed for estimating the amount of energy released by fracture propagation in the adherence surface between mortar and stone. A numerical simulation was developed based on the cohesive crack model. The evolution of the detachment process of mortar in a coupled stone brick-mortar system was analysed by triangulation of AE signals, which can improve the numerical model and predict the type of failure in the adhesion surface of the repair plaster. Through the cohesive crack model, it was possible to interpret theoretically the de-bonding phenomena occurring at the interface between stone block and mortar. The mechanical behaviour of the interface is thereby characterized.

  9. Analysis and Design of International Emission Trading Markets Applying System Dynamics Techniques

    Science.gov (United States)

    Hu, Bo; Pickl, Stefan

    2010-11-01

    The design and analysis of international emission trading markets is an important current challenge. Time-discrete models are needed to understand and optimize these procedures. We give an introduction to this scientific area and present current modeling approaches. Furthermore, we develop a model which is embedded in a holistic problem solution. Measures for energy efficiency are characterized. The economic time-discrete "cap-and-trade" mechanism is influenced by various underlying anticipatory effects. With a System Dynamics approach these effects can be examined. First numerical results show that fair international emissions trading can only be conducted with the use of protective export duties. Furthermore, a comparatively high price, which induces emission reduction, inevitably has an inhibiting effect on economic growth according to our model. As has always been expected, it is not without difficulty to find a balance between economic growth and emission reduction. It can be anticipated using our System Dynamics model simulation that substantial changes must take place before international emissions trading markets can contribute to global GHG emissions mitigation.
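
    A toy time-discrete loop conveys the cap-and-trade feedbacks described (scarcity raises the permit price, price drives abatement but damps growth). All coefficients below are invented for illustration and are not the paper's calibrated model.

```python
# Minimal time-discrete "cap-and-trade" sketch in the spirit of a System
# Dynamics model; all coefficients are invented for illustration.

cap, emissions, gdp = 100.0, 120.0, 1000.0
price = 5.0
for year in range(2010, 2021):
    shortage = max(0.0, emissions - cap)      # permits demanded above the cap
    price *= 1.0 + 0.05 * shortage / cap      # scarcity pushes the permit price up
    emissions *= 1.0 - 0.002 * price          # abatement responds to price
    gdp *= 1.0 + 0.03 - 0.0005 * price        # high prices also damp growth
    cap *= 0.98                               # declining cap
    print(year, round(price, 1), round(emissions, 1), round(gdp, 0))
```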

  10. Pre-analysis techniques applied to area-based correlation aiming Digital Terrain Model generation

    Directory of Open Access Journals (Sweden)

    Maurício Galo

    2005-12-01

    Area-based matching is a useful procedure in some photogrammetric processes, and its results are of crucial importance in applications such as relative orientation, phototriangulation and Digital Terrain Model generation. The successful determination of correspondence depends on radiometric and geometric factors. Considering these aspects, the use of procedures that previously estimate the quality of the parameters to be computed is a relevant issue. This paper describes these procedures and shows that the quality prediction can be computed before performing matching by correlation, through analysis of the reference window. This procedure can be incorporated in the correspondence process for Digital Terrain Model generation and phototriangulation. The proposed approach comprises the estimation of the variance matrix of the translations from the gray levels in the reference window and the reduction of the search space using knowledge of the epipolar geometry. As a consequence, the correlation process becomes more reliable, avoiding the application of matching procedures in doubtful areas. Some experiments with simulated and real data are presented, demonstrating the efficiency of the studied strategy.

  11. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms, able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm, were compared for the reconstruction of a biological sample of Caenorhabditis elegans – a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.
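
    The MLEM update at the heart of the quantitative reconstruction is compact: each iteration multiplies the current image by the back-projected ratio of measured to predicted projections. A generic sketch (toy system matrix, not the JPIXET GPU implementation):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum Likelihood Expectation Maximisation for y ~ Poisson(A @ x).
    A is the system (projection) matrix; x the voxel values to reconstruct."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                      # column sums (sensitivity image)
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)  # measured / predicted projections
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Tiny toy system, stand-in for a PIXE-T sinogram (illustrative only)
rng = np.random.default_rng(6)
A = rng.uniform(0, 1, size=(40, 10))
x_true = rng.uniform(0, 5, size=10)
y = rng.poisson(A @ x_true)
print(np.round(mlem(A, y), 2))
print(np.round(x_true, 2))
```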

  12. Analysis of Arbitrary Reflector Antennas Applying the Geometrical Theory of Diffraction Together with the Master Points Technique

    Directory of Open Access Journals (Sweden)

    María Jesús Algar

    2013-01-01

    An efficient approach for the analysis of arbitrarily fed, surface-conformed reflector antennas is presented. The near field at a large number of sampling points in the aperture of the reflector is obtained by applying the Geometrical Theory of Diffraction (GTD). A new technique named Master Points has been developed to reduce the complexity of the ray-tracing computations. The combination of GTD and Master Points reduces the time requirements of this kind of analysis. To validate the new approach, several reflectors and the effects on the radiation pattern caused by shifting the feed and introducing different obstacles have been considered, concerning both simple and complex geometries. The results of these analyses have been compared with Method of Moments (MoM) results.

  13. Influence of elemental concentration in soil on vegetables applying analytical nuclear techniques: k0-instrumental neutron activation analysis and radiometry

    International Nuclear Information System (INIS)

    Menezes, Maria Angela de B.C.; Mingote, Raquel Maia; Silva, Lucilene Guerra e; Pedrosa, Lorena Gomes

    2005-01-01

    Samples from two vegetable gardens were analysed with the aim of determining elemental concentrations. The vegetables selected for study are grown by local people for their own use and are present in the daily meal. One vegetable garden studied is close to a mining activity in a region within the Iron Quadrangle (Quadrilatero Ferrifero), located in the Brazilian state of Minas Gerais. This region is considered one of the richest mineral-bearing regions in the world. The other vegetable garden studied is far from this region and without any mining activity; it was studied as a comparative site. This assessment was carried out to evaluate the elemental concentrations in soil and vegetables, matrices connected with the food chain, applying k0-Instrumental Neutron Activation Analysis (k0-INAA) at the Laboratory for Neutron Activation Analysis. However, this work reports only the results for thorium, uranium and rare-earth elements obtained in samples collected during the dry season, focusing on the influence of these elements on vegetable elemental composition. Results of natural radioactivity determined by gross alpha and gross beta measurements are also reported. This study is related to the BRA 11920 project, entitled 'Iron Quadrangle, Brazil: assessment of health impact caused by mining pollutants through chain food applying nuclear and related techniques', one of the research projects coordinated by the IAEA (Vienna, Austria). (author)

  14. Results of Applying Cultural Domain Analysis Techniques and Implications for the Design of Complementary Feeding Interventions in Northern Senegal.

    Science.gov (United States)

    Zobrist, Stephanie; Kalra, Nikhila; Pelto, Gretel; Wittenbrink, Brittney; Milani, Peiman; Diallo, Abdoulaye Moussa; Ndoye, Tidiane; Wone, Issa; Parker, Megan

    2017-12-01

    Designing effective nutrition interventions for infants and young children requires knowledge about the population to which the intervention is directed, including insights into the cognitive systems and values that inform caregiver feeding practices. To apply cultural domain analysis techniques in the context of implementation research for the purpose of understanding caregivers' knowledge frameworks in Northern Senegal with respect to infant and young child (IYC) feeding. This study was intended to inform decisions for interventions to improve infant and young child nutrition. Modules from the Focused Ethnographic Study for Infant and Young Child Feeding Manual were employed in interviews with a sample of 126 key informants and caregivers from rural and peri-urban sites in the Saint-Louis region of northern Senegal. Descriptive statistics, cluster analysis, and qualitative thematic analysis were used to analyze the data. Cluster analysis showed that caregivers identified 6 food clusters: heavy foods, light foods, snack foods, foraged foods, packaged foods, and foods that are good for the body. The study also revealed similarities and differences between the 2 study sites in caregivers' knowledge frameworks. The demonstration of differences between biomedical concepts of nutrition and the knowledge frameworks of northern Senegalese women with regard to IYC feeding highlights the value of knowledge about emic perspectives of local communities to help guide decisions about interventions to improve nutrition.

  15. Applying CFD in the Analysis of Heavy-Oil Transportation in Curved Pipes Using Core-Flow Technique

    Directory of Open Access Journals (Sweden)

    S Conceição

    2017-06-01

    Multiphase flow of oil, gas and water occurs in the petroleum industry from the reservoir to the processing units. The occurrence of heavy oils in the world is increasing significantly and points to the need for greater investment in reservoir exploitation and, consequently, for the development of new technologies for the production and transport of this oil. It is therefore of interest to improve techniques that increase energy efficiency in the transport of this oil. The core-flow technique is one of the most advantageous methods for lifting and transporting heavy oil. The core-flow technique does not alter the oil viscosity, but changes the flow pattern, reducing friction during heavy oil transportation. This flow pattern is characterized by a fine water film that forms close to the inner wall of the pipe, acting as a lubricant for the oil flowing in the core of the pipe. In this sense, the objective of this paper is to study the isothermal flow of heavy oil in curved pipelines employing the core-flow technique. A three-dimensional, transient and isothermal mathematical model that considers the mixture and k-ε turbulence models to address the gas-water-heavy oil three-phase flow in the pipe was applied for the analysis. Simulations with different flow patterns of the involved phases (oil-gas-water) have been done in order to optimize the transport of heavy oils. Results of pressure and volumetric fraction distributions of the involved phases are presented and analyzed. It was verified that the oil core, lubricated by a fine water layer flowing in the pipe, considerably decreases the pressure drop.

  16. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  17. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    Science.gov (United States)

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard halfpipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring (gross body path, splash area, and board tip motion), to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than with the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization. Copyright © 2014 Elsevier B.V. All rights reserved.
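
    The pipeline described (stack kinematics per dive, extract eigenposture weights with PCA, append splash and board features, regress against judges' scores) can be sketched on stand-in data as follows; all arrays below are synthetic, not the USA Diving dataset.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Conceptual sketch of the scoring pipeline on synthetic stand-in data.
rng = np.random.default_rng(7)
n_dives, n_frames, n_joints = 60, 100, 12
dives = rng.normal(size=(n_dives, n_frames * n_joints))   # flattened kinematics
splash_area = rng.uniform(0.1, 1.0, n_dives)
board_tip = rng.uniform(0.0, 0.3, n_dives)

weights = PCA(n_components=8).fit_transform(dives)        # eigen-dive coordinates
features = np.column_stack([weights, splash_area, board_tip])
scores = 8 - 3 * splash_area + rng.normal(0, 0.3, n_dives)  # fake judge scores

model = LinearRegression().fit(features, scores)
print("R^2 vs. judges:", round(model.score(features, scores), 2))
```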

  18. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques.

  19. Applied Behavior Analysis

    Science.gov (United States)

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  1. Nuclear and conventional techniques applied to the analysis of prehispanic metals of the Templo Mayor of Tenochtitlan

    International Nuclear Information System (INIS)

    Mendez M, U.

    2003-01-01

    The use of such experimental techniques as PIXE, RBS, metallography and SEM, applied to the characterization of prehispanic copper and gold metals coming from 9 offerings of the Templo Mayor of Tenochtitlan, makes it possible to obtain results and information on aspects such as technological development, cultural and commercial exchange, and relative chronology, as well as on conservation, authenticity, symbolic association and the social meaning of the offerings. More precisely, the objectives outlined for this study are the following. To carry out interpretations of manufacturing techniques, stylistic designs and cultural and commercial exchanges based on aspects such as microstructure, elemental composition, type of alloys, presence of welding, surface gilding, and conservation. To determine the technological advance represented by the processing of the metallic materials and to know their location in the archaeological context, as a means for interpreting the social significance of the offering. To know the possible symbolic-religious association of the metallic objects offered to the deities, based on significant characteristics such as color, form and function. To establish whether the objects found in the offerings are of the same temporality as the offerings themselves or, at least, to locate the objects within the two stages of the development of metallurgy, known as the native copper period and the alloy period; this helps to determine a relative chronology of when the objects were manufactured. To confirm the authenticity of the objects. To determine, in a precise way, the state of conservation of the pieces. To corroborate some of the manufacturing processes; this is achieved by means of the reproduction of objects in the laboratory, to establish comparisons and differences with the prehispanic pieces.

  2. New technique of in situ soil moisture sampling for environmental isotope analysis applied at 'Pilat-dune' near Bordeaux

    International Nuclear Information System (INIS)

    Thoma, G.; Esser, N.; Sonntag, C.; Weiss, W.; Rudolph, J.; Leveque, P.

    1978-01-01

    A new soil-air suction method with soil water vapor adsorption on a 4 A molecular sieve provides soil moisture samples from various depths for environmental isotope analysis and yields soil temperature profiles. A field tritium tracer experiment shows that this in situ sampling method has an isotope profile resolution of only about 5-10 cm. Application of this method in the Pilat sand dune (Bordeaux, France) yielded deuterium and tritium profiles down to 25 meters depth. Bomb tritium measurements of monthly lysimeter percolate samples available since 1961 show that the tritium response has a mean delay of 5 months in the case of a sand lysimeter and of 2.5 years for a loess loam lysimeter. A simple HETP model simulates the layered downward movement of soil water and the longitudinal dispersion in the lysimeters. With field capacity and evapotranspiration taken as open parameters, the model yields tritium concentration values for the lysimeters' percolate which are in close agreement with the experimental results. Based on local meteorological data, the HETP model applied to tritium tracer experiments in the unsaturated zone further yields a prediction of the momentary tracer position and of the soil moisture distribution. This prediction can be checked experimentally at selected intervals by coring. (orig.) [de
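
    The HETP model mentioned above can be read as a cascade of well-mixed layers. The following generic mixing-cell sketch simulates a tracer pulse percolating through a lysimeter column; the layer count, increment ratio and single pulse are assumptions for illustration, not the paper's calibrated parameters.

```python
# Generic mixing-cell (HETP-type) sketch: each step, a water increment
# enters a layer, mixes fully, and the same volume passes downward.
import numpy as np

n_layers, n_steps = 20, 300
r = 0.3                          # percolation increment / layer water volume
conc = np.zeros(n_layers)        # tracer concentration in each layer
breakthrough = []                # percolate concentration over time

for t in range(n_steps):
    c_in = 1.0 if t == 0 else 0.0                    # pulse at the surface
    for i in range(n_layers):
        conc[i] = (conc[i] + r * c_in) / (1.0 + r)   # mix increment into layer
        c_in = conc[i]                               # outflow feeds next layer
    breakthrough.append(c_in)

print(f"tracer peak leaves the column after ~{int(np.argmax(breakthrough))} steps")
```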

  3. Functional reasoning, explanation and analysis: Part 1: a survey on theories, techniques and applied systems. Part 2: qualitative function formation technique

    International Nuclear Information System (INIS)

    Far, B.H.

    1992-01-01

    Functional Reasoning (FR) enables people to derive the purpose of objects and explain their functions. JAERI's Human Acts Simulation Program (HASP), started in 1987, has the goal of developing the underlying technologies for intelligent robots by imitating the intelligent behavior of humans. FR is considered a useful reasoning method in HASP and is applied to understanding the function of tools and objects in the Toolbox Project. In this report, first, the results of the diverse FR researches within a variety of disciplines are reviewed and the common core and basic problems are identified. Then the qualitative function formation (QFF) technique is introduced. Some novel points are: extending the common qualitative models to include interactions and timing of events by defining temporal and dependency constraints, and binding this with conventional qualitative simulation. Function concepts are defined as interpretations of either a persistence or an order in the sequence of states, using the trace of the qualitative state vector derived by qualitative simulation on the extended qualitative model. This offers a solution to some of the FR problems and leads to a method for generalization and comparison of functions of different objects. (author) 85 refs

  4. Early counterpulse technique applied to vacuum interrupters

    International Nuclear Information System (INIS)

    Warren, R.W.

    1979-11-01

    Interruption of dc currents using counterpulse techniques is investigated with vacuum interrupters and a novel approach in which the counterpulse is applied before contact separation. In this way, important increases have been achieved in the maximum interruptible current, as well as large reductions in contact erosion. The factors establishing these new limits are presented, and ways to make further improvements are discussed

  5. Statistical techniques applied to aerial radiometric surveys (STAARS): series introduction and the principal-components-analysis method

    International Nuclear Information System (INIS)

    Pirkle, F.L.

    1981-04-01

    STAARS is a new series which is being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of a particular data interpretation technique to geologic understanding for delineating regions favorable to uranium deposition is the primary concern of STAARS. Statements concerning the utility of a technique on aerial reconnaissance data as well as detailed aerial survey data will be included

  6. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  7. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...

  8. Early counterpulse technique applied to vacuum interrupters

    International Nuclear Information System (INIS)

    Warren, R.W.

    1979-01-01

    Interruption of dc currents using counterpulse techniques is investigated with vacuum interrupters and a novel approach in which the counterpulse is applied before contact separation. In this way, important increases have been achieved in the maximum interruptible current, as well as large reductions in contact erosion. The factors establishing these new limits are presented, and ways to make further improvements to the maximum interruptible current are discussed

  9. Nuclear techniques (PIXE and RBS) applied to analysis of pre hispanic metals of the Templo Mayor at Tenochtitlan

    International Nuclear Information System (INIS)

    Mendez U, I.; Tenorio, D.; Galvan, J.L.

    2000-01-01

    This work has the objective of determining, by means of nuclear techniques (PIXE and RBS), the composition and the alloy type of diverse Aztec ornaments from the Postclassic period, manufactured principally of copper and gold, such as bells, beads and disks, all of them belonging to 9 offerings of the Templo Mayor of Tenochtitlan. The historical and archaeological background of the artifacts is briefly presented, as well as the analytical methods, concluding with the results obtained. (Author)

  10. Automated Damage Onset Analysis Techniques Applied to KDP Damage and the Zeus Small Area Damage Test Facility

    International Nuclear Information System (INIS)

    Sharp, R.; Runkel, M.

    1999-01-01

    Automated damage testing of KDP using LLNL's Zeus automated damage test system has allowed the statistics of KDP bulk damage to be investigated. Samples are now characterized by the cumulative damage probability curve, or S-curve, that is generated from hundreds of individual test sites per sample. A HeNe laser/PMT scatter diagnostic is used to determine the onset of damage at each test site. The nature of KDP bulk damage is such that each scatter signal may possess many different indicators of a damage event. Because of this, the determination of the initial onset for each scatter trace is not a straightforward affair and has required considerable manual analysis. The amount of testing required by crystal development for the National Ignition Facility (NIF) has made it impractical to continue analysis by hand. Because of this, we have developed and implemented algorithms for analyzing the scatter traces by computer. We discuss the signal cleaning algorithms and damage determination criteria that have lead to the successful implementation of a LabView based analysis code. For the typical R/1 damage data set, the program can find the correct damage onset in more than 80% of the cases, with the remaining 20% being left to operator determination. The potential time savings for data analysis is on the order of ∼ 100X over manual analysis and is expected to result in the savings of at least 400 man-hours over the next 3 years of NIF quality assurance testing
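
    A minimal sketch of the kind of onset criterion described: smooth the scatter trace, estimate the noise level from a pre-damage baseline, and report the first sustained excursion above a threshold. The window lengths, the 5-sigma factor and the synthetic trace are invented; this is not LLNL's LabView code.

```python
# Illustrative damage-onset detector for a scatter-diagnostic trace.
import numpy as np

def damage_onset(trace, baseline_len=100, k=5.0, hold=5):
    """Index of the first sample where the smoothed trace stays above
    mean + k*sigma of the baseline for `hold` consecutive samples."""
    smooth = np.convolve(trace, np.ones(5) / 5, mode="same")   # boxcar clean
    mu, sigma = smooth[:baseline_len].mean(), smooth[:baseline_len].std()
    above = smooth > mu + k * sigma
    for i in range(len(above) - hold):
        if above[i:i + hold].all():
            return i
    return None   # no clear event -> left to operator determination

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 0.1, 1000)
trace[600:] += 2.0                     # synthetic damage scatter step
print("onset index:", damage_onset(trace))
```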

  11. Computational optimization techniques applied to microgrids planning

    DEFF Research Database (Denmark)

    Gamarra, Carlos; Guerrero, Josep M.

    2015-01-01

    Microgrids are expected to become part of the next electric power system evolution, not only in rural and remote areas but also in urban communities. Since microgrids are expected to coexist with traditional power grids (such as district heating does with traditional heating systems......), their planning process must address economic feasibility as a long-term stability guarantee. Planning a microgrid is a complex process due to existing alternatives, goals, constraints and uncertainties. Usually, planning goals conflict with each other and, as a consequence, different optimization problems...... appear along the planning process. In this context, the technical literature on optimization techniques applied to microgrid planning has been reviewed, and guidelines for innovative planning methodologies focused on economic feasibility can be defined. Finally, some trending techniques and new...

  12. Modern problems in applied analysis

    CERN Document Server

    Rogosin, Sergei

    2018-01-01

    This book features a collection of recent findings in Applied Real and Complex Analysis that were presented at the 3rd International Conference “Boundary Value Problems, Functional Equations and Applications” (BAF-3), held in Rzeszow, Poland on 20-23 April 2016. The contributions presented here develop a technique related to the scope of the workshop and touching on the fields of differential and functional equations, complex and real analysis, with a special emphasis on topics related to boundary value problems. Further, the papers discuss various applications of the technique, mainly in solid mechanics (crack propagation, conductivity of composite materials), biomechanics (viscoelastic behavior of the periodontal ligament, modeling of swarms) and fluid dynamics (Stokes and Brinkman type flows, Hele-Shaw type flows). The book is addressed to all readers who are interested in the development and application of innovative research results that can help solve theoretical and real-world problems.

  13. Surface analytical techniques applied to minerals processing

    International Nuclear Information System (INIS)

    Smart, R.St.C.

    1991-01-01

    An understanding of the chemical and physical forms of the chemically altered layers on the surfaces of base metal sulphides, particularly in the form of hydroxides, oxyhydroxides and oxides, and of the changes that occur in them during minerals processing, lies at the core of a complete description of flotation chemistry. This paper reviews the application of a variety of surface-sensitive techniques and methodologies to the study of surface layers on single minerals, mixed minerals, synthetic ores and real ores. Evidence from combined XPS/SAM/SEM studies has provided images and analyses of three forms of oxide, oxyhydroxide and hydroxide products on the surfaces of single sulphide minerals, mineral mixtures and complex sulphide ores. 4 refs., 2 tabs., 4 figs

  14. Gas chromatography/ion trap mass spectrometry applied for the analysis of triazine herbicides in environmental waters by an isotope dilution technique

    International Nuclear Information System (INIS)

    Cai Zongwei; Wang Dongli; Ma, W.T.

    2004-01-01

    A gas chromatography/ion trap mass spectrometry method was developed for the analysis of simazine, atrazine and cyanazine, as well as the degradation products of atrazine, such as deethylatrazine and deisopropylatrazine, in environmental water samples. An isotope dilution technique was applied for the quantitative analysis of atrazine in water at low ng/l levels. One liter of water sample spiked with the stable isotope internal standard atrazine-d5 was extracted with a C18 solid-phase extraction cartridge. The analysis was performed on an ion trap mass spectrometer operated in MS/MS mode. The extraction recoveries were in the range of 83-94% for the triazine herbicides in water at concentrations of 24, 200, and 1000 ng/l, while poor recoveries were obtained for the degradation products of atrazine. The relative standard deviations (R.S.D.) were within the range of 3.2-16.1%. The detection limits of the method were between 0.75 and 12 ng/l when 1 l of water was analyzed. The method was successfully applied to analyze environmental water samples collected from a reservoir and a river in Hong Kong, with atrazine detected at concentrations between 3.4 and 26 ng/l
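
    The isotope dilution quantitation reduces to a simple area-ratio calculation once a response factor is known; extraction losses largely cancel because the deuterated internal standard behaves like the analyte. The sketch below uses made-up peak areas and a hypothetical idms_concentration helper, not the paper's data.

```python
# Sketch of internal-standard isotope-dilution quantitation.

def idms_concentration(area_analyte, area_is, conc_is, response_factor):
    """Analyte concentration from the peak-area ratio to the labeled IS."""
    return (area_analyte / area_is) * conc_is / response_factor

# Calibration: known atrazine at 200 ng/L with 200 ng/L atrazine-d5.
rf = (15000 / 14000) / (200 / 200)        # (area ratio) / (conc ratio)

# Water sample spiked with 200 ng/L atrazine-d5 before extraction.
conc = idms_concentration(area_analyte=820, area_is=13500,
                          conc_is=200.0, response_factor=rf)
print(f"atrazine: {conc:.1f} ng/L")       # ~11 ng/L with these invented areas
```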

  15. Motion Capture Technique Applied Research in Sports Technique Diagnosis

    Directory of Open Access Journals (Sweden)

    Zhiwu LIU

    2014-09-01

    Full Text Available The paper describes the definition of a motion capture technology system and examines its components; key parameters are obtained from the captured motion, quantitative analysis is carried out on technical movements, and a motion capture method is proposed for sports technique diagnosis. The motion capture procedure includes calibrating the system, attaching landmarks to the subject, capturing the trajectories, and analyzing the collected data.

  16. Applying DEA Technique to Library Evaluation in Academic Research Libraries.

    Science.gov (United States)

    Shim, Wonsik

    2003-01-01

    This study applied an analytical technique called Data Envelopment Analysis (DEA) to calculate the relative technical efficiency of 95 academic research libraries, all members of the Association of Research Libraries. DEA, with the proper model of library inputs and outputs, can reveal best practices in the peer groups, as well as the technical…
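
    For readers unfamiliar with DEA, the sketch below solves the input-oriented CCR envelopment model with scipy's linprog: each unit's efficiency is the smallest theta such that a convex-cone combination of peers matches its outputs using at most theta times its inputs. The four-library input/output matrix is invented, not ARL data.

```python
# Input-oriented CCR DEA sketch: one small LP per decision-making unit.
import numpy as np
from scipy.optimize import linprog

X = np.array([[5.0, 3.0], [8.0, 1.0], [7.0, 4.0], [4.0, 2.0]])  # inputs
Y = np.array([[6.0], [3.0], [8.0], [5.0]])                      # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for o in range(n):
    c = np.r_[1.0, np.zeros(n)]              # minimize theta
    # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o], X.T]
    # outputs: sum_j lam_j * y_rj >= y_ro
    A_out = np.c_[np.zeros(s), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    print(f"DMU {o}: efficiency = {res.x[0]:.3f}")
```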

  17. Nuclear analytical techniques applied to forensic chemistry

    International Nuclear Information System (INIS)

    Nicolau, Veronica; Montoro, Silvia; Pratta, Nora; Giandomenico, Angel Di

    1999-01-01

    Gunshot residues produced by firing guns are mainly composed of visible particles. The individual characterization of these particles allows those containing heavy metals, originating from gunshot residues, to be distinguished from those having a different origin or history. In this work, the results obtained from the study of gunshot residue particles collected from hands are presented. The aim of the analysis is to establish whether a person has fired a gun or has been in contact with one after the shot was produced. As reference samples, particles collected from the hands of persons engaged in different activities were studied for comparison. The complete study was based on the application of nuclear analytical techniques such as Scanning Electron Microscopy, Energy Dispersive X Ray Electron Probe Microanalysis and Graphite Furnace Atomic Absorption Spectrometry. The assays can be completed within a time compatible with forensic requirements. (author)

  18. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  19. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  20. 3-D portal image analysis in clinical practice: an evaluation of 2-D and 3-D analysis techniques as applied to 30 prostate cancer patients

    International Nuclear Information System (INIS)

    Remeijer, Peter; Geerlof, Erik; Ploeger, Lennert; Gilhuijs, Kenneth; Herk, Marcel van; Lebesque, Joos V.

    2000-01-01

    Purpose: To investigate the clinical importance and feasibility of a 3-D portal image analysis method in comparison with a standard 2-D portal image analysis method for pelvic irradiation techniques. Methods and Materials: In this study, images of 30 patients who were treated for prostate cancer were used. A total of 837 imaged fields were analyzed by a single technologist, using automatic 2-D and 3-D techniques independently. Standard deviations (SDs) of the random, systematic, and overall variations, and the overall mean were calculated for the resulting data sets (2-D and 3-D), in the three principal directions (left-right [L-R], cranial-caudal [C-C], anterior-posterior [A-P]). The 3-D analysis included rotations as well. For the translational differences between the three data sets, the overall SD and overall mean were computed. The influence of out-of-plane rotations on the 2-D registration accuracy was determined by analyzing the difference between the 2-D and 3-D translation data as function of rotations. To assess the reliability of the 2-D and 3-D methods, the number of times the automatic match was manually adjusted was counted. Finally, an estimate of the workload was made. Results: The SDs of the random and systematic components of the rotations around the three orthogonal axes were 1.1 (L-R), 0.6 (C-C), 0.5 (A-P) and 0.9 (L-R), 0.6 (C-C), 0.8 (A-P) degrees, respectively. The overall mean rotation around the L-R axis was 0.7 deg., which deviated significantly from zero. Translational setup errors were comparable for 2-D and 3-D analysis (ranging from 1.4 to 2.2 mm SD and from 1.5 to 2.5 mm SD, respectively). The variation of the difference between the 2-D and 3-D translation data increased from 1.1 mm (SD) for zero rotations to 2.7 mm (SD) for out-of-plane rotations of 3 deg., due to a reduced 2-D registration accuracy for large rotations. The number of times the analysis was not considered acceptable and was manually adjusted was 44% for the 2-D
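
    The decomposition into systematic and random components reported above is commonly computed as follows: the systematic SD is the spread of the per-patient mean errors, and the random SD is the root mean square of the per-patient SDs. A sketch on synthetic one-axis translations (not the study's data):

```python
# Separating systematic and random setup-error components, in mm.
import numpy as np

rng = np.random.default_rng(2)
n_patients, n_fractions = 30, 28
true_systematic, true_random = 1.8, 2.0        # mm, assumed for the demo

patient_means = rng.normal(0, true_systematic, n_patients)
errors = patient_means[:, None] + rng.normal(
    0, true_random, (n_patients, n_fractions))

overall_mean = errors.mean()
systematic_sd = errors.mean(axis=1).std(ddof=1)          # spread of means
random_sd = np.sqrt((errors.std(axis=1, ddof=1) ** 2).mean())  # RMS of SDs

print(f"overall mean {overall_mean:.2f} mm, "
      f"systematic {systematic_sd:.2f} mm, random {random_sd:.2f} mm")
```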

  1. Applying Brainstorming Techniques to EFL Classroom

    OpenAIRE

    Toshiya, Oishi; Shohoku College; Part-time Lecturer at Shohoku College

    2015-01-01

    This paper focuses on brainstorming techniques for English language learners. The author's teaching experiences at Shohoku College during the academic year 2014-2015 made the importance of brainstorming techniques evident. The author explored three elements of brainstorming techniques for writing using literature reviews: lack of awareness, connecting to prior knowledge, and creativity. The literature reviews showed the advantage of using brainstorming techniques in an English compos...

  2. Applying the change vector analysis technique to assess the desertification risk in the south-west of Romania in the period 1984-2011.

    Science.gov (United States)

    Vorovencii, Iosif

    2017-09-26

    The desertification risk affects around 40% of the agricultural land in various regions of Romania. The purpose of this study is to analyse the risk of desertification in the south-west of Romania in the period 1984-2011 using the change vector analysis (CVA) technique and Landsat thematic mapper (TM) satellite images. CVA was applied to combinations of normalised difference vegetation index (NDVI)-albedo, NDVI-bare soil index (BI) and tasselled cap greenness (TCG)-tasselled cap brightness (TCB). The combination NDVI-albedo proved to be the best in assessing the desertification risk, with an overall accuracy of 87.67%, identifying a desertification risk on 25.16% of the studied area. The maps were classified into the following classes: desertification risk, re-growing and persistence. Four degrees of desertification risk and re-growing were used: low, medium, high and extreme. Using the combination NDVI-albedo, 0.53% of the analysed surface was assessed as having an extreme degree of desertification risk, 3.93% a high degree, 8.72% a medium degree and 11.98% a low degree. The driving forces behind the risk of desertification include both anthropogenic and climatic causes. The anthropogenic causes include the destruction of the irrigation system, deforestation, the destruction of the forest shelterbelts, the fragmentation of agricultural land and its inefficient management. The climatic causes include increased temperatures, frequent and prolonged droughts and a decline in the amount of precipitation.
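
    The CVA step itself is compact: per-pixel change vectors in the NDVI-albedo feature space are thresholded on magnitude, and their direction separates desertification risk (NDVI down, albedo up) from re-growing (NDVI up, albedo down). The sketch below uses invented rasters and an arbitrary threshold:

```python
# Minimal change vector analysis in an NDVI-albedo feature space.
import numpy as np

rng = np.random.default_rng(3)
shape = (4, 4)                                  # toy raster
ndvi_1984, ndvi_2011 = rng.uniform(0, 1, shape), rng.uniform(0, 1, shape)
alb_1984, alb_2011 = rng.uniform(0, 1, shape), rng.uniform(0, 1, shape)

d_ndvi, d_alb = ndvi_2011 - ndvi_1984, alb_2011 - alb_1984
magnitude = np.hypot(d_ndvi, d_alb)             # change vector magnitude

threshold = 0.2                                 # "persistence" below this
risk = (magnitude > threshold) & (d_ndvi < 0) & (d_alb > 0)
regrow = (magnitude > threshold) & (d_ndvi > 0) & (d_alb < 0)

print("risk pixels:", risk.sum(), "| re-growing pixels:", regrow.sum())
```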

  3. Nuclear radioactive techniques applied to materials research

    CERN Document Server

    Correia, João Guilherme; Wahl, Ulrich

    2011-01-01

    In this paper we review materials characterization techniques using radioactive isotopes at the ISOLDE/CERN facility. At ISOLDE intense beams of chemically clean radioactive isotopes are provided by selective ion-sources and high-resolution isotope separators, which are coupled on-line with particle accelerators. There, new experiments are performed by an increasing number of materials researchers, which use nuclear spectroscopic techniques such as Mössbauer, Perturbed Angular Correlations (PAC), beta-NMR and Emission Channeling with short-lived isotopes not available elsewhere. Additionally, diffusion studies and traditionally non-radioactive techniques as Deep Level Transient Spectroscopy, Hall effect and Photoluminescence measurements are performed on radioactive doped samples, providing in this way the element signature upon correlation of the time dependence of the signal with the isotope transmutation half-life. Current developments, applications and perspectives of using radioactive ion beams and tech...

  4. Determination of palladium in biological samples applying nuclear analytical techniques

    International Nuclear Information System (INIS)

    Cavalcante, Cassio Q.; Sato, Ivone M.; Salvador, Vera L. R.; Saiki, Mitiko

    2008-01-01

    This study presents Pd determinations in bovine tissue samples containing palladium prepared in the laboratory, and CCQM-P63 automotive catalyst materials of the Proficiency Test, using instrumental thermal and epithermal neutron activation analysis and energy dispersive X-ray fluorescence techniques. Solvent extraction and solid phase extraction procedures were also applied to separate Pd from interfering elements before the irradiation in the nuclear reactor. The results obtained by different techniques were compared against each other to examine sensitivity, precision and accuracy. (author)

  5. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  6. Applying Mixed Methods Techniques in Strategic Planning

    Science.gov (United States)

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  7. Applying Nonverbal Techniques to Organizational Diagnosis.

    Science.gov (United States)

    Tubbs, Stewart L.; Koske, W. Cary

    Ongoing research programs conducted at General Motors Institute are motivated by the practical objective of improving the company's organizational effectiveness. Computer technology is being used whenever possible; for example, a technique developed by Herman Chernoff was used to process data from a survey of employee attitudes into 18 different…

  8. Applying Cooperative Techniques in Teaching Problem Solving

    Directory of Open Access Journals (Sweden)

    Krisztina Barczi

    2013-12-01

    Full Text Available Teaching how to solve problems – from solving simple equations to solving difficult competition tasks – has been one of the greatest challenges for mathematics education for many years. Trying to find an effective method is an important educational task. Among others, the question arises as to whether a method in which students help each other might be useful. The present article describes part of an experiment that was designed to determine the effects of cooperative teaching techniques on the development of problem-solving skills.

  9. Basic principles of applied nuclear techniques

    International Nuclear Information System (INIS)

    Basson, J.K.

    1976-01-01

    The technological applications of radioactive isotopes and radiation in South Africa have grown steadily since the first consignment of man-made radioisotopes reached this country in 1948. By the end of 1975 there were 412 authorised non-medical organisations (327 industries) using hundreds of sealed sources as well as their fair share of the thousands of radioisotope consignments, annually either imported or produced locally (mainly for medical purposes). Consequently, it is necessary for South African technologists to understand the principles of radioactivity in order to appreciate the industrial applications of nuclear techniques [af

  10. Dosimetry techniques applied to thermoluminescent age estimation

    International Nuclear Information System (INIS)

    Erramli, H.

    1986-12-01

    The reliability and the ease of field application of the measuring techniques of natural radioactivity dosimetry are studied. The natural radioactivity in minerals is composed of the internal dose deposited by alpha and beta radiations issued from the sample itself and the external dose deposited by gamma and cosmic radiations issued from the surroundings of the sample. Two techniques for external dosimetry are examined in detail: TL dosimetry and field gamma dosimetry. Calibration and experimental conditions are presented. A new integrated dosimetric method for measuring the internal and external dose is proposed: the TL dosimeter is placed in the soil under exactly the same conditions as the sample, for a time long enough for the total dose evaluation [fr

  11. Tracer techniques applied to groundwater studies

    International Nuclear Information System (INIS)

    Sanchez, W.

    1975-01-01

    The determination of several aquifer characteristics, primarily in the saturated zone, namely porosity, permeability, transmissivity, dispersivity, and the direction and velocity of sub-surface water, is presented. These techniques are based on the utilization of artificial radioisotopes. Only field determinations of porosity are considered here; their advantages over laboratory measurements are: better representation of the volume average, insensitivity to local inhomogeneities and no distortion of the structure due to sampling. The radioisotope dilution method is used to obtain an independent and direct measurement of the filtration velocity in a water-bearing formation under a natural or induced hydraulic gradient. The velocity of the flow is usually calculated from Darcy's formula through the measurement of gradients and requires a knowledge of the permeability of the formation. The filtration velocity, interpreted in conjunction with other parameters, can, under favourable conditions, provide valuable information on the permeability, transmissibility and amount of water moving through an aquifer

  12. Technique applied in electrical power distribution for Satellite Launch Vehicle

    Directory of Open Access Journals (Sweden)

    João Maurício Rosário

    2010-09-01

    Full Text Available The Satellite Launch Vehicle electrical network, which is currently being developed in Brazil, is sub-divided for analysis into the following parts: Service Electrical Network, Controlling Electrical Network, Safety Electrical Network and Telemetry Electrical Network. During the pre-launching and launching phases, these electrical networks are associated electrically and mechanically with the structure of the vehicle. In order to succeed in the integration of these electrical networks, it is necessary to employ electrical power distribution techniques appropriate to Launch Vehicle systems. This work presents the most important techniques to be considered in the characterization of the electrical power supply applied to Launch Vehicle systems. Such techniques are primarily designed to allow the electrical networks, when subjected to a single-phase fault to ground, to keep supplying power to the loads.

  13. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned... ...on applied CA, the application of basic CA's principles, methods, and findings to the study of social domains and practices that are interactionally constituted. We consider three strands—foundational, social problem oriented, and institutional applied CA—before turning to recent developments in CA research on learning and development. In conclusion, we address some emerging themes in the relationship of CA and applied linguistics, including the role of multilingualism, standard social science methods as research objects, CA's potential for direct social intervention, and increasing efforts to complement CA...

  14. Applied analysis and differential equations

    CERN Document Server

    Cârj, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics, including the most exciting recent developments. A broad range of topics of recent interest are treated: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  15. Lessons learned in applying function analysis

    International Nuclear Information System (INIS)

    Mitchel, G.R.; Davey, E.; Basso, R.

    2001-01-01

    This paper summarizes the lessons learned in undertaking and applying function analysis, based on the recent experience of utility, AECL and international design and assessment projects. Function analysis is an analytical technique that can be used to characterize and assess the functions of a system, and is widely recognized as an essential component of a 'systematic' approach to design, one that integrates operational and user requirements into the standard design process. (author)

  16. Analytical techniques applied to study cultural heritage objects

    International Nuclear Information System (INIS)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N.

    2015-01-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process in the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Expanding the possibilities of analysis further, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy that can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects of different University of Sao Paulo museums. Improving the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external beam setup

  17. Analytical techniques applied to study cultural heritage objects

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N., E-mail: rizzutto@if.usp.br [Universidade de Sao Paulo (USP), SP (Brazil). Instituto de Fisica

    2015-07-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process in the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Expanding the possibilities of analysis further, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy that can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects of different University of Sao Paulo museums. Improving the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external beam setup

  18. Archaeometry: nuclear and conventional techniques applied to the archaeological research

    International Nuclear Information System (INIS)

    Esparza L, R.; Cardenas G, E.

    2005-01-01

    The book presented here is formed by twelve articles that approach, from different perspectives, topics such as archaeological prospecting, the analysis of pre-Hispanic and colonial ceramics, obsidian and mural painting, as well as dating and questions about data handling. Following the chronological order in which exploration techniques and laboratory studies are required, texts on the systematic and detailed study of archaeological sites are presented first, followed by topics related to the application of diverse nuclear techniques such as PIXE, RBS, XRD, NAA, SEM, Moessbauer spectroscopy and other conventional techniques. Multidisciplinarity is a salient aspect of this work, owing to the great specialization of the research presented, in the archaeological studies as much as in the field work of topography, mapping and excavation and, of course, in the laboratory tests. Most of the articles are the result of several years of investigation, as consigned in each article. The texts gathered here emphasize the technical aspects of each investigation: the modern computer systems applied to archaeological prospecting and mapping, and the chemical and physical analysis of organic materials, metal artifacts, diverse rocks used in pre-Hispanic times, and mural and ceramic paintings, characteristics that justly underline the potential of collective works. (Author)

  19. Tensometry technique for X-ray diffraction in applied analysis of welding; Tensometria por tecnica de difracao de raios X aplicada na analise de soldagens

    Energy Technology Data Exchange (ETDEWEB)

    Turibus, S.N.; Caldas, F.C.M.; Miranda, D.M.; Monine, V.I.; Assis, J.T., E-mail: snturibus@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (IPRJ/UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico

    2010-07-01

    This paper presents the analysis of the residual stress introduced by the welding process. As stress in a material can induce damage, it is necessary to have a method to identify this residual stress state. For this, the non-destructive X-ray diffraction technique was used to analyze two plates of A36 steel joined by metal inert gas (MIG) welding. The stress measurements were made by the sin²ψ method in the weld region of the steel plates, including analysis of the longitudinal and transverse residual stresses in the fusion zone, heat affected zone (HAZ) and base metal. To determine the stress distribution along the depth of the welded material, surface layers were removed by electropolishing. (author)
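
    For reference, the sin²ψ evaluation amounts to a linear regression of the measured lattice spacing against sin²ψ; the slope, combined with the elastic constants, yields the in-plane stress. The sketch below uses generic values for an A36-type steel and ideal synthetic spacings, not the paper's measurements:

```python
# sin^2(psi) residual stress evaluation on ideal synthetic data.
import numpy as np

E, nu = 210e9, 0.29                  # Pa, assumed elastic constants for steel
d0 = 1.1702e-10                      # m, unstressed lattice spacing (assumed)

psi = np.deg2rad([0, 18, 27, 33, 39, 45])      # tilt angles
sin2psi = np.sin(psi) ** 2
stress_true = 150e6                  # Pa, synthetic residual stress
d = d0 * (1 + (1 + nu) / E * stress_true * sin2psi)   # ideal measurements

slope, _ = np.polyfit(sin2psi, d, 1)           # d vs sin^2(psi) regression
stress = slope / d0 * E / (1 + nu)
print(f"recovered stress: {stress / 1e6:.1f} MPa")
```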

  20. Radiation measurement and inverse analysis techniques applied on the determination of the apparent mass diffusion coefficient for diverse contaminants and soil samples

    International Nuclear Information System (INIS)

    Rey Silva, D.V.F.M.; Oliveira, A.P.; Macacini, J.F.; Da Silva, N.C.; Cipriani, M.; Quinelato, A.L.

    2005-01-01

    Full text of publication follows: The study of the dispersion of radioactive materials in soils and in engineering barriers plays an important role in the safety analysis of nuclear waste repositories. In order to proceed with this kind of study, the physical properties involved must be determined with precision, including the apparent mass diffusion coefficient, which is defined as the ratio between the effective mass diffusion coefficient and the retardation factor. Many different experimental and estimation techniques are available in the literature for the identification of the diffusion coefficient, and this work describes the implementation of that developed by Pereira et al [1]. This technique is based on non-intrusive radiation measurements, and the experimental setup consists of a cylindrical column filled with compacted media saturated with water. A radioactive contaminant is mixed with a portion of the media and then placed at the bottom of the column. The contaminant therefore diffuses through the uncontaminated media due to the concentration gradient. A radiation detector is used to measure the number of counts, which is associated with the contaminant concentration, at several positions along the column during the experiment. Such measurements are then used to estimate the apparent diffusion coefficient of the contaminant in the porous media by inverse analysis. The inverse problem of parameter estimation is solved with the Levenberg-Marquardt method of minimization of the least-squares norm. The experiment was optimized with respect to the number of measurement locations, frequency of measurements and duration of the experiment through the analysis of the sensitivity coefficients and by using a D-optimum approach. This setup is suitable for studying a great number of combinations of diverse contaminants and porous media varying in composition and compacting, with considerable easiness and reliable results, and it was chosen because that is the
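
    A minimal sketch of the inverse step, assuming a plane-source diffusion profile as the forward model: scipy's curve_fit with method="lm" performs the Levenberg-Marquardt minimization of the least-squares norm. The experiment duration, detector positions and data are invented; fitting log10(D) keeps the coefficient positive during the iterations.

```python
# Levenberg-Marquardt estimation of an apparent diffusion coefficient.
import numpy as np
from scipy.optimize import curve_fit

T = 30 * 86400.0                            # 30-day experiment (assumed)

def profile(x, log10_D, M):
    """Counts vs depth for an instantaneous plane source at x = 0."""
    D = 10.0 ** log10_D                     # keeps D positive during LM steps
    return M / np.sqrt(np.pi * D * T) * np.exp(-x**2 / (4 * D * T))

x = np.linspace(0.0, 0.3, 15)               # detector positions along column, m
rng = np.random.default_rng(4)
data = profile(x, np.log10(2e-10), 5.0) * (1 + 0.03 * rng.normal(size=x.size))

popt, _ = curve_fit(profile, x, data, p0=(-10.0, 1.0), method="lm")
print(f"estimated apparent D = {10 ** popt[0]:.2e} m^2/s")
```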

  1. Construction and performance characterization of ion-selective electrodes for potentiometric determination of pseudoephedrine hydrochloride applying batch and flow injection analysis techniques.

    Science.gov (United States)

    Zayed, Sayed I M; Issa, Yousry M; Hussein, Ahmed

    2006-01-01

    New pseudoephedrine selective electrodes have been constructed of the conventional polymer membrane type by incorporation of pseudoephedrine-phosphotungstate (PE-PT) or pseudoephedrine-silicotungstate (PE-SiT) ion-associates in a poly(vinyl chloride) (PVC) membrane plasticized with dibutyl phthalate (DBP). The electrodes were fully characterized in terms of membrane composition, temperature, and pH. The electrodes exhibited mean calibration-graph slopes of 57.09 and 56.10 mV concentration decade(-1) of pseudoephedrine hydrochloride (PECl) at 25 degrees C for the PE-PT and PE-SiT electrodes, respectively. The electrodes showed fast, stable, and near-Nernstian response over the concentration ranges 6.31 x 10(-6)-1.00 x 10(-2) and 5.00 x 10(-5)-1.00 x 10(-2) M in the case of PE-PT, applying batch and flow injection (FI) analysis, respectively, and 1.00 x 10(-5)-1.00 x 10(-2) and 5.00 x 10(-5)-1.00 x 10(-2) M in the case of PE-SiT for the batch and FI analysis systems, respectively. The detection limit was 5.01 x 10(-6) M for the PE-PT electrode and 6.31 x 10(-6) M for the PE-SiT electrode. The electrodes were successfully applied for the potentiometric determination of PECl in pharmaceutical preparations, with mean recoveries of 101.13 +/- 0.85% and 100.77 +/- 0.79% in the case of PE-PT, applying batch and flow injection systems, respectively, and 100.75 +/- 0.85% and 100.79 +/- 0.77% in the case of PE-SiT for batch and flow injection systems, respectively. The electrodes exhibited good selectivity for PECl with respect to a large number of inorganic cations, sugars and amino acids.
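
    The reported calibration behaviour boils down to a Nernstian line: the electrode potential is linear in log10(concentration) with a slope near 57 mV per decade at 25 degrees C. A sketch with synthetic potentials, taking the paper's PE-PT slope as the assumed true value:

```python
# Near-Nernstian calibration and inversion for an ion-selective electrode.
import numpy as np

conc = np.logspace(-6, -2, 9)               # mol/L calibration standards
E0, slope_true = 400.0, 57.09               # mV and mV/decade (slope from paper)
rng = np.random.default_rng(5)
emf = E0 + slope_true * np.log10(conc) + rng.normal(0, 0.3, conc.size)

slope, intercept = np.polyfit(np.log10(conc), emf, 1)
print(f"calibration slope: {slope:.2f} mV/decade")

def concentration(e_mv):
    """Invert the calibration line for an unknown sample potential."""
    return 10 ** ((e_mv - intercept) / slope)

print(f"sample at {emf[4]:.1f} mV -> {concentration(emf[4]):.2e} M")
```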

  2. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  3. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution, on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  4. Applying Metrological Techniques to Satellite Fundamental Climate Data Records

    Science.gov (United States)

    Woolliams, Emma R.; Mittaz, Jonathan PD; Merchant, Christopher J.; Hunt, Samuel E.; Harris, Peter M.

    2018-02-01

    Quantifying long-term environmental variability, including climatic trends, requires decadal-scale time series of observations. The reliability of such trend analysis depends on the long-term stability of the data record and on understanding the sources of uncertainty in historic, current and future sensors. We give a brief overview of how metrological techniques can be applied to historical satellite data sets. In particular, we discuss the implications of error correlation at different spatial and temporal scales and the forms of such correlation, and consider how uncertainty is propagated with partial correlation. We give a form of the Law of Propagation of Uncertainties that considers the propagation of uncertainties associated with common errors to give the covariance associated with Earth observations in different spectral channels.
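
    The covariance form of uncertainty propagation the authors discuss can be sketched in a few lines: for y = f(x), the combined variance is J S Jᵀ, with S built from per-channel uncertainties and a correlation matrix. The two-channel mean and all numbers below are illustrative assumptions:

```python
# Law of Propagation of Uncertainties with partial error correlation.
import numpy as np

u = np.array([0.30, 0.40])             # K, per-channel uncertainties (assumed)

def combined_uncertainty(rho):
    corr = np.array([[1.0, rho], [rho, 1.0]])
    S = np.outer(u, u) * corr          # covariance matrix of the channels
    J = np.array([0.5, 0.5])           # Jacobian of the two-channel mean
    return np.sqrt(J @ S @ J)

for rho in (0.0, 0.5, 1.0):            # independent / partial / full correlation
    print(f"rho = {rho}: u_c = {combined_uncertainty(rho):.3f} K")
```

    Fully correlated (common) errors add linearly while independent errors add in quadrature, which is why the correlation structure matters for long-term trend stability.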

  5. Chemical vapor deposition: A technique for applying protective coatings

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, T.C. Sr.; Bowman, M.G.

    1979-01-01

    Chemical vapor deposition is discussed as a technique for applying coatings for materials protection in energy systems. The fundamentals of the process are emphasized in order to establish a basis for understanding the relative advantages and limitations of the technique. Several examples of the successful application of CVD coating are described. 31 refs., and 18 figs.

  6. Mainstreaming gender and promoting intersectionality in Papua New Guinea's health policy: a triangulated analysis applying data-mining and content analytic techniques.

    Science.gov (United States)

    Lamprell, G; Braithwaite, J

    2017-04-20

    Gender mainstreaming is an approach to policy and planning that emphasizes equality between the sexes. It is the stated policy for gender equity in Papua New Guinea's (PNG) health sector, as well as all other sectors, and is enshrined in the policies of its biggest aid givers. However, there is criticism that gender mainstreaming's application has too often been technocratic and lacking in conceptual clarity not only in PNG but elsewhere. In the health sector this is further exacerbated by a traditional bio-medical approach, which is often paternalistic and insufficiently patient- and family-centered. This study analyses the policy attitudes toward gender in PNG's health sector using both data-mining and a traditional, summative content analysis. Our results show that gender is rarely mentioned. When it is, it is most often mentioned in relation to programs such as maternity and childcare for women, and elsewhere is applied technocratically. For PNG to promote greater levels of equity, the focus should first be on conceptualizing gender in a way that is meaningful for Papuans, taking into account the diversity of experiences and setting. Second, there should be greater focus on activists and civil society groups as the stakeholders most likely to make a difference in gender equity.

  7. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria are considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems, regarded as alternatives. Expert's Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; however, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning
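
    The scoring arithmetic underlying such a ranking is a weighted sum: each system's defect severity per criterion is multiplied by the criterion weight and summed into a risk value. The weights and scores below are invented; the paper derived its weights from pairwise comparisons in Expert's Choice.

```python
# Weighted multi-criteria scoring of building systems (toy numbers).
import numpy as np

criteria = ["PC", "EA", "EO", "MC"]
weights = np.array([0.40, 0.25, 0.20, 0.15])      # assumed, sum to 1

systems = ["electrical", "roof", "ceiling"]
scores = np.array([[0.8, 0.7, 0.9, 0.6],          # defect severity in [0, 1]
                   [0.5, 0.6, 0.4, 0.7],
                   [0.3, 0.2, 0.3, 0.4]])

risk = scores @ weights                           # one risk value per system
for name, r in sorted(zip(systems, risk), key=lambda p: -p[1]):
    print(f"{name}: risk value {r:.3f}")
```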

  8. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
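
    A toy Monte Carlo run makes the point about output characteristics concrete: with inputs sampled over orders of magnitude, the output distribution is heavy-tailed, so the arithmetic mean sits far above the median, while the 90th percentile is a more robust summary. The model and distributions below are invented:

```python
# Monte Carlo uncertainty analysis with heavy-tailed inputs (toy model).
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
k = rng.lognormal(mean=np.log(1e-9), sigma=1.5, size=n)  # e.g. permeability
q = rng.lognormal(mean=np.log(1e3), sigma=0.5, size=n)   # e.g. source term

dose = q * k * 1e6                   # toy dose output, arbitrary units

print(f"arithmetic mean : {dose.mean():.3e}")
print(f"median          : {np.median(dose):.3e}")
print(f"90th percentile : {np.percentile(dose, 90):.3e}")
```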

  9. Conversation Analysis and Applied Linguistics.

    Science.gov (United States)

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers bibliographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  10. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs
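
    Independently of the AI implementation, the underlying event tree arithmetic is simple: a sequence's frequency is the initiating-event frequency times the branch probabilities along its path. The two-system tree below is a made-up example, not SQUIMP:

```python
# Basic event-tree sequence frequency calculation (invented numbers).
init_freq = 1e-2                     # initiating events per year (assumed)
p_fail = {"cooling": 1e-3, "containment": 1e-2}

sequences = {                        # True = the system fails on that branch
    "OK":            {"cooling": False, "containment": False},
    "release-small": {"cooling": True,  "containment": False},
    "release-large": {"cooling": True,  "containment": True},
}

for name, branches in sequences.items():
    p = init_freq
    for system, fails in branches.items():
        p *= p_fail[system] if fails else 1 - p_fail[system]
    print(f"{name}: {p:.3e} / year")
```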

  11. Satellite SAR interferometric techniques applied to emergency mapping

    Science.gov (United States)

    Stefanova Vassileva, Magdalena; Riccardi, Paolo; Lecci, Daniele; Giulio Tonolo, Fabio; Boccardo Boccardo, Piero; Chiesa, Giuliana; Angeluccetti, Irene

    2017-04-01

    This paper aims to investigate the capabilities of the currently available SAR interferometric algorithms in the field of emergency mapping. Several tests have been performed exploiting Copernicus Sentinel-1 data using the COTS software ENVI/SARscape 5.3. Emergency mapping can be defined as the "creation of maps, geo-information products and spatial analyses dedicated to providing situational awareness for emergency management and immediate crisis information for response by means of extraction of reference (pre-event) and crisis (post-event) geographic information/data from satellite or aerial imagery". The conventional differential SAR interferometric technique (DInSAR) and the two currently available multi-temporal SAR interferometric approaches, i.e. Permanent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS), have been applied to provide crisis information useful for emergency management activities. Depending on the Emergency Management phase considered, a distinction may be made between rapid mapping, i.e. the fast provision of geospatial data regarding the affected area for the immediate emergency response, and monitoring mapping, i.e. the detection of phenomena for risk prevention and mitigation activities. In order to evaluate the potential and limitations of the aforementioned SAR interferometric approaches for the specific rapid and monitoring mapping applications, five main factors have been taken into account: crisis information extracted, input data required, processing time and expected accuracy. The results highlight that DInSAR has the capacity to delineate areas affected by large and sudden deformations and fulfills most of the immediate response requirements. The main limiting factor of interferometry is the availability of suitable SAR acquisitions immediately after the event (e.g. the Sentinel-1 mission, characterized by a 6-day revisit time, may not always satisfy the immediate emergency request). PSI and SBAS techniques are suitable to produce

  12. [Technique and value of direct MR arthrography applying articular distraction].

    Science.gov (United States)

    Becce, Fabio; Wettstein, Michael; Guntern, Daniel; Mouhsine, Elyazid; Palhais, Nuno; Theumann, Nicolas

    2010-02-24

    Direct MR arthrography has a better diagnostic accuracy than MR imaging alone. However, contrast material is not always homogeneously distributed in the articular space. Lesions of cartilage surfaces or intra-articular soft tissues can thus be misdiagnosed. Concomitant application of axial traction during MR arthrography leads to articular distraction. This enables better distribution of contrast material in the joint and better delineation of intra-articular structures. Therefore, this technique improves detection of cartilage lesions. Moreover, the axial stress applied on articular structures may reveal lesions invisible on MR images without traction. Based on our clinical experience, we believe that this relatively unknown technique is promising and should be further developed.

  13. Techniques of remote sensing applied to the environmental analysis of part of an aquifer located in the São José dos Campos region, SP, Brazil.

    Science.gov (United States)

    Bressan, Mariana Affonseca; Dos Anjos, Célio Eustáquio

    2003-05-01

    Anthropogenic activity on the surface can modify and introduce new mechanisms of recharge of the groundwater system, modifying the rate, the frequency and the quality of recharge of underground waters. The understanding of these mechanisms and the correct evaluation of such modifications are fundamental in determining the vulnerability of groundwater to contamination. The groundwater flow of the South Paraíba Compartment, in the region of São José dos Campos, São Paulo, is directly related to structural features of the Taubaté Basin and, therefore, the analysis of its behaviour enhances the understanding of the tectonic structure. The methodology adopted for this work consists of pre-processing and processing of the satellite images, visual interpretation of HSI products, field work and data integration. The derivation of the main structural features was based on visual analysis of the texture elements of drainage and relief in sedimentary and crystalline rocks. Statistical analysis of the feature densities and of the metric-geometric relations between the analysed elements was conducted. The crystalline rocks on which the sediments were laid condition and control the structural arrangement of the sedimentary formations. The formation of the South Paraíba Graben is associated with Cenozoic distensive movements, which reactivated old zones of crustal weakness generated in previous cycles with normal-faulting characteristics. The environmental analysis is based on the integration of the existing methodology for characterising vulnerability to a universal pollutant with the fracture-zone density. The digital integration was processed using a GIS (Geographic Information System) to delineate five defined vulnerability classes. The hydrogeological settings were analysed in each thematic map and, using fuzzy logic, an index for each vulnerability class was compiled. Evidence maps could be combined in a series of steps using map algebra.
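
    The fuzzy-logic map-algebra step can be sketched as follows; the layer names, the fuzzy-gamma operator and the class breaks are illustrative assumptions, not the study's actual data or weights:

        # Illustrative sketch: combine two 0-1 "fuzzy membership" evidence rasters
        # (e.g. rescaled vulnerability and fracture-density maps) with the
        # fuzzy-gamma operator, then slice the index into five classes.
        import numpy as np

        rng = np.random.default_rng(0)
        vulnerability = rng.random((100, 100))      # synthetic membership raster
        fracture_density = rng.random((100, 100))   # synthetic membership raster

        product = vulnerability * fracture_density                    # fuzzy product
        algebraic_sum = 1 - (1 - vulnerability) * (1 - fracture_density)
        gamma = 0.7                                  # blends the two extremes
        index = algebraic_sum**gamma * product**(1 - gamma)

        # Five vulnerability classes, here simply by index quantiles:
        classes = np.digitize(index, np.quantile(index, [0.2, 0.4, 0.6, 0.8]))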

  14. Applying CFD in the analysis of heavy oil - water two-phase flow in joints by using core annular flow technique

    Directory of Open Access Journals (Sweden)

    T Andrade

    2016-09-01

    Full Text Available In the oil industry, multiphase flow occurs throughout the production chain, from the reservoir rock to the separation units, through the production column, risers and pipelines. Along the whole process the fluid flows through horizontal pipes, curves, connections and T joints. Today, the technological and economic challenges facing the oil industry are related to heavy oil transportation, owing to unfavourable characteristics such as high viscosity and high density, which provoke a high pressure drop along the flow. The core-flow technique consists in injecting small amounts of water into the pipe to form a ring of water between the oil and the pipe wall, which reduces the frictional pressure drop along the flow. This paper aims to model and simulate the transient two-phase flow (water-heavy oil) in a horizontal pipe and a T joint by numerical simulation using the software ANSYS CFX® Release 12.0. Results of the pressure and volumetric fraction distributions inside the horizontal pipe and T joint are presented and analysed.
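
    A back-of-envelope calculation (idealized laminar Hagen-Poiseuille flow, not the paper's ANSYS CFX model) shows why a lubricating water ring is attractive: the laminar pressure gradient scales linearly with viscosity, so wall friction set by water rather than heavy oil is orders of magnitude lower. All values below are illustrative:

        # Motivating estimate only; a real water annulus at this rate is turbulent
        # and the interface is unstable, which is exactly why CFD is needed.
        import math

        def dp_dx(q, mu, radius):
            """Laminar pressure gradient [Pa/m] for flow rate q [m^3/s] in a pipe."""
            return 8 * mu * q / (math.pi * radius**4)

        q, r = 5e-4, 0.05              # flow rate and pipe radius (illustrative)
        mu_oil, mu_water = 10.0, 1e-3  # heavy oil vs water viscosity [Pa.s]

        print("oil alone   :", round(dp_dx(q, mu_oil, r)), "Pa/m")
        print("water alone :", round(dp_dx(q, mu_water, r), 3), "Pa/m")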

  15. Applying of USB interface technique in nuclear spectrum acquisition system

    International Nuclear Information System (INIS)

    Zhou Jianbin; Huang Jinhua

    2004-01-01

    This paper introduces the application of the USB interface technique and the construction of a nuclear spectrum acquisition system via a PC's USB port. The authors chose the USB100 module and the W77E58 microcontroller for the key work. The USB interface technique is easy to apply when the USB100 module is used: the module can be treated as a common I/O component by the microcontroller, and as a communication (COM) port when connected to the PC's USB interface. The PC software is easy to modify for the new system with the USB100 module, allowing a smooth migration from the ISA and RS232 buses to the USB bus. (authors)

  16. Analysis of a finite-difference and a Galerkin technique applied to the simulation of advection and diffusion of air pollutants from a line source

    International Nuclear Information System (INIS)

    Runca, E.; Melli, P.; Sardei, F.

    1985-01-01

    A finite-difference scheme and a Galerkin scheme are compared with respect to a very accurate solution describing time-dependent advection and diffusion of air pollutants from a line source in an atmosphere vertically stratified and limited by an inversion layer. The accurate solution was achieved by applying the finite-difference scheme on a very refined grid with a very small time step. The grid size and time step were defined according to stability and accuracy criteria discussed in the text. It is found that for the problem considered the two methods can be considered equally accurate. However, the Galerkin method gives a better approximation in the vicinity of the source. This was assumed to be partly due to the different way the source term is taken into account in the two methods. Improvement of the accuracy of the finite-difference scheme was achieved by approximating, at every step, the contribution of the source term by a Gaussian puff moving and diffusing with the velocity and diffusivity of the source location, instead of utilizing a stepwise function for the numerical approximation of the delta function representing the source term
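
    A minimal sketch of the finite-difference ingredients discussed here (first-order upwind advection, explicit diffusion, and a source term smeared as a narrow Gaussian instead of a stepwise delta) is given below; all parameter values are illustrative, not those of the paper:

        import numpy as np

        nx, dx, dt = 200, 10.0, 0.5            # grid spacing [m], time step [s]
        u, K = 2.0, 5.0                        # wind speed [m/s], diffusivity [m^2/s]
        x = np.arange(nx) * dx

        # Line source at x = 200 m, approximated as a narrow Gaussian "puff":
        source = np.exp(-0.5 * ((x - 200.0) / 15.0) ** 2)
        source /= source.sum() * dx            # normalize to unit emission rate

        c = np.zeros(nx)
        for _ in range(2000):                  # u*dt/dx = 0.1, K*dt/dx^2 = 0.025: stable
            adv = -u * (c - np.roll(c, 1)) / dx                       # upwind (u > 0)
            dif = K * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
            c += dt * (adv + dif) + dt * source
            c[0] = 0.0                         # clean inflow boundary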

  17. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant: composition of macro components and amounts of organic and inorganic impurities; coolant during and after operation: determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); control of systems for purifying and regenerating the coolant after use: dissolved pressurization gases; detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation: tests to determine the potential formation of films; corrosion of structural elements and canning materials; health and safety: toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  18. Diagonal ordering operation technique applied to Morse oscillator

    Energy Technology Data Exchange (ETDEWEB)

    Popov, Dušan, E-mail: dusan_popov@yahoo.co.uk [Politehnica University Timisoara, Department of Physical Foundations of Engineering, Bd. V. Parvan No. 2, 300223 Timisoara (Romania); Dong, Shi-Hai [CIDETEC, Instituto Politecnico Nacional, Unidad Profesional Adolfo Lopez Mateos, Mexico D.F. 07700 (Mexico); Popov, Miodrag [Politehnica University Timisoara, Department of Steel Structures and Building Mechanics, Traian Lalescu Street, No. 2/A, 300223 Timisoara (Romania)

    2015-11-15

    We generalize the technique known as integration within a normally ordered product (IWOP) of operators, which refers to the creation and annihilation operators of the harmonic oscillator coherent states, to a new operatorial approach, the diagonal ordering operation technique (DOOT), for calculations connected with the normally ordered product of the generalized creation and annihilation operators that generate the generalized hypergeometric coherent states. We apply this technique to the coherent states of the Morse oscillator, including the mixed (thermal) state case, and recover the well-known results achieved by other methods in the corresponding coherent state representation. In the last section we also construct the coherent states for the continuous dynamics of the Morse oscillator by using two new methods: the discrete-continuous limit, and the solution of a finite difference equation. Finally, we construct the coherent states corresponding to the whole Morse spectrum (discrete plus continuous) and demonstrate their properties according to Klauder's prescriptions.
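
    For orientation, the flavour of such ordering techniques can be seen in the standard harmonic-oscillator identity (textbook material, not the Morse-oscillator generalization developed in the paper), where integration is carried out inside the ordering symbols : : with the operators treated as c-numbers:

        |z\rangle\langle z| \;=\; :\exp\!\left[-(a^{\dagger}-z^{*})(a-z)\right]:\,,
        \qquad
        \int \frac{d^{2}z}{\pi}\,|z\rangle\langle z|
        \;=\; :\exp\!\left(a^{\dagger}a-a^{\dagger}a\right):\;=\;1,

    so the completeness relation reduces to an ordinary Gaussian integral; DOOT plays the analogous role for the diagonal representation built from the generalized creation and annihilation operators.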

  19. Applying recursive numerical integration techniques for solving high dimensional integrals

    International Nuclear Information System (INIS)

    Ammon, Andreas; Genz, Alan; Hartung, Tobias; Jansen, Karl; Volmer, Julia; Leoevey, Hernan

    2016-11-01

    The error scaling for Markov-Chain Monte Carlo (MCMC) techniques with N samples behaves like 1/√(N). This scaling often makes it very time intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.
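
    The recursion can be made concrete for an integrand with nearest-neighbour couplings, where the iterated one-dimensional Gauss rule becomes a chain of matrix-vector products; the kernel and dimension below are illustrative, not the lattice-QCD observables of the paper:

        # Sketch of recursive numerical integration for
        # I = int_[0,1]^d prod_i k(x_i, x_{i+1}) dx, versus crude Monte Carlo.
        import numpy as np

        def rni(kernel, d, n):
            t, w = np.polynomial.legendre.leggauss(n)
            t = 0.5 * (t + 1.0); w = 0.5 * w        # map nodes from [-1,1] to [0,1]
            M = kernel(t[:, None], t[None, :]) * w  # one quadrature per dimension
            v = np.ones(len(t))
            for _ in range(d - 1):                  # recursion over the dimensions
                v = M @ v
            return w @ v

        k = lambda x, y: np.exp(np.cos(2 * np.pi * (x - y)))
        d = 10
        quad = rni(k, d, 64)                        # converges ~exponentially in n

        # Crude Monte Carlo with N samples converges only like 1/sqrt(N):
        rng = np.random.default_rng(1)
        X = rng.random((100_000, d))
        mc = np.exp(np.cos(2 * np.pi * (X[:, :-1] - X[:, 1:])).sum(axis=1)).mean()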

  20. Applying recursive numerical integration techniques for solving high dimensional integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [IVU Traffic Technologies AG, Berlin (Germany); Genz, Alan [Washington State Univ., Pullman, WA (United States). Dept. of Mathematics; Hartung, Tobias [King' s College, London (United Kingdom). Dept. of Mathematics; Jansen, Karl; Volmer, Julia [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leoevey, Hernan [Humboldt Univ. Berlin (Germany). Inst. fuer Mathematik

    2016-11-15

    The error scaling for Markov-Chain Monte Carlo (MCMC) techniques with N samples behaves like 1/√(N). This scaling often makes it very time intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.

  1. Ion backscattering techniques applied in materials science research

    International Nuclear Information System (INIS)

    Sood, D.K.

    1978-01-01

    The applications of the Ion Backscattering Technique (IBT) to materials analysis have expanded rapidly during the last decade. It is now regarded as an analysis tool indispensable for a versatile materials research program. The technique consists of simply shooting a beam of monoenergetic ions (usually 4He+ ions at about 2 MeV) onto a target and measuring their energy distribution after backscattering at a fixed angle. Simple Rutherford scattering analysis of the backscattered ion spectrum yields information on the mass, the absolute amount and the depth profile of elements present up to a few microns below the target surface. The technique is nondestructive, quick and quantitative, and it is the only known method of analysis which gives quantitative results without recourse to calibration standards. Its major limitations are the inability to separate elements of similar mass and a complete absence of chemical-binding information. A typical experimental set-up and spectrum analysis are described. Examples, some of them based on work at the Bhabha Atomic Research Centre, Bombay, are given to illustrate the applications of this technique to semiconductor technology, thin film materials science and nuclear energy materials. Limitations of IBT are illustrated and a few remedies to partly overcome these limitations are presented. (auth.)
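
    The mass separation rests on the textbook elastic kinematic factor K (a standard formula, not code from the paper); for 2 MeV 4He+ ions backscattered at 170 degrees:

        import math

        def kinematic_factor(m1, m2, theta_deg):
            """K = E_scattered / E_incident for elastic scattering, lab frame."""
            th = math.radians(theta_deg)
            root = math.sqrt(m2**2 - (m1 * math.sin(th))**2)
            return ((root + m1 * math.cos(th)) / (m1 + m2))**2

        E0 = 2.0  # MeV incident 4He+ energy
        for target, m2 in [("Si", 28.0), ("Fe", 56.0), ("Au", 197.0)]:
            print(target, round(E0 * kinematic_factor(4.0, m2, 170.0), 3), "MeV")
        # Heavier targets return the ion with more energy, which is what lets the
        # backscattered spectrum resolve target masses.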

  2. Volcanic Monitoring Techniques Applied to Controlled Fragmentation Experiments

    Science.gov (United States)

    Kueppers, U.; Alatorre-Ibarguengoitia, M. A.; Hort, M. K.; Kremers, S.; Meier, K.; Scharff, L.; Scheu, B.; Taddeucci, J.; Dingwell, D. B.

    2010-12-01

    ejection and that the evaluated results were mostly in good agreement. We will discuss the technical difficulties encountered, e.g. the temporal synchronisation of the different techniques. Furthermore, the internal data management of the DR at present prevents continuous recording, and only a limited number of snapshots is stored. Nonetheless, in at least three experiments the onset of particle ejection was measured by all the different techniques and gave coherent results of up to 100 m/s. This is a very encouraging result and of paramount importance, as it proves the applicability of these independent methods to volcano monitoring. Each method by itself may enhance our understanding of the pressurisation state of a volcano, an essential factor in ballistic hazard evaluation and eruption energy estimation. Technical adaptations of the DR will overcome the encountered problems and allow a more refined data analysis during the next campaign.

  3. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of the literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of the literature can be done implicitly while reading an article, as in reading for personal interest, or it can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  4. Applying Supervised Opinion Mining Techniques on Online User Reviews

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2012-01-01

    Full Text Available In recent years, the spectacular development of web technologies has led to an enormous quantity of user-generated information in online systems. This large amount of information on web platforms makes them viable as data sources for applications based on opinion mining and sentiment analysis. The paper proposes an algorithm for detecting sentiment in movie user reviews, based on a naive Bayes classifier. We analyse the opinion mining domain, the techniques used in sentiment analysis and their applicability. We implemented the proposed algorithm, tested its performance, and suggested directions of development.
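
    A toy version of such a classifier (four invented reviews; the paper's corpus and exact features are not reproduced) fits in a few lines:

        # Minimal multinomial naive Bayes with Laplace smoothing, in the spirit
        # of the paper's movie-review polarity classifier.
        import math
        from collections import Counter

        train = [("a wonderful moving film", "pos"),
                 ("brilliant acting and story", "pos"),
                 ("dull and predictable plot", "neg"),
                 ("a boring waste of time", "neg")]

        docs = {c: Counter() for c in ("pos", "neg")}
        prior = Counter()
        for text, label in train:
            prior[label] += 1
            docs[label].update(text.split())
        vocab = {w for c in docs for w in docs[c]}

        def classify(text):
            scores = {}
            for c in docs:
                logp = math.log(prior[c] / sum(prior.values()))
                total = sum(docs[c].values())
                for w in text.split():  # smoothing keeps unseen words finite
                    logp += math.log((docs[c][w] + 1) / (total + len(vocab)))
                scores[c] = logp
            return max(scores, key=scores.get)

        print(classify("a brilliant moving story"))  # -> 'pos'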

  5. Bioremediation techniques applied to aqueous media contaminated with mercury.

    Science.gov (United States)

    Velásquez-Riaño, Möritz; Benavides-Otaya, Holman D

    2016-12-01

    In recent years, the environmental and human health impacts of mercury contamination have driven the search for alternative, eco-efficient techniques different from the traditional physicochemical methods for treating this metal. One of these alternative processes is bioremediation. A comprehensive analysis of the different variables that can affect this process is presented. It focuses on determining the effectiveness of different techniques of bioremediation, with a specific consideration of three variables: the removal percentage, time needed for bioremediation and initial concentration of mercury to be treated in an aqueous medium.

  6. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
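
    One classic multivariate technique for exactly this signal/background separation is the Fisher linear discriminant; the sketch below uses synthetic two-feature events with a deliberately more copious background:

        import numpy as np

        rng = np.random.default_rng(7)
        signal = rng.normal([1.0, 0.5], 0.8, size=(5_000, 2))
        background = rng.normal([0.0, 0.0], 1.0, size=(50_000, 2))  # more copious

        mean_s, mean_b = signal.mean(axis=0), background.mean(axis=0)
        cov = np.cov(signal.T) + np.cov(background.T)   # within-class scatter
        w = np.linalg.solve(cov, mean_s - mean_b)       # Fisher direction

        t_s, t_b = signal @ w, background @ w
        cut = 0.5 * (t_s.mean() + t_b.mean())           # an illustrative working point
        print(f"signal efficiency = {(t_s > cut).mean():.2f}, "
              f"background acceptance = {(t_b > cut).mean():.3f}")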

  7. Three-dimensional integrated CAE system applying computer graphic technique

    International Nuclear Information System (INIS)

    Kato, Toshisada; Tanaka, Kazuo; Akitomo, Norio; Obata, Tokayasu.

    1991-01-01

    A three-dimensional CAE system for nuclear power plant design is presented. This system utilizes high-speed computer graphic techniques for the plant design review, and an integrated engineering database for handling the large amount of nuclear power plant engineering data in a unified data format. Applying this system makes it possible to construct a nuclear power plant using only computer data from the basic design phase to the manufacturing phase, and it increases the productivity and reliability of the nuclear power plants. (author)

  8. Essentials of applied dynamic analysis

    CERN Document Server

    Jia, Junbo

    2014-01-01

    This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate the understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Topics of special interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in this area.

  9. Applied systems analysis. No. 22

    International Nuclear Information System (INIS)

    1980-12-01

    Based on a detailed analysis of demand in the Cologne/Frankfurt area, the quantities of the system's products that, considering technical conditions and entrepreneurial aspects, appeared saleable at cost parity with competing energy supplies were ascertained for this region. Based on these data, the technical components of the system, the site and the piping were fixed, and first costs and operating costs were determined. To judge the economics, the key figures of net present value, internal rate of return and cost recovery rate were determined from the cost difference between the nuclear long-distance energy system and alternative facilities. Furthermore, specific production costs, associated prices and contribution margins are presented for each product. (orig.) [de]
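
    The economic key figures named here, net present value and internal rate of return, can be computed as in the following sketch (the cash-flow series is hypothetical; the study's cost data are not reproduced):

        def npv(rate, cashflows):
            """Net present value of a cash-flow series, period 0 first."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-8):
            """Bisection on npv(rate) = 0; assumes exactly one sign change."""
            while hi - lo > tol:
                mid = (lo + hi) / 2
                if npv(mid, cashflows) > 0:
                    lo = mid
                else:
                    hi = mid
            return (lo + hi) / 2

        flows = [-1000.0] + [180.0] * 10   # investment, then ten annual returns
        print(round(npv(0.05, flows), 1), round(irr(flows), 4))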

  10. Applying field mapping refractive beam shapers to improve holographic techniques

    Science.gov (United States)

    Laskin, Alexander; Williams, Gavin; McWilliam, Richard; Laskin, Vadim

    2012-03-01

    Performance of various holographic techniques can be substantially improved by homogenizing the intensity profile of the laser beam using beam-shaping optics, for example the achromatic field mapping refractive beam shapers such as the πShaper. The operational principle of these devices presumes the transformation of the laser beam intensity from a Gaussian to a flat-top profile with high flatness of the output wavefront, preservation of beam consistency, a collimated output beam of low divergence, high transmittance, extended depth of field and negligible residual wave aberration; the achromatic design provides the capability to work with several laser sources of different wavelengths simultaneously. Applying these beam shapers brings serious benefits to Spatial Light Modulator (SLM) based techniques like Computer Generated Holography or Dot-Matrix mastering of security holograms, since uniform illumination of an SLM allows simplifying the mathematical calculations and increasing the predictability and reliability of the imaging results. Another example is multicolour Denisyuk holography, where the achromatic πShaper provides uniform illumination of a field at various wavelengths simultaneously. This paper describes some design basics of the field mapping refractive beam shapers and optical layouts for applying them in holographic systems. Examples of real implementations and experimental results are presented as well.

  11. Rare event techniques applied in the Rasmussen study

    International Nuclear Information System (INIS)

    Vesely, W.E.

    1977-01-01

    The Rasmussen Study estimated public risks from commercial nuclear power plant accidents, and therefore the statistics of rare events had to be treated. Two types of rare events were specifically handled, those rare events which were probabilistically rare events and those which were statistically rare events. Four techniques were used to estimate probabilities of rare events. These techniques were aggregating data samples, discretizing ''continuous'' events, extrapolating from minor to catastrophic severities, and decomposing events using event trees and fault trees. In aggregating or combining data the goal was to enlarge the data sample so that the rare event was no longer rare, i.e., so that the enlarged data sample contained one or more occurrences of the event of interest. This aggregation gave rise to random variable treatments of failure rates, occurrence frequencies, and other characteristics estimated from data. This random variable treatment can be interpreted as being comparable to an empirical Bayes technique or a Bayesian technique. In the discretizing event technique, events of a detailed nature were grouped together into a grosser event for purposes of analysis as well as for data collection. The treatment of data characteristics as random variables helped to account for the uncertainties arising from this discretizing. In the severity extrapolation technique a severity variable was associated with each event occurrence for the purpose of predicting probabilities of catastrophic occurrences. Tail behaviors of distributions therefore needed to be considered. Finally, event trees and fault trees were used to express accident occurrences and system failures in terms of more basic events for which data existed. Common mode failures and general dependencies therefore needed to be treated. 2 figures

  12. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  13. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  14. Positron Plasma Control Techniques Applied to Studies of Cold Antihydrogen

    CERN Document Server

    Funakoshi, Ryo

    2003-01-01

    In 2002, two experiments at CERN succeeded in producing cold antihydrogen atoms, first ATHENA and subsequently ATRAP. Following these results, it is now feasible to use antihydrogen to study the properties of antimatter. In the ATHENA experiment, cold antihydrogen atoms are produced by mixing large amounts of antiprotons and positrons in a nested Penning trap. The complicated behaviour of the charged particles is controlled and monitored by plasma manipulation techniques. The antihydrogen events are studied using position-sensitive detectors, and the evidence for production of antihydrogen atoms is separated out with the help of analysis software. The first section of this thesis covers the first production of cold antihydrogen; the second covers further studies of cold antihydrogen performed using these plasma control techniques.

  15. Applying AI techniques to improve alarm display effectiveness

    International Nuclear Information System (INIS)

    Gross, J.M.; Birrer, S.A.; Crosberg, D.R.

    1987-01-01

    The Alarm Filtering System (AFS) addresses the problem of information overload in a control room during abnormal operations. Since operators can miss vital information during these periods, systems which emphasize important messages are beneficial. AFS uses the artificial intelligence (AI) technique of object-oriented programming to filter and dynamically prioritize alarm messages. When an alarm's status changes, AFS determines the relative importance of that change according to the current process state. AFS bases that relative importance on the relationships the newly changed alarm has with other activated alarms. Evaluations of alarm importance take place without regard to the activation sequence of the alarm signals. The United States Department of Energy has applied for a patent on the approach used in this software. The approach was originally developed by EG&G Idaho for a nuclear reactor control room
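
    A hedged sketch of the object-oriented idea, demoting alarms that are already explained by an active upstream alarm regardless of arrival order, might look as follows (class, rule and alarm names are invented, not EG&G's implementation):

        class Alarm:
            def __init__(self, name, base_priority, parents=()):
                self.name, self.base = name, base_priority
                self.parents = parents   # alarms that, when active, explain this one
                self.active = False

            def priority(self, active_names):
                # Demote an alarm already explained by an active upstream alarm,
                # independently of the order in which the signals arrived.
                if any(p.name in active_names for p in self.parents):
                    return self.base - 10
                return self.base

        pump_trip = Alarm("pump trip", 50)
        low_flow = Alarm("low flow", 40, parents=(pump_trip,))
        alarms = [pump_trip, low_flow]
        for a in alarms:
            a.active = True
        active = {a.name for a in alarms if a.active}
        ranked = sorted((a for a in alarms if a.active),
                        key=lambda a: -a.priority(active))
        print([(a.name, a.priority(active)) for a in ranked])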

  16. Airflow measurement techniques applied to radon mitigation problems

    International Nuclear Information System (INIS)

    Harrje, D.T.; Gadsby, K.J.

    1989-01-01

    During the past decade a multitude of diagnostic procedures associated with the evaluation of air infiltration and air leakage sites have been developed. The spirit of international cooperation and exchange of ideas within the AIC-AIVC conferences has greatly facilitated the adoption and use of these measurement techniques in the countries participating in Annex V. But wide application of such diagnostic methods is not limited to air infiltration alone. The subject of this paper is the ways to evaluate and improve radon reduction in buildings using diagnostic methods directly related to developments familiar to the AIVC. Radon problems are certainly not unique to the United States, and the methods described here have to a degree been applied by researchers of other countries faced with similar problems. The radon problem involves more than a harmful pollutant of the living spaces of our buildings -- it also involves the energy to operate radon removal equipment and the resulting loss of interior conditioned air. The techniques used for air infiltration evaluation will be shown to be very useful in dealing with the radon mitigation challenge. 10 refs., 7 figs., 1 tab

  17. Applying advanced digital signal processing techniques in industrial radioisotopes applications

    International Nuclear Information System (INIS)

    Mahmoud, H.K.A.E.

    2012-01-01

    Radioisotopes can be used to obtain signals or images that reveal information about the interior of industrial systems. The main problems in using these techniques are the difficulty of identifying the obtained signals or images and the need for skilled experts to interpret the output data. At present this interpretation is performed mainly manually, depending heavily on the skills and experience of trained operators; the process is time consuming and the results typically suffer from inconsistency and errors. The objective of the thesis is to apply advanced digital signal processing techniques to improve the treatment and interpretation of the output data from different Industrial Radioisotope Applications (IRA). The thesis focuses on two IRA: Residence Time Distribution (RTD) measurement and defect inspection of welded pipes using a gamma source (gamma radiography). In the RTD measurement application, the thesis presents methods for signal pre-processing and modeling of RTD signals. Simulation results are presented for two case studies: a laboratory experiment measuring the RTD in a water flow rig, and an experiment measuring the RTD in a phosphate production unit. The thesis proposes an approach for RTD signal identification in the presence of noise. In this approach, after signal processing, Mel Frequency Cepstral Coefficients (MFCCs) and polynomial coefficients are extracted from the processed signal or from one of its transforms. The Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT) and Discrete Sine Transform (DST) have been tested and compared for efficient feature extraction, and neural networks have been used for matching of the extracted features. Furthermore, the Power Density Spectrum (PDS) of the RTD signal has been also used instead of the discrete
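
    One of the feature-extraction steps named above, cepstral-style DCT coefficients of an RTD curve, can be sketched as follows (the RTD signal is synthetic; the thesis' data are not reproduced):

        import numpy as np
        from scipy.fft import dct

        t = np.linspace(0, 60, 600)                      # seconds
        rtd = t * np.exp(-t / 8.0)                       # idealized residence-time curve
        rtd += np.random.default_rng(0).normal(0, 0.02, t.size)  # measurement noise

        spectrum = np.abs(np.fft.rfft(rtd))
        features = dct(np.log(spectrum + 1e-9), norm="ortho")[:12]  # compact descriptor
        # 'features' would then feed a neural-network matcher, as described above.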

  18. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  19. [Molecular techniques applied in species identification of Toxocara].

    Science.gov (United States)

    Fogt, Renata

    2006-01-01

    Toxocarosis is still an important and current problem in human medicine. It can manifest as visceral (VLM), ocular (OLM) or covert (CT) larva migrans syndromes. The complicated life cycle of Toxocara, the lack of easy and practical methods for species differentiation of the adult nematodes, and difficulties in recognizing the infection in definitive hosts hamper the fight against the infection. Although studies on human toxocarosis have continued for over 50 years, there is no conclusive answer as to which of the species, T. canis or T. cati, constitutes the greater risk of transmission of the nematode to man. Neither serological examinations of blood nor microscopic observations of the morphological features of the nematode give a satisfactory answer to this question. Since the 1990s, molecular methods have been developed for species identification and have become useful tools widely applied in parasitological diagnosis. This paper surveys the methods of DNA analysis used for the identification of Toxocara species. The review may be helpful for researchers focused on Toxocara and toxocarosis as well as on the detection of new species. The following techniques are described: PCR (Polymerase Chain Reaction), RFLP (Restriction Fragment Length Polymorphism), RAPD (Random Amplified Polymorphic DNA) and SSCP (Single Strand Conformation Polymorphism).

  20. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    Magnetic jack type Control Rod Drive Mechanisms (CRDM) for pressurized water reactor (PWR) plants operate control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition, plant, etc. In the existing motion analysis there is an issue of analysis accuracy when a single analysis technique is applied to all plant conditions, plants, etc. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
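
    A minimal sketch of the machine-learning step (scikit-learn's Random Forest on synthetic waveform features; the plant data are proprietary and the feature set is an assumption) is:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(3)
        X = rng.normal(size=(500, 4))      # e.g. coil-current waveform features
        y = 20.0 + 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(0, 0.3, 500)  # closed time [ms]

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X[:400], y[:400])
        pred = model.predict(X[400:])
        print("mean abs error [ms]:", np.abs(pred - y[400:]).mean().round(3))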

  1. Concept analysis of culture applied to nursing.

    Science.gov (United States)

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  2. Non destructive assay techniques applied to nuclear materials

    International Nuclear Information System (INIS)

    Gavron, A.

    2001-01-01

    Nondestructive assay is a suite of techniques that has matured and become precise, easily implementable, and remotely usable. These techniques provide elaborate safeguards of nuclear material by providing the necessary information for materials accounting. NDA techniques are ubiquitous, reliable, essentially tamper proof, and simple to use. They make the world a safer place to live in, and they make nuclear energy viable. (author)

  3. Caldwell University's Department of Applied Behavior Analysis.

    Science.gov (United States)

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis.

  4. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  5. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and gamma radiations, measurements are obtained directly from a large volume of sample (3-30 kg). Gamma-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content in shredded sugar cane. (U.K.)

  6. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    Science.gov (United States)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets, with software such as ESRI's ArcPad, to produce maps and figures for a final analysis and report. Handwritten field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large-scale data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and the Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond a standard database service. Further improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at the University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real time and improve collaboration, as both parties have the same interactive view of the data.

  7. Building an applied activation analysis centre

    International Nuclear Information System (INIS)

    Bartosek, J.; Kasparec, I.; Masek, J.

    1972-01-01

    Requirements are defined and all available background material is reported and discussed for the building up of a centre of applied activation analysis in Czechoslovakia. A detailed analysis of potential users and the centre's envisaged availability is also presented as part of the submitted study. A brief economic analysis is annexed. The study covers the situation up to the end of 1972. (J.K.)

  8. Digital prototyping technique applied for redesigning plastic products

    Science.gov (United States)

    Pop, A.; Andrei, A.

    2015-11-01

    After products are on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology like Digital Prototyping in industry is an effective way to produce new products. This paper seeks to demonstrate and highlight the effectiveness of the concept of Digital Prototyping, both in reducing the design time of a new product and in reducing the costs required for implementing this step. The results of this paper show that using Digital Prototyping techniques to design a new product from an existing mould available on the market offers a significant manufacturing time and cost reduction. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (design of the 3D model, simulation of the structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be carried out by one skilled engineer very quickly and effectively.

  9. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, firstly, to apply two interesting functional analysis techniques to the design of supervisory systems for complex processes and, secondly, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to an example process, a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of applying functional analysis to the design of a 'human-centered' supervisory system. Then the basic principles of the two techniques applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed

  10. Photoacoustic technique applied to the study of skin and leather

    International Nuclear Information System (INIS)

    Vargas, M.; Varela, J.; Hernandez, L.; Gonzalez, A.

    1998-01-01

    In this paper the photoacoustic technique is applied to bull skin for the determination of thermal and optical properties as a function of the steps of the tanning process. Our results show that the photoacoustic technique is sensitive to physical changes in this kind of material due to the tanning process

  11. NEW TECHNIQUES APPLIED IN ECONOMICS. ARTIFICIAL NEURAL NETWORK

    Directory of Open Access Journals (Sweden)

    Constantin Ilie

    2009-05-01

    Full Text Available The present paper aims to inform the public about the use of new techniques for the modelling, simulation and forecasting of systems from different fields of activity. One of these techniques is the Artificial Neural Network, one of the artificial intelligence techniques.

  12. Biomechanical study of the funnel technique applied in thoracic ...

    African Journals Online (AJOL)

    of vertebra was made for injury model of anterior and central column ... data were collected to eliminate creep and relaxation of soft tissues in ... [Figure 3 caption: pullout strength curves for the Magerl technique (A) and the funnel technique (B)]

  13. X-ray fluorescence spectrometry applied to soil analysis

    International Nuclear Information System (INIS)

    Salvador, Vera Lucia Ribeiro; Sato, Ivone Mulako; Scapin Junior, Wilson Santo; Scapin, Marcos Antonio; Imakima, Kengo

    1997-01-01

    This paper studies X-ray fluorescence spectrometry applied to soil analysis. A comparative study of the WD-XRFS and ED-XRFS techniques was carried out using the following soil samples: SL-1, SOIL-7 and marine sediment SD-M-2/TM from the IAEA, and clay JG-1a from the Geological Survey of Japan (GSJ)

  14. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone...... and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques....

  15. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  16. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    Science.gov (United States)

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  17. Applying decision-making techniques to Civil Engineering Projects

    Directory of Open Access Journals (Sweden)

    Fam F. Abdel-malak

    2017-12-01

    Full Text Available Multi-Criteria Decision-Making (MCDM) techniques are found to be useful tools in project managers' hands for overcoming decision-making (DM) problems in Civil Engineering Projects (CEPs). The main contribution of this paper is selecting and studying popular MCDM techniques that use different and wide ranges of data types in CEPs. A detailed study, including the advantages and pitfalls of using the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order of Preference by Similarity to Ideal Solution (Fuzzy TOPSIS), is introduced. These two techniques were selected in order to form a package that covers most available data types in CEPs. The results indicated that AHP has a structure which simplifies complicated problems, while Fuzzy TOPSIS uses the advantages of linguistic variables to solve the issue of undocumented data and ill-defined problems. Furthermore, AHP is a simple technique that depends on pairwise comparisons of factors and natural attributes, and it is preferable for widely spread hierarchies. On the other hand, Fuzzy TOPSIS needs more information but works well for one-tier decision trees and shows more flexibility in fuzzy environments. The two techniques can be integrated and combined in a new module to support most of the decisions required in CEPs. Keywords: Decision-making, AHP, Fuzzy TOPSIS, CBA, Civil Engineering Projects
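
    The AHP step can be illustrated with a small pairwise-comparison matrix; the criteria, judgments and geometric-mean weighting below are an illustrative sketch, not the paper's case study:

        import numpy as np

        # Cost vs duration vs quality, pairwise judgments on Saaty's 1-9 scale.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 3.0],
                      [1/5, 1/3, 1.0]])

        w = np.prod(A, axis=1) ** (1 / A.shape[0])   # geometric-mean method
        w /= w.sum()                                 # priority vector

        lam = (A @ w / w).mean()                     # principal-eigenvalue estimate
        n = A.shape[0]
        CI = (lam - n) / (n - 1)
        CR = CI / 0.58                               # Saaty's random index for n = 3
        print(w.round(3), "CR =", round(CR, 3))      # CR < 0.1: acceptably consistent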

  18. Applied surface analysis of metal materials

    International Nuclear Information System (INIS)

    Weiss, Z.

    1987-01-01

    The applications of surface analytical techniques to the solution of technological problems in metallurgy and engineering are reviewed. Some important application areas, such as corrosion, grain boundary segregation and metallurgical coatings, are presented together with the specific requirements for the type of information that is necessary for solving particular problems. The techniques discussed include: electron spectroscopies (Auger Electron Spectroscopy, Electron Spectroscopy for Chemical Analysis), ion spectroscopies (Secondary Ion Mass Spectrometry, Ion Scattering Spectroscopy), Rutherford Back-Scattering, nuclear reaction analysis, optical methods (Glow Discharge Optical Emission Spectrometry), ellipsometry, infrared and Raman spectroscopy, Moessbauer spectroscopy and methods of consumptive depth profile analysis. Principles and analytical features of these methods are demonstrated and examples of their applications to metallurgy are taken from recent literature. (author). 4 figs., 2 tabs., 112 refs

  19. Object oriented programming techniques applied to device access and control

    International Nuclear Information System (INIS)

    Goetz, A.; Klotz, W.D.; Meyer, J.

    1992-01-01

    In this paper a model, called the device server model, has been presented for solving the problem of device access and control faced by all control systems. Object-oriented programming techniques were used to achieve a powerful yet flexible solution. The model provides a solution to the problem which hides device dependencies. It defines a software framework which has to be respected by implementors of device classes, which is very useful for developing groupware. The decision to implement remote access in the root class means that device servers can be easily integrated in a distributed control system. Many of the advantages and features of the device server model are due to the adoption of OOP techniques. The main conclusions that can be drawn from this paper are that (1) the device access and control problem is well suited to being solved with OOP techniques, and (2) OOP techniques offer a distinct advantage over traditional programming techniques for solving the device access problem. (J.P.N.)

  20. Diagnostic techniques applied in geostatistics for agricultural data analysis

    Directory of Open Access Journals (Sweden)

    Joelmir André Borssoi

    2009-12-01

    Full Text Available The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool to determine the parameters that define this structure, applied to the interpolation of values at unsampled points by kriging techniques. However, the estimation of parameters can be greatly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques in Gaussian spatial linear models in geostatistics to evaluate the sensitivity of maximum likelihood and restricted maximum likelihood estimators to small perturbations in these data. For this purpose, studies with simulated and experimental data were conducted. Results with simulated data showed that the diagnostic techniques were efficient at identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information from thematic maps.
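
    One step of such an analysis, fitting a variogram model whose parameters the diagnostics then probe for sensitivity, can be sketched as follows (the empirical variogram values are synthetic, and the exponential model is just one common choice):

        import numpy as np
        from scipy.optimize import curve_fit

        lags = np.array([10, 20, 30, 40, 60, 80, 100.0])            # metres
        gamma_hat = np.array([0.21, 0.35, 0.46, 0.52, 0.58, 0.60, 0.61])

        def exp_variogram(h, nugget, psill, vrange):
            return nugget + psill * (1 - np.exp(-h / vrange))

        p, _ = curve_fit(exp_variogram, lags, gamma_hat, p0=[0.1, 0.5, 30.0])
        print(dict(zip(["nugget", "partial sill", "range"], p.round(3))))
        # Re-fitting after deleting or perturbing single observations, the
        # diagnostic idea above, shows how sensitive these parameters (and hence
        # the kriged maps) are to atypical data.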

  1. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  2. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    Science.gov (United States)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed within the EU CORDEX project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional Climate Models (RCMs) nested within four different Global Climate Models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques for both of them (first moment correction; first and second moment correction; regression functions; quantile mapping using a distribution-derived transformation; and quantile mapping using empirical quantiles). Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose a non-equifeasible combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic and drought statistics of the historical series.
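
    One of the five transformation techniques named, quantile mapping with empirical quantiles, reduces to an interpolation through matched quantiles of the observed and modelled historical distributions. The sketch below uses synthetic series, not the Spain02 or CORDEX data:

        # Empirical quantile mapping used as a bias correction (illustrative).
        import numpy as np

        def quantile_map(obs_hist, mod_hist, mod_fut, n_q=100):
            """Map future model values through matched obs/model quantiles."""
            q = np.linspace(0.01, 0.99, n_q)
            mod_q = np.quantile(mod_hist, q)
            obs_q = np.quantile(obs_hist, q)
            # each future value is replaced by the observed value sitting at
            # the same quantile of the model's historical distribution
            return np.interp(mod_fut, mod_q, obs_q)

        rng = np.random.default_rng(0)
        obs_hist = rng.gamma(2.0, 3.0, 10_000)   # "observed" precipitation
        mod_hist = rng.gamma(2.0, 4.0, 10_000)   # biased model, same period
        mod_fut = rng.gamma(2.2, 4.4, 10_000)    # model, future horizon
        corrected = quantile_map(obs_hist, mod_hist, mod_fut)
        print(f"raw future mean {mod_fut.mean():.2f} -> "
              f"corrected {corrected.mean():.2f}")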

  3. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    This article assesses a stress analysis technique based on 3D models, comparing it with the traditional technique in which the model is built directly in the stress analysis program. The comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure of high complexity that allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database with the idealized model obtained using ANSYS, working directly from documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Each of the three databases will then be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research on the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  4. Meta-analysis in applied ecology.

    Science.gov (United States)

    Stewart, Gavin

    2010-02-23

    This overview examines research synthesis in applied ecology and conservation. Vote counting and pooling unweighted averages are widespread despite the superiority of syntheses based on weighted combination of effects. Such analyses allow exploration of methodological uncertainty in addition to consistency of effects across species, space and time, but exploring heterogeneity remains controversial. Meta-analyses are required to generalize in ecology, and to inform evidence-based decision-making, but the more sophisticated statistical techniques and registers of research used in other disciplines must be employed in ecology to fully realize their benefits.
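
    The weighted combination of effects the author advocates is, in its simplest fixed-effect form, an inverse-variance average; the numbers below are illustrative, not drawn from the review:

        # Fixed-effect (inverse-variance) meta-analysis on toy study data.
        import numpy as np

        effects = np.array([0.42, 0.10, 0.55, 0.31])    # per-study effect sizes
        variances = np.array([0.04, 0.01, 0.09, 0.02])  # per-study variances

        w = 1.0 / variances              # precise studies get more weight
        pooled = np.sum(w * effects) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        print(f"pooled effect = {pooled:.3f} +/- {1.96 * se:.3f} (95% CI)")

    Vote counting, by contrast, discards the effect magnitudes and precisions entirely, which is why the weighted synthesis is preferred.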

  5. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
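
    As a hedged illustration of reducing performance analysis to well-known mathematics (this generic least-squares fit is not the report's formulation, and the runs are invented), a runtime model can be regressed on configuration variables:

        # Least-squares runtime model over hypothetical configuration runs.
        import numpy as np

        # columns: MPI ranks, OpenMP pool size, problem size
        X = np.array([[16, 4, 1e6], [32, 4, 1e6], [32, 8, 2e6],
                      [64, 8, 2e6], [64, 16, 4e6], [128, 16, 4e6]])
        t = np.array([80.0, 45.0, 60.0, 35.0, 50.0, 30.0])  # runtimes (s)

        # features: constant overhead, communication ~ 1/ranks,
        # compute ~ work per (ranks * threads)
        A = np.column_stack([np.ones(len(t)), 1.0 / X[:, 0],
                             X[:, 2] / (X[:, 0] * X[:, 1])])
        coef, *_ = np.linalg.lstsq(A, t, rcond=None)
        print("model coefficients:", np.round(coef, 3))
        print("predicted runtimes:", np.round(A @ coef, 1))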

  6. X-ray diffraction technique applied for nano system metrology

    International Nuclear Information System (INIS)

    Kuznetsov, Alexei Yu.; Machado, Rogerio; Robertis, Eveline de; Campos, Andrea P.C.; Archanjo, Braulio S.; Gomes, Lincoln S.; Achete, Carlos A.

    2009-01-01

    The application of nanomaterials is growing fast in all industrial sectors, creating a strong need for nanometrology and standardization in the nanomaterial area. The great potential of the X-ray diffraction technique in this field is illustrated with the examples of metals, metal oxides and pharmaceuticals
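
    A standard XRD calculation of this kind is crystallite-size estimation from peak broadening via the Scherrer equation, D = K*lambda / (beta * cos(theta)); the numbers below are illustrative, not from the cited measurements:

        # Scherrer crystallite-size estimate from XRD peak broadening.
        import numpy as np

        K = 0.9                        # shape factor
        lam = 0.15406                  # Cu K-alpha wavelength, nm
        beta = np.radians(0.5)         # corrected peak FWHM, radians
        theta = np.radians(38.2 / 2)   # Bragg angle (half of 2-theta)

        D = K * lam / (beta * np.cos(theta))
        print(f"crystallite size ~ {D:.1f} nm")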

  7. The ordering operator technique applied to open systems

    International Nuclear Information System (INIS)

    Pedrosa, I.A.; Baseia, B.

    1982-01-01

    A normal ordering technique and the coherent representation are used to describe the time evolution of an open system consisting of a single oscillator linearly coupled to an infinite number of reservoir oscillators, and it is shown how to include dissipation and obtain the exponential decay. (Author)
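
    For orientation, the standard textbook result consistent with this setup (a single damped mode in the Born-Markov regime; not reproduced from the paper itself) is

        % damped-oscillator amplitude in the coherent representation
        \frac{d\langle\hat a\rangle}{dt}
          = -\Bigl(i\omega_0 + \tfrac{\gamma}{2}\Bigr)\langle\hat a\rangle
        \;\Longrightarrow\;
        \langle\hat a(t)\rangle
          = \langle\hat a(0)\rangle\, e^{-i\omega_0 t}\, e^{-\gamma t/2}

    so the coherent-state amplitude decays exponentially at the rate gamma/2 set by the reservoir coupling.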

  8. Eddy current technique applied to automated tube profilometry

    International Nuclear Information System (INIS)

    Dobbeni, D.; Melsen, C. van

    1982-01-01

    The use of eddy current methods in the first totally automated pre-service inspection of the internal diameter of PWR steam generator tubes is described. The technique was developed at Laborelec, the Belgian Laboratory of the Electricity Supply Industry. Details are given of the data acquisition system and of the automated manipulator. Representative tube profiles are illustrated. (U.K.)

  9. Flash radiographic technique applied to fuel injector sprays

    International Nuclear Information System (INIS)

    Vantine, H.C.

    1977-01-01

    A flash radiographic technique, using 50 ns exposure times, was used to study the pattern and density distribution of a fuel injector spray. The experimental apparatus and method are described. An 85 kVp flash x-ray generator, designed and fabricated at the Lawrence Livermore Laboratory, is utilized. Radiographic images, recorded on standard x-ray films, are digitized and computer processed

  10. Machine-learning techniques applied to antibacterial drug discovery.

    Science.gov (United States)

    Durrant, Jacob D; Amaro, Rommie E

    2015-01-01

    The emergence of drug-resistant bacteria threatens to revert humanity back to the preantibiotic era. Even now, multidrug-resistant bacterial infections annually result in millions of hospital days, billions in healthcare costs, and, most importantly, tens of thousands of lives lost. As many pharmaceutical companies have abandoned antibiotic development in search of more lucrative therapeutics, academic researchers are uniquely positioned to fill the pipeline. Traditional high-throughput screens and lead-optimization efforts are expensive and labor intensive. Computer-aided drug-discovery techniques, which are cheaper and faster, can accelerate the identification of novel antibiotics, leading to improved hit rates and faster transitions to preclinical and clinical testing. The current review describes two machine-learning techniques, neural networks and decision trees, that have been used to identify experimentally validated antibiotics. We conclude by describing the future directions of this exciting field. © 2015 John Wiley & Sons A/S.
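
    A minimal sketch of one of the two techniques reviewed, a decision tree trained on hypothetical molecular descriptors (not the authors' models or data), might look like this:

        # Decision-tree activity classifier on synthetic descriptors.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))    # e.g. logP, MW, H-bond donors, TPSA
        # synthetic "active/inactive" label driven by two descriptors
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200)) > 0

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")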

  11. Investigation of the shear bond strength to dentin of universal adhesives applied with two different techniques

    Directory of Open Access Journals (Sweden)

    Elif Yaşa

    2017-09-01

    Objective: The aim of this study was to evaluate the shear bond strength to dentin of universal adhesives applied with self-etch and etch&rinse techniques. Materials and Method: Forty-eight sound extracted human third molars were used in this study. Occlusal enamel was removed in order to expose the dentin surface, and the surface was flattened. Specimens were randomly divided into four groups and were sectioned vestibulo-lingually using a diamond disc. The universal adhesives All Bond Universal (Groups 1a and 1b), Gluma Bond Universal (Groups 2a and 2b) and Single Bond Universal (Groups 3a and 3b) were applied onto the tooth specimens either with the self-etch technique (a) or with the etch&rinse technique (b), according to the manufacturers' instructions. Clearfil SE Bond (Group 4a; self-etch) and Optibond FL (Group 4b; etch&rinse) were used as control groups. The specimens were then restored with a nanohybrid composite resin (Filtek Z550). After thermocycling, the shear bond strength test was performed with a universal test machine at a crosshead speed of 0.5 mm/min. Fracture analysis was done under a stereomicroscope (x40 magnification). Data were analyzed using two-way ANOVA and post-hoc Tukey tests. Results: Statistical analysis showed significant differences in shear bond strength values between the universal adhesives (p<0.05). Significantly higher bond strength values were observed in the self-etch groups (a) than in the etch&rinse groups (b) (p<0.05). Among all groups, Single Bond Universal showed the greatest shear bond strength values, whereas All Bond Universal showed the lowest with both application techniques. Conclusion: The dentin bond strength of universal adhesives applied with different techniques may vary depending on the adhesive material. For the universal bonding agents tested in this study, the etch&rinse technique negatively affected the bond strength to dentin.
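
    The reported statistical pipeline (two-way ANOVA followed by Tukey tests) can be sketched as follows; the group means and sample sizes are synthetic stand-ins, not the study's data:

        # Two-way ANOVA + Tukey post-hoc on synthetic bond-strength data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        rng = np.random.default_rng(2)
        rows = [(a, t, rng.normal(20 + 2 * i - 3 * j, 2))
                for i, a in enumerate(["AllBond", "Gluma", "SingleBond"])
                for j, t in enumerate(["self_etch", "etch_rinse"])
                for _ in range(8)]
        df = pd.DataFrame(rows, columns=["adhesive", "technique", "mpa"])

        model = smf.ols("mpa ~ C(adhesive) * C(technique)", data=df).fit()
        print(anova_lm(model, typ=2))                   # main effects, interaction
        print(pairwise_tukeyhsd(df["mpa"], df["adhesive"]))  # pairwise contrasts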

  12. Applied decision analysis and risk evaluation

    International Nuclear Information System (INIS)

    Ferse, W.; Kruber, S.

    1995-01-01

    During 1994 the workgroup 'Applied Decision Analysis and Risk Evaluation' continued the work on the knowledge-based decision support system XUMA-GEFA for the evaluation of the hazard potential of contaminated sites. Additionally, a new research direction was started which aims at supporting a later stage of the treatment of contaminated sites: the clean-up decision. For the support of decisions arising at this stage, the methods of decision analysis will be used. Computational aids for evaluation and decision support were implemented, and a case study at a waste disposal site in Saxony, which turns out to pose a danger to the surrounding groundwater resource, was initiated. (orig.)

  13. Enhanced nonlinear iterative techniques applied to a nonequilibrium plasma flow

    International Nuclear Information System (INIS)

    Knoll, D.A.

    1998-01-01

    The authors study the application of enhanced nonlinear iterative methods to the steady-state solution of a system of two-dimensional convection-diffusion-reaction partial differential equations that describe the partially ionized plasma flow in the boundary layer of a tokamak fusion reactor. This system of equations is characterized by multiple time and spatial scales and contains highly anisotropic transport coefficients due to a strong imposed magnetic field. They use Newton's method to linearize the nonlinear system of equations resulting from an implicit, finite volume discretization of the governing partial differential equations, on a staggered Cartesian mesh. The resulting linear systems are neither symmetric nor positive definite, and are poorly conditioned. Preconditioned Krylov iterative techniques are employed to solve these linear systems. They investigate both a modified and a matrix-free Newton-Krylov implementation, with the goal of reducing CPU cost associated with the numerical formation of the Jacobian. A combination of a damped iteration, mesh sequencing, and a pseudotransient continuation technique is used to enhance global nonlinear convergence and CPU efficiency. GMRES is employed as the Krylov method with incomplete lower-upper (ILU) factorization preconditioning. The goal is to construct a combination of nonlinear and linear iterative techniques for this complex physical problem that optimizes trade-offs between robustness, CPU time, memory requirements, and code complexity. It is shown that a mesh sequencing implementation provides significant CPU savings for fine grid calculations. Performance comparisons of modified Newton-Krylov and matrix-free Newton-Krylov algorithms will be presented
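
    The matrix-free idea is that GMRES only needs Jacobian-vector products, which can be approximated by finite differences of the residual, so the Jacobian is never formed. A toy sketch under that assumption (a 1-D model problem, not the plasma equations, and without the mesh sequencing or ILU preconditioning described):

        # Matrix-free Newton-Krylov on a toy 1-D nonlinear problem.
        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def F(u):                    # residual of -u'' + u**3 - 1 = 0
            r = np.empty_like(u)
            r[1:-1] = -(u[2:] - 2 * u[1:-1] + u[:-2]) + u[1:-1] ** 3 - 1.0
            r[0], r[-1] = u[0], u[-1]          # Dirichlet boundaries
            return r

        u = np.zeros(64)
        for it in range(20):
            r = F(u)
            if np.linalg.norm(r) < 1e-10:
                break
            eps = 1e-7
            # Jacobian-vector product by finite differences: J v ~ (F(u+ev)-F(u))/e
            Jv = LinearOperator((u.size, u.size),
                                matvec=lambda v: (F(u + eps * v) - r) / eps)
            du, info = gmres(Jv, -r)           # inner Krylov solve
            u += du                            # damping would add robustness
        print(f"converged in {it} iterations, ||F|| = {np.linalg.norm(F(u)):.2e}")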

  14. Applying NISHIJIN historical textile technique for e-Textile.

    Science.gov (United States)

    Kuroda, Tomohiro; Hirano, Kikuo; Sugimura, Kazushige; Adachi, Satoshi; Igarashi, Hidetsugu; Ueshima, Kazuo; Nakamura, Hideo; Nambu, Masayuki; Doi, Takahiro

    2013-01-01

    The e-Textile is a key technology for continuous ambient health monitoring to increase the quality of life of patients with chronic diseases. The authors introduce techniques from NISHIJIN, a historical Japanese textile tradition that can weave almost any pattern from one continuous yarn within a machine weaving process suitable for mixed-flow production. Thus, NISHIJIN is well suited to e-Textile production, which requires rapid prototyping and mass production of very complicated patterns. The authors prototyped and evaluated several vests for taking twelve-lead electrocardiograms. The results show that the prototypes obtain electrocardiograms of sufficient quality for diagnosis.

  15. Applying the digital-image-correlation technique to measure the ...

    Indian Academy of Sciences (India)

    ... shown in figure 4, all of these four columns have identical cross-sections ... then used to evaluate the lateral displacement, storey drift ratio, and rotation and curvature of ... analysis for a school building retrofitted with a steel-framing system ...

  16. Neutron activation: an invaluable technique for teaching applied radiation

    International Nuclear Information System (INIS)

    Trainer, Matthew

    2002-01-01

    This experiment introduces students to the important method of neutron activation. A sample of aluminium was irradiated with neutrons from an isotropic 241Am-Be source. Using gamma-ray spectroscopy, two radionuclide products were identified as 27Mg and 28Al. Applying a cadmium cut-off filter and an optimum irradiation time of 45 min, the half-life of 27Mg was determined as 9.46±0.50 min. The half-life of the 28Al radionuclide was determined as 2.28±0.10 min using a polythene moderator and an optimum irradiation time of 10 min. (author)
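
    The quoted half-lives follow from the decay law A(t) = A0*exp(-lambda*t), so ln(counts) is linear in time with slope -lambda and t_half = ln2/lambda. A worked fit on synthetic counts (not the article's data):

        # Half-life from a log-linear fit to decay counts.
        import numpy as np

        t = np.arange(0, 40, 2.0)                 # minutes after irradiation
        lam_true = np.log(2) / 9.46               # 27Mg-like decay constant
        rng = np.random.default_rng(3)
        counts = 5000 * np.exp(-lam_true * t) * rng.normal(1, 0.02, t.size)

        slope, _ = np.polyfit(t, np.log(counts), 1)
        print(f"fitted half-life = {np.log(2) / -slope:.2f} min (true 9.46)")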

  17. Compressed Sensing Techniques Applied to Ultrasonic Imaging of Cargo Containers

    Directory of Open Access Journals (Sweden)

    Yuri Álvarez López

    2017-01-01

    One of the key issues in the fight against the smuggling of goods has been the development of scanners for cargo inspection. X-ray-based radiographic scanners are the most developed sensing modality; however, they are costly and use bulky sources that emit hazardous, ionizing radiation. Aiming to improve the probability of threat detection, an ultrasonic-based technique has been proposed that is capable of detecting the footprint of metallic containers or compartments concealed within the metallic structure of the inspected cargo. The system consists of an array of acoustic transceivers attached to the metallic structure under inspection, creating a guided acoustic Lamb wave. Reflections due to discontinuities are detected in the images provided by an imaging algorithm. Taking into consideration that the majority of those images are sparse, this contribution analyzes the application of Compressed Sensing (CS) techniques in order to reduce the number of measurements needed, thus achieving faster scanning without compromising the detection capabilities of the system. A parametric study of the image quality as a function of the samples needed in the spatial and frequency domains is presented, as well as the dependence on the sampling pattern. For this purpose, realistic cargo inspection scenarios have been simulated.
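
    A minimal sparse-recovery sketch of the CS idea (generic iterative soft-thresholding on a synthetic scene, not the paper's Lamb-wave imaging operator):

        # ISTA for l1-regularised recovery from few measurements.
        import numpy as np

        rng = np.random.default_rng(4)
        n, m, k = 256, 64, 5            # scene size, measurements, sparsity
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
        A = rng.normal(size=(m, n)) / np.sqrt(m)
        y = A @ x_true                  # m << n measurements

        L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient
        lam, x = 0.01, np.zeros(n)
        for _ in range(500):
            g = x - (A.T @ (A @ x - y)) / L                      # gradient step
            x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0)  # soft threshold
        err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
        print(f"relative recovery error: {err:.3f}")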

  18. Microscale and nanoscale strain mapping techniques applied to creep of rocks

    Science.gov (United States)

    Quintanilla-Terminel, Alejandra; Zimmerman, Mark E.; Evans, Brian; Kohlstedt, David L.

    2017-07-01

    Usually several deformation mechanisms interact to accommodate plastic deformation. Quantifying the contribution of each to the total strain is necessary to bridge the gaps from observations of microstructures, to geomechanical descriptions, to extrapolating from laboratory data to field observations. Here, we describe the experimental and computational techniques involved in microscale strain mapping (MSSM), which allows strain produced during high-pressure, high-temperature deformation experiments to be tracked with high resolution. MSSM relies on the analysis of the relative displacement of initially regularly spaced markers after deformation. We present two lithography techniques used to pattern rock substrates at different scales: photolithography and electron-beam lithography. Further, we discuss the challenges of applying the MSSM technique to samples used in high-temperature and high-pressure experiments. We applied the MSSM technique to a study of strain partitioning during creep of Carrara marble and grain boundary sliding in San Carlos olivine, synthetic forsterite, and Solnhofen limestone at a confining pressure, Pc, of 300 MPa and homologous temperatures, T/Tm, of 0.3 to 0.6. The MSSM technique works very well up to temperatures of 700 °C. The experimental developments described here show promising results for higher-temperature applications.
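
    The post-processing step behind marker-based strain mapping can be sketched generically: displacement gradients of the initially regular marker grid give the small-strain tensor. The displacement field below is synthetic, not MSSM data:

        # Small-strain tensor from displacements of a regular marker grid.
        import numpy as np

        pitch = 10.0                                  # marker spacing, um
        ux = np.fromfunction(lambda i, j: 0.02 * j * pitch, (20, 20))   # 2% in x
        uy = np.fromfunction(lambda i, j: -0.01 * i * pitch, (20, 20))  # -1% in y

        dux_dy, dux_dx = np.gradient(ux, pitch)       # axis 0 = y, axis 1 = x
        duy_dy, duy_dx = np.gradient(uy, pitch)
        exx, eyy = dux_dx, duy_dy
        exy = 0.5 * (dux_dy + duy_dx)
        print(f"mean strains: exx={exx.mean():.3f}, "
              f"eyy={eyy.mean():.3f}, exy={exy.mean():.3f}")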

  19. Fourier convergence analysis applied to neutron diffusion Eigenvalue problem

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Noh, Jae Man; Joo, Hyung Kook

    2004-01-01

    Fourier error analysis has been a standard technique for the stability and convergence analysis of linear and nonlinear iterative methods. Though such methods can be applied to eigenvalue problems too, all Fourier convergence analyses have been performed only for fixed-source problems, and a Fourier convergence analysis for an eigenvalue problem has never been reported. Lee et al. proposed new 2-D/1-D coupling methods and showed that the new ones are unconditionally stable, while one of the two existing ones is unstable at small mesh sizes, and that the new ones are better than the existing ones in terms of convergence rate. In this paper the convergence of method A in reference 4 for the diffusion eigenvalue problem was analyzed by Fourier analysis. The Fourier convergence analysis presented in this paper is, to the best of our knowledge, the first one applied to a neutronics eigenvalue problem.
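
    In generic form (not the specific operators of the coupling methods analyzed), the technique inserts a Fourier mode into the linearised error-propagation equation and bounds the growth factor:

        % generic Fourier (von Neumann) convergence ansatz
        \epsilon_j^{(m)} = \xi^{m}\, e^{i\omega j h}
        \quad\Longrightarrow\quad
        \epsilon_j^{(m+1)} = \xi(\omega)\,\epsilon_j^{(m)},
        \qquad
        \text{convergence} \iff \sup_{\omega}\,\lvert\xi(\omega)\rvert < 1

    The novelty claimed here is carrying this analysis through when the iteration also updates the eigenvalue, rather than for a fixed source.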

  20. Nuclear reactor vessel surface inspecting technique applying electric resistance probe

    International Nuclear Information System (INIS)

    Yamaguchi, T.; Enami, K.; Yoshioka, M.

    1975-01-01

    A new technique for inspecting the inner surface of the PWR type nuclear reactor vessel by use of an electric resistance probe is introduced, centering on a data processing system. This system is composed of a mini-computer, a system typewriter, an interface unit, a D-A converter and controller, and X-Y recorder and others. Its functions are judging flaws and making flaw detection maps. In order to judge flaws by flaw detection signals, three kinds of flaw judging methods have been developed. In case there is a flaw, its position and depth are calculated and listed on the system typewriter. The flaw detection maps are expressed in four kinds of modes and they are displayed on the X-Y recorder. (auth.)

  1. Neoliberal Optimism: Applying Market Techniques to Global Health.

    Science.gov (United States)

    Mei, Yuyang

    2017-01-01

    Global health and neoliberalism are becoming increasingly intertwined as organizations utilize markets and profit motives to solve the traditional problems of poverty and population health. I use field work conducted over 14 months in a global health technology company to explore how the promise of neoliberalism re-envisions humanitarian efforts. In this company's vaccine refrigerator project, staff members expect their investors and their market to allow them to achieve scale and develop accountability to their users in developing countries. However, the translation of neoliberal techniques to the global health sphere falls short of the ideal, as profits are meager and purchasing power remains with donor organizations. The continued optimism in market principles amidst such a non-ideal market reveals the tenacious ideological commitment to neoliberalism in these global health projects.

  2. Nonequilibrium Green function techniques applied to hot electron quantum transport

    International Nuclear Information System (INIS)

    Jauho, A.P.

    1989-01-01

    During the last few years considerable effort has been devoted to deriving quantum transport equations for semiconductors under extreme conditions (high electric fields, spatial quantization in one or two directions). Here we review the results obtained with nonequilibrium Green function techniques as formulated by Baym and Kadanoff, or by Keldysh. In particular, the following topics are discussed: (i) systematic approaches to reduce the transport equation governing the correlation function to a transport equation for the Wigner function; (ii) approximations reducing the non-Markovian quantum transport equation to a numerically tractable form, and results for model semiconductors; (iii) recent progress in extending the formalism to inhomogeneous systems; and (iv) nonequilibrium screening. In all sections we try to direct the reader's attention to points where the present understanding is (at best) incomplete, and indicate possible lines for future work. (orig.)

  3. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    Science.gov (United States)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active-learning course entitled 'Mineral Resources and the Environment' was introduced at the University of Puget Sound. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects on the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think-pair-share and take-home point summaries. Summative assessment included discussion leadership, exams, homework assignments, group projects, in-class exercises, field trips, and pre-discussion reading exercises.

  4. Quantitative Portfolio Optimization Techniques Applied to the Brazilian Stock Market

    Directory of Open Access Journals (Sweden)

    André Alves Portela Santos

    2012-09-01

    In this paper we assess the out-of-sample performance of two alternative quantitative portfolio optimization techniques - mean-variance and minimum-variance optimization - and compare their performance with respect to a naive 1/N (equally-weighted) portfolio and also to the market portfolio given by the Ibovespa. We focus on short-selling-constrained portfolios and consider alternative estimators for the covariance matrices: the sample covariance matrix, RiskMetrics, and three covariance estimators proposed by Ledoit and Wolf (2003, 2004a, 2004b). Taking into account alternative portfolio re-balancing frequencies, we compute out-of-sample performance statistics which indicate that the quantitative approaches delivered improved results in terms of lower portfolio volatility and better risk-adjusted returns. Moreover, the use of more sophisticated estimators for the covariance matrix generated optimal portfolios with lower turnover over time.
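
    The minimum-variance leg of the comparison has a closed form in the unconstrained case, w = S^-1 1 / (1' S^-1 1); a sketch with a Ledoit-Wolf shrinkage covariance on synthetic returns (the paper additionally rules out short selling, which requires a constrained solver):

        # Unconstrained minimum-variance weights with shrinkage covariance.
        import numpy as np
        from sklearn.covariance import LedoitWolf

        rng = np.random.default_rng(5)
        returns = rng.normal(0.0005, 0.02, size=(500, 10))  # 500 days, 10 assets

        S = LedoitWolf().fit(returns).covariance_
        ones = np.ones(S.shape[0])
        w = np.linalg.solve(S, ones)
        w /= w.sum()          # weights may be negative without the constraint
        print("weights:", np.round(w, 3))
        print("portfolio vol:", float(np.sqrt(w @ S @ w)))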

  5. Considerations in applying on-line IC techniques to BWR's

    International Nuclear Information System (INIS)

    Kaleda, R.J.

    1992-01-01

    Ion chromatography (IC) has moved from its traditional role as a laboratory analytical tool to a real-time, dynamic, on-line measurement device used to follow ppb and sub-ppb concentrations of deleterious impurities in nuclear power plants. The Electric Power Research Institute (EPRI), individual utilities, and industry all have played significant roles in effecting the transition. This paper highlights considerations and the evolution of current on-line ion chromatography systems. The first applications of on-line techniques were demonstrated by General Electric (GE) under EPRI sponsorship at the Rancho Seco (1980), Calvert Cliffs, and McGuire nuclear units. The primary use was for diagnostic purposes. Today on-line IC applications have been expanded to include process control and routine plant monitoring. Current on-line ICs are innovative in design, promote operational simplicity, are modular for simplified maintenance and repair, and use field-proven components which enhance reliability. Conductivity detection with electronic or chemical suppression and spectrometric detection techniques are intermixed in applications. Remote multi-point sample systems have addressed memory effects. Early applications measured ionic species in the part-per-billion range; today reliable part-per-trillion measurements are common for on-line systems. Current systems are meeting the challenge of EPRI guideline requirements. Today's on-line ICs, with programmed sampling systems, monitor fluid streams throughout a power plant, supplying data that can be trended, stored and retrieved easily. The on-line IC has come of age; many technical challenges were overcome to achieve today's systems.

  6. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with the random, stochastic, statistical, or probabilistic. Since the early sixties, views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool or method chosen to model uncertainty in a specific context should be selected by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  7. Feasibility of Applying Controllable Lubrication Techniques to Reciprocating Machines

    DEFF Research Database (Denmark)

    Pulido, Edgar Estupinan

    ... of the reciprocating engine, obtained with the help of multibody dynamics (rigid components) and the finite element method (flexible components), and the global system of equations is solved numerically. The analysis of the results was carried out with focus on the behaviour of the journal orbits and the maximum fluid film ...

  8. Optimization technique applied to interpretation of experimental data and research of constitutive laws

    International Nuclear Information System (INIS)

    Grossette, J.C.

    1982-01-01

    The feasibility of an identification technique applied to one-dimensional numerical analysis of the split-Hopkinson pressure bar experiment is proven. A general 1-D elastic-plastic-viscoplastic computer program was written to give an adequate solution for the elastic-plastic-viscoplastic response of a pressure bar subjected to a general Heaviside step loading function in time, applied over one end of the bar. Special emphasis is placed on the response of the specimen during the first microseconds, where no equilibrium conditions can be stated. During this transient phase, discontinuity conditions related to wave propagation are encountered and must be carefully taken into account. Having derived an adequate numerical model, the Pontryagin identification technique has been applied in such a way that the unknowns are physical parameters. The solutions depend mainly on the selection of a class of proper eigen objective functionals (cost functions) which may be combined so as to obtain a convenient numerical objective function. A number of significant questions arising in the choice of parameter-adjustment algorithms are discussed. In particular, this technique leads to a two-point boundary-value problem which has been solved using an iterative gradient-like technique usually referred to as a double-operator gradient method. This method combines the classical Fletcher-Powell technique and a partial quadratic technique with automatic parameter step-size selection, and is much more efficient than the usual ones. Numerical experimentation with simulated data was performed to test the accuracy and stability of the identification algorithm and to determine the most adequate type and quantity of data for estimation purposes
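
    The least-squares identification loop can be sketched generically (scipy's trust-region solver standing in for the Pontryagin/gradient machinery, with an invented toy constitutive law and synthetic "measurements"):

        # Generic inverse identification of material parameters.
        import numpy as np
        from scipy.optimize import least_squares

        def model(params, strain_rate, t):
            sigma_y, eta = params            # toy yield stress and viscosity
            return sigma_y + eta * strain_rate * (1 - np.exp(-t))

        t = np.linspace(0, 5, 50)
        true = np.array([250.0, 40.0])
        rng = np.random.default_rng(6)
        measured = model(true, 1e3, t) + rng.normal(0, 50.0, t.size)

        fit = least_squares(lambda p: model(p, 1e3, t) - measured,
                            x0=[100.0, 10.0])
        print("identified parameters:", np.round(fit.x, 1), "(true:", true, ")")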

  9. Schlieren Technique Applied to Magnetohydrodynamic Generator Plasma Torch

    Science.gov (United States)

    Chopra, Nirbhav; Pearcy, Jacob; Jaworski, Michael

    2017-10-01

    Magnetohydrodynamic (MHD) generators are a promising augmentation to current hydrocarbon-based combustion schemes for creating electrical power. In recent years, interest in MHD generators has been revitalized due to advances in a number of technologies such as superconducting magnets, solid-state power electronics and materials science, as well as changing economics associated with carbon capture, utilization, and sequestration. We use a multi-wavelength schlieren imaging system to evaluate electron density independently of gas density in a plasma torch under conditions relevant to MHD generators. The sensitivity and resolution of the optical system are evaluated alongside the development of an automated analysis and calibration program in Python. Preliminary analysis shows spatial resolutions of less than 1 mm and measures an electron density of n_e = 1x10^16 cm^-3 in an atmospheric microwave torch. Work supported by DOE contract DE-AC02-09CH11466.

  10. Applying Subject Matter Expertise (SME) Elicitation Techniques to TRAC Studies

    Science.gov (United States)

    2014-09-30

    ... "prioritisation, budgeting and resource allocation with multi-criteria decision analysis and decision conferencing" (Annals of Operations Research) ... Typically, in responding to survey items, experts are not expected to elaborate beyond providing responses in the format requested ... For example, an electronic jamming device might disrupt a cell phone signal at certain ...

  11. Strategic decision analysis applied to borehole seismology

    International Nuclear Information System (INIS)

    Menke, M.M.; Paulsson, B.N.P.

    1994-01-01

    Strategic Decision Analysis (SDA) is the evolving body of knowledge on how to achieve high quality in the decisions that shape an organization's future. SDA comprises philosophy, process concepts, methodology, and tools for making good decisions. It specifically incorporates many concepts and tools from economic evaluation and risk analysis. Chevron Petroleum Technology Company (CPTC) has applied SDA to evaluate and prioritize a number of its most important and most uncertain R and D projects, including borehole seismology. Before SDA, there were significant issues and concerns about the value to CPTC of continuing to work on borehole seismology. The SDA process created a cross-functional team of experts to structure and evaluate this project. A credible economic model was developed, discrete risks and continuous uncertainties were assessed, and an extensive sensitivity analysis was performed. The results, even applied to a very restricted drilling program for a few years, were good enough to demonstrate the value of continuing the project. This paper explains the SDA philosophy, concepts, and process, and demonstrates the methodology and tools using the borehole seismology project as an example. SDA is useful in the upstream industry not just for R and D/technology decisions, but also for major exploration and production decisions. Since a major challenge for upstream companies today is to create and realize value, the SDA approach should have very broad applicability.

  12. Acoustic Emission Technique Applied in Textiles Mechanical Characterization

    Directory of Open Access Journals (Sweden)

    Rios-Soberanis Carlos Rolando

    2017-01-01

    The common textile architectures/geometries are woven, braided, knitted, stitch-bonded, and Z-pinned. Fibres in textile form exhibit good out-of-plane properties and good fatigue and impact resistance; additionally, they have better dimensional stability and conformability. Besides the nature of the textile, the architecture plays a great role in the mechanical behaviour and mechanisms of damage in textiles, and therefore damage mechanisms and the mechanical performance of textiles in structural applications have been a major concern. Mechanical damage occurs to a large extent during the service lifetime, so it is vital to understand the material's mechanical behaviour by identifying its mechanisms of failure, such as the onset of damage and crack generation and propagation. In this work, textiles of different architectures were used to manufacture epoxy-based composites in order to study failure events under tensile load by using the acoustic emission (AE) technique, a powerful characterization tool owing to the link between AE data and fracture mechanics, which makes this relation very useful from the engineering point of view.

  13. A Kalman filter technique applied for medical image reconstruction

    International Nuclear Information System (INIS)

    Goliaei, S.; Ghorshi, S.; Manzuri, M. T.; Mortazavi, M.

    2011-01-01

    Medical images contain information about vital organic tissues inside the human body and are widely used for diagnosis of disease or for surgical purposes. Image reconstruction is essential for medical images in applications such as suppression of noise or de-blurring, in order to provide images with better quality and contrast. Due to the vital role of image reconstruction in medical sciences, algorithms with better efficiency and higher speed are desirable. Most image reconstruction algorithms operate in the frequency domain, the most popular being filtered back projection. In this paper we introduce a Kalman filter technique, operating in the time domain, for medical image reconstruction. Results indicated that, as the number of projections increases, for both normal collected ray sums and ray sums corrupted by noise, the quality of the reconstructed image improves in terms of contrast and transparency. It is also seen that as the number of projections increases, the error index decreases.
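
    A scalar predict/update Kalman cycle, the core of the technique, can be sketched as follows (an illustrative random-walk state model applied to a synthetic noisy projection profile, not the paper's full reconstruction):

        # Scalar Kalman filter smoothing a noisy 1-D projection signal.
        import numpy as np

        rng = np.random.default_rng(7)
        true = np.sin(np.linspace(0, 2 * np.pi, 100)) + 2.0   # smooth profile
        z = true + rng.normal(0, 0.3, true.size)              # noisy ray sums

        x, P = z[0], 1.0          # state estimate and its variance
        Q, R = 1e-3, 0.3 ** 2     # process and measurement noise variances
        est = []
        for zk in z:
            P = P + Q                     # predict (random-walk model)
            K = P / (P + R)               # Kalman gain
            x = x + K * (zk - x)          # update with the innovation
            P = (1 - K) * P
            est.append(x)
        rms = lambda a: np.sqrt(np.mean((a - true) ** 2))
        print(f"rms error raw {rms(z):.3f} -> filtered {rms(np.array(est)):.3f}")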

  14. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features: multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based sources and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities, for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques
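
    The sensitivity of reactor NAA rests on the standard activation equation, A = N * sigma * phi * (1 - exp(-lambda * t_irr)). A worked example with textbook constants for 27Al(n,gamma)28Al and an illustrative flux and sample mass:

        # Induced activity from the standard activation equation.
        import numpy as np

        N_A = 6.022e23
        m, M, theta = 1e-3, 26.98, 1.0     # 1 mg of Al, pure 27Al
        sigma = 0.231e-24                  # thermal (n,gamma) cross-section, cm^2
        phi = 1e12                         # thermal flux, n cm^-2 s^-1
        t_half = 2.245 * 60                # 28Al half-life, s
        lam = np.log(2) / t_half

        N = m / M * theta * N_A            # number of target nuclei
        A = N * sigma * phi * (1 - np.exp(-lam * 600))  # 10 min irradiation
        print(f"28Al activity after 10 min: {A:.3e} Bq")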

  15. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  16. Advanced Gradient Based Optimization Techniques Applied on Sheet Metal Forming

    International Nuclear Information System (INIS)

    Endelt, Benny; Nielsen, Karl Brian

    2005-01-01

    The computational costs of finite element simulations of general sheet metal forming processes are considerable, especially measured in time. In combination with optimization, the performance of the optimization algorithm is crucial for the overall performance of the system, i.e. the optimization algorithm should gain as much information about the system in each iteration as possible. A least-squares formulation of the objective function is widely applied to the solution of inverse problems, due to the superior performance of this formulation. In this work the focus is on small problems, defined as problems with fewer than 1000 design parameters, as the majority of real-life optimization and inverse problems represented in the literature can be characterized as small problems, typically with fewer than 20 design parameters. We show that the least-squares formulation is well suited for two classes of inverse problems: identification of constitutive parameters and process optimization. The scalability and robustness of the approach are illustrated through a number of process optimizations and inverse material characterization problems: tube hydroforming, two-step hydroforming, flexible aluminium tubes, and inverse identification of material parameters

  17. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    ... nutritional status and metabolic phenotype. We want to understand how metabolomic spectra can be analysed using functional data analysis to detect the influence of different factors on specific metabolites. These factors can include, for example, gender, diet culture or dietary intervention. In Paper I we apply ... representation of each spectrum. Subset selection of wavelet coefficients generates the input to mixed models. Mixed-model methodology enables us to take the study design into account while modelling covariates. Bootstrap-based inference preserves the correlation structure between curves and enables the estimation ...

  18. Applying data mining techniques to improve diagnosis in neonatal jaundice

    Directory of Open Access Journals (Sweden)

    Ferreira Duarte

    2012-12-01

    Background: Hyperbilirubinemia is emerging as an increasingly common problem in newborns due to a decreasing hospital length of stay after birth. Jaundice is the most common disease of the newborn and, although benign in most cases, it can lead to severe neurological consequences if poorly evaluated. In different areas of medicine, data mining has contributed to improving the results obtained with other methodologies. Hence, the aim of this study was to improve the diagnosis of neonatal jaundice with the application of data mining techniques. Methods: This study followed the phases of the Cross Industry Standard Process for Data Mining model as its methodology. This observational study was performed at the Obstetrics Department of a central hospital (Centro Hospitalar Tâmega e Sousa - EPE), from February to March of 2011. A total of 227 healthy newborn infants with 35 or more weeks of gestation were enrolled in the study. Over 70 variables were collected and analyzed. Also, transcutaneous bilirubin levels were measured from birth to hospital discharge, with maximum time intervals of 8 hours between measurements, using a noninvasive bilirubinometer. Different attribute subsets were used to train and test classification models using algorithms included in the Weka data mining software, such as decision trees (J48) and neural networks (multilayer perceptron). The accuracy results were compared with the traditional methods for prediction of hyperbilirubinemia. Results: The application of different classification algorithms to the collected data allowed predicting subsequent hyperbilirubinemia with high accuracy. In particular, at 24 hours of life, the accuracy for the prediction of hyperbilirubinemia was 89%. The best results were obtained using the following algorithms: naive Bayes, multilayer perceptron and simple logistic. Conclusions: The findings of our study indicate that new approaches, such as data mining, may support ...

  19. Applying perceptual and adaptive learning techniques for teaching introductory histopathology

    Directory of Open Access Journals (Sweden)

    Sally Krasne

    2013-01-01

    Background: Medical students are expected to master the ability to interpret histopathologic images, a difficult and time-consuming process. A major problem is the issue of transferring information learned from one example of a particular pathology to a new example. Recent advances in cognitive science have identified new approaches to address this problem. Methods: We adapted a new approach for enhancing pattern recognition of basic pathologic processes in skin histopathology images that utilizes perceptual learning techniques, allowing learners to see relevant structure in novel cases, along with adaptive learning algorithms that space and sequence the different categories (e.g., diagnoses) appearing during a learning session based on each learner's accuracy and response time (RT). We developed a perceptual and adaptive learning module (PALM) that utilized 261 unique images of cell injury, inflammation, neoplasia, or normal histology at low and high magnification. Accuracy and RT were tracked and integrated into a "Score" that reflected students' rapid recognition of the pathologies, and pre- and post-tests were given to assess the effectiveness. Results: Accuracy, RT and Scores significantly improved from the pre- to post-test, with Scores showing much greater improvement than accuracy alone. Delayed post-tests with previously unseen cases, given after 6-7 weeks, showed a decline in accuracy relative to the post-test for 1st-year students, but not significantly so for 2nd-year students. However, the delayed post-test scores maintained a significant and large improvement relative to those of the pre-test for both 1st- and 2nd-year students, suggesting good retention of pattern recognition. Student evaluations were very favorable. Conclusion: A web-based learning module based on the principles of cognitive science showed evidence of improved recognition of histopathology patterns by medical students.

  20. Thermal analysis applied to irradiated propolis

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; Mastro, N.L. del E-mail: nelida@usp.br

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, of great importance for technological applications. Ground propolis samples were gamma irradiated with 0 and 10 kGy from a 60Co source. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600 deg. C. Similarly, through differential scanning calorimetry, a coincidence of the melting points of irradiated and unirradiated samples was found. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  1. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  2. Applied spectrophotometry: analysis of a biochemical mixture.

    Science.gov (United States)

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference between determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) and determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience, and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. Copyright © 2013 Wiley Periodicals, Inc.
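
    For a two-component mixture measured at two wavelengths, the Beer-Lambert analysis reduces to a 2x2 linear system A = E c (path length 1 cm); the molar absorptivities and absorbances below are illustrative:

        # Solving a two-component Beer-Lambert mixture.
        import numpy as np

        # rows: wavelengths; columns: species X, Y (epsilon, M^-1 cm^-1)
        E = np.array([[15000.0, 3000.0],
                      [2000.0, 9000.0]])
        A = np.array([0.80, 0.52])      # measured mixture absorbances

        c = np.linalg.solve(E, A)       # concentrations, M (l = 1 cm)
        print(f"[X] = {c[0]*1e6:.1f} uM, [Y] = {c[1]*1e6:.1f} uM")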

  3. Applied modal analysis of wind turbine blades

    Energy Technology Data Exchange (ETDEWEB)

    Broen Pedersen, H.; Dahl Kristensen, O.J.

    2003-02-01

    In this project modal analysis has been used to determine the natural frequencies, damping and mode shapes of wind turbine blades. Different methods to measure the positions and adjust the directions of the measuring points are discussed. Different equipment for mounting the accelerometers is investigated and the most suitable is chosen. Different excitation techniques were tried during experimental campaigns; after a discussion, the pendulum hammer was chosen, and a new improved hammer was manufactured. Some measurement errors are investigated. The repeatability of the measured results is investigated by repeated measurements on the same wind turbine blade. Furthermore, the flexibility of the test set-up is investigated by use of accelerometers mounted on the flexible adapter plate during the measurement campaign. One experimental campaign compared the results obtained from a loaded and an unloaded wind turbine blade; during this campaign the modal analysis was performed on a blade mounted in a horizontal and a vertical position, respectively. Finally, the results obtained from the modal analysis carried out on a wind turbine blade are compared with results obtained from Stig Oeye's blade EV1 program. (au)

  4. Tissue Microarray Analysis Applied to Bone Diagenesis.

    Science.gov (United States)

    Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato

    2017-01-04

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered.

  5. Gradient pattern analysis applied to galaxy morphology

    Science.gov (United States)

    Rosa, R. R.; de Carvalho, R. R.; Sautter, R. A.; Barchi, P. H.; Stalder, D. H.; Moura, T. C.; Rembold, S. B.; Morell, D. R. F.; Ferreira, N. C.

    2018-06-01

    Gradient pattern analysis (GPA) is a well-established technique for measuring gradient bilateral asymmetries of a square numerical lattice. This paper introduces an improved version of GPA designed for galaxy morphometry. We show the performance of the new method on a selected sample of 54,896 objects from the SDSS-DR7 in common with the Galaxy Zoo 1 catalogue. The results suggest that the second gradient moment, G2, has the potential to improve dramatically over more conventional morphometric parameters. It separates early- from late-type galaxies better (~90 per cent) than the CAS system (C ~79 per cent, A ~50 per cent, S ~43 per cent), and a benchmark test shows that it is applicable to hundreds of thousands of galaxies using typical processing systems.
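
    A simplified measure in the spirit of GPA (not the authors' exact G2 estimator) compares an image's gradient field with its 180-degree-rotated counterpart; a bilaterally symmetric pattern scores near zero, while clumpy structure scores higher:

        # Toy gradient-asymmetry measure on synthetic "galaxy" images.
        import numpy as np

        def gradient_asymmetry(img):
            gy, gx = np.gradient(img.astype(float))
            # for a symmetric field, gradients cancel against the rotation
            rx, ry = np.rot90(gx, 2), np.rot90(gy, 2)
            num = np.hypot(gx + rx, gy + ry).sum()
            den = 2 * np.hypot(gx, gy).sum() + 1e-12
            return num / den

        y, x = np.mgrid[-32:32, -32:32]
        smooth = np.exp(-(x**2 + y**2) / 200.0)        # symmetric blob
        rng = np.random.default_rng(8)
        clumpy = smooth + 0.3 * rng.random((64, 64))   # added asymmetry
        print(f"symmetric: {gradient_asymmetry(smooth):.3f}, "
              f"clumpy: {gradient_asymmetry(clumpy):.3f}")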

  6. Energy analysis applied to uranium resource estimation

    International Nuclear Information System (INIS)

    Mortimer, N.D.

    1980-01-01

    It is pointed out that fuel prices and ore costs are interdependent, and that in estimating ore costs (involving the cost of fuels used to mine and process the uranium) it is necessary to take into account the total use of energy by the entire fuel system, through the technique of energy analysis. The subject is discussed, and illustrated with diagrams, under the following heads: estimate of how total workable resources would depend on production costs; sensitivity of nuclear electricity prices to ore costs; variation of net energy requirement with ore grade for a typical PWR reactor design; variation of average fundamental cost of nuclear electricity with ore grade; variation of cumulative uranium resources with current maximum ore costs. (U.K.)

  7. Applied linear algebra and matrix analysis

    CERN Document Server

    Shores, Thomas S

    2018-01-01

    In its second edition, this textbook offers a fresh approach to matrix and linear algebra. Its blend of theory, computational exercises, and analytical writing projects is designed to highlight the interplay between these aspects of an application. This approach places special emphasis on linear algebra as an experimental science that provides tools for solving concrete problems. The second edition’s revised text discusses applications of linear algebra like graph theory and network modeling methods used in Google’s PageRank algorithm. Other new materials include modeling examples of diffusive processes, linear programming, image processing, digital signal processing, and Fourier analysis. These topics are woven into the core material of Gaussian elimination and other matrix operations; eigenvalues, eigenvectors, and discrete dynamical systems; and the geometrical aspects of vector spaces. Intended for a one-semester undergraduate course without a strict calculus prerequisite, Applied Linear Algebra and M...

  8. Using wavelet denoising and mathematical morphology in the segmentation technique applied to blood cells images.

    Science.gov (United States)

    Boix, Macarena; Cantó, Begoña

    2013-04-01

    Accurate image segmentation is an important noninvasive pre-processing step in medical diagnosis. In this work we present an efficient segmentation method for medical image analysis, in particular for blood cell images, combining the wavelet transform with morphological operations. Wavelet thresholding is used to eliminate noise and prepare the image for segmentation. In the denoising step we determine the wavelet that yields a segmentation with the largest area within the cell; after studying different wavelet families we conclude that the wavelet db1 performs best and can serve for future work on blood pathologies. The proposed method generates good results when applied to several images. Finally, the proposed algorithm, implemented in the MATLAB environment, is verified on a selection of blood cell images.
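
    A minimal sketch of the two-stage pipeline, using PyWavelets and scikit-image rather than MATLAB, with the db1 wavelet the authors recommend. The synthetic two-disc image stands in for a real blood smear, and the threshold choices (universal threshold, Otsu) are common defaults rather than the paper's exact settings.

```python
import numpy as np
import pywt
from skimage import filters, morphology

def denoise_db1(img, level=2):
    """Soft-threshold the detail coefficients of a db1 wavelet decomposition."""
    coeffs = pywt.wavedec2(img, "db1", level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise estimate
    thr = sigma * np.sqrt(2 * np.log(img.size))          # universal threshold
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, "db1")

def segment_cells(img):
    """Denoise, threshold (Otsu) and clean up with a morphological opening."""
    den = denoise_db1(img)
    mask = den > filters.threshold_otsu(den)
    return morphology.binary_opening(mask, morphology.disk(2))

# Synthetic stand-in for a blood-cell image: two bright discs plus noise.
rng = np.random.default_rng(1)
img = np.zeros((128, 128))
yy, xx = np.mgrid[:128, :128]
for cy, cx in ((40, 40), (90, 80)):
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 15 ** 2] = 1.0
img += 0.3 * rng.standard_normal(img.shape)
print(segment_cells(img).sum(), "pixels classified as cell")
```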

  9. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)
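
    The package itself is not public, but the computation at its core is easy to sketch. The snippet below evaluates a small fault tree of AND/OR gates over independent basic-event probabilities; the tree shown is a hypothetical containment-isolation fragment, not the HIFAR model.

```python
# Minimal fault-tree evaluator: gates are nested tuples of the form
# ("AND", child, child, ...) or ("OR", ...); leaves are basic-event
# probabilities. Basic events are assumed independent.
def prob(node):
    if isinstance(node, (int, float)):
        return float(node)
    gate, *children = node
    p = [prob(c) for c in children]
    if gate == "AND":
        out = 1.0
        for q in p:
            out *= q
        return out
    if gate == "OR":                      # complement product, exact for
        out = 1.0                         # independent events
        for q in p:
            out *= (1.0 - q)
        return 1.0 - out
    raise ValueError(f"unknown gate {gate!r}")

# Hypothetical fragment: the top event occurs if both valves fail,
# or if the common actuation signal fails.
tree = ("OR",
        ("AND", 1e-3, 2e-3),              # valve A and valve B fail
        1e-4)                             # actuation signal fails
print(f"top event probability: {prob(tree):.3e}")
```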

  10. Epithermal neutron activation analysis in applied microbiology

    International Nuclear Information System (INIS)

    Marina Frontasyeva

    2012-01-01

    Some results from applying epithermal neutron activation analysis at FLNP JINR, Dubna, Russia, in medical biotechnology, environmental biotechnology and industrial biotechnology are reviewed. In the biomedical experiments biomass from the blue-green alga Spirulina platensis (S. platensis) has been used as a matrix for the development of pharmaceutical substances containing such essential trace elements as selenium, chromium and iodine. The feasibility of target-oriented introduction of these elements into S. platensis biocomplexes, retaining its protein composition and natural beneficial properties, was shown. The effect of mercury on the growth dynamics of S. platensis and other bacterial strains was observed. Detoxification of Cr and Hg by Arthrobacter globiformis 151B was demonstrated. Microbial synthesis of technologically important silver nanoparticles by the novel actinomycete strain Streptomyces glaucus 71 MD and the blue-green alga S. platensis was characterized by a combined use of transmission electron microscopy, scanning electron microscopy and energy-dispersive analysis of X-rays. It was established that the tested actinomycete S. glaucus 71 MD produces silver nanoparticles extracellularly when exposed to silver nitrate solution, which offers a great advantage over an intracellular process of synthesis from the point of view of applications. The synthesis of silver nanoparticles by S. platensis proceeded differently under short-term and long-term silver action. (author)

  11. Analysis of the interaction between experimental and applied behavior analysis.

    Science.gov (United States)

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.

  12. Case study: how to apply data mining techniques in a healthcare data warehouse.

    Science.gov (United States)

    Silver, M; Sakata, T; Su, H C; Herman, C; Dolins, S B; O'Shea, M J

    2001-01-01

    Healthcare provider organizations are faced with a rising number of financial pressures. Both administrators and physicians need help analyzing large numbers of clinical and financial data when making decisions. To assist them, Rush-Presbyterian-St. Luke's Medical Center and Hitachi America, Ltd. (HAL), Inc., have partnered to build an enterprise data warehouse and perform a series of case study analyses. This article focuses on one analysis, which was performed by a team of physicians and computer science researchers, using a commercially available on-line analytical processing (OLAP) tool in conjunction with proprietary data mining techniques developed by HAL researchers. The initial objective of the analysis was to discover how to use data mining techniques to make business decisions that can influence cost, revenue, and operational efficiency while maintaining a high level of care. Another objective was to understand how to apply these techniques appropriately and to find a repeatable method for analyzing data and finding business insights. The process used to identify opportunities and effect changes is described.

  13. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
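
    A minimal sketch of the book's core workflow using the networkx library: build a weighted passing network (the players and pass counts below are invented) and compute two standard centrality metrics, weighted degree and betweenness.

```python
import networkx as nx

# Hypothetical passing network: edge weights are pass counts between players.
passes = [("GK", "DF1", 12), ("DF1", "MF1", 20), ("DF2", "MF1", 15),
          ("MF1", "MF2", 25), ("MF2", "FW1", 18), ("MF1", "FW1", 9),
          ("GK", "DF2", 10), ("DF2", "MF2", 7)]

g = nx.Graph()
g.add_weighted_edges_from(passes)

# Weighted degree: total passes a player is involved in.
strength = dict(g.degree(weight="weight"))
# Betweenness: how often a player lies on shortest passing paths
# (distance = 1/weight, so frequent links count as "short").
for u, v, d in g.edges(data=True):
    d["dist"] = 1.0 / d["weight"]
betweenness = nx.betweenness_centrality(g, weight="dist")

for player in sorted(g, key=betweenness.get, reverse=True):
    print(f"{player:4s} strength={strength[player]:3d} "
          f"betweenness={betweenness[player]:.2f}")
```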

  14. New technique of in-situ soil-moisture sampling for environmental isotope analysis applied at Pilat sand dune near Bordeaux. HETP modelling of bomb tritium propagation in the unsaturated and saturated zones

    International Nuclear Information System (INIS)

    Thoma, G.; Esser, N.; Sonntag, C.; Weiss, W.; Rudolph, J.; Leveque, P.

    1979-01-01

    A new soil-air suction method with soil-water vapour adsorption by a 4-A molecular sieve provides soil-moisture samples from various depths for environmental isotope analysis and yields soil temperature profiles. A field tritium tracer experiment shows that this in-situ sampling method has an isotope profile resolution of only about 5-10 cm. Application of this method in the Pilat sand dune (Bordeaux/France) yielded deuterium and tritium profiles down to 25 m depth. Bomb tritium measurements of monthly lysimeter percolate samples available since 1961 show that the tritium response has a mean delay of five months in the case of a sand lysimeter and of 2.5 years for a loess loam lysimeter. A simple HETP model simulates the layered downward movement of soil water and the longitudinal dispersion in the lysimeters. Field capacity and evapotranspiration, taken as open parameters, yield tritium concentration values of the lysimeters' percolate which agree well with the experimental results. Based on local meteorological data, the HETP model applied to tritium tracer experiments in the unsaturated zone yields in addition an individual prediction of the momentary tracer position and of the soil-moisture distribution. This prediction can be checked experimentally at selected intervals by coring. (author)
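
    The HETP idea, treating the soil column as a cascade of well-mixed cells so that a tracer pulse is both delayed and dispersed on its way down, can be sketched in a few lines. The cell count and displaced fraction below are arbitrary illustration values, not fitted to the Pilat or lysimeter data.

```python
import numpy as np

def hetp_step(conc, c_in, frac):
    """Advance a cascade of mixing cells by one percolation step.

    conc : tracer concentration in each cell (top first)
    c_in : concentration of the water entering the top cell
    frac : fraction of each cell's water displaced per step
    """
    out = conc.copy()
    carry = c_in
    for i in range(len(out)):
        leaving = out[i]                          # well-mixed cell
        out[i] = (1 - frac) * out[i] + frac * carry
        carry = leaving
    return out, carry                             # carry = percolate conc.

# Tracer pulse applied during the first step, then tracer-free recharge.
cells = np.zeros(20)                              # e.g. 20 soil layers
percolate = []
for step in range(120):
    cells, c_out = hetp_step(cells, 1.0 if step == 0 else 0.0, frac=0.5)
    percolate.append(c_out)

peak = int(np.argmax(percolate))
print(f"peak arrives after {peak} steps; peak concentration {max(percolate):.3f}")
```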

  15. Applied research on air pollution using nuclear-related analytical techniques

    International Nuclear Information System (INIS)

    1994-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which will run from 1992-1996, and will build upon the experience gained by the Agency from the laboratory support that it has been providing for several years to BAPMoN - the Background Air Pollution Monitoring Network programme organized under the auspices of the World Meteorological Organization. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in suspended particulate matter (including air filter samples), rainwater and fog-water samples, and in biological indicators of air pollution (e.g. lichens and mosses). The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for practically-oriented research and monitoring studies on air pollution ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural areas). This document reports the discussions held during the first Research Co-ordination Meeting (RCM) for the CRP which took place at the IAEA Headquarters in Vienna. Refs, figs and tabs

  16. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  17. Colilert® applied to food analysis

    Directory of Open Access Journals (Sweden)

    Maria José Rodrigues

    2014-06-01

    Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used in the routine quality control of drinking, swimming pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-Methylumbelliferyl-β-D-Glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp. If the test fluorescence is equal to or greater than that of the control, the presence of E. coli is confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, comparing the results against the reference method (ISO 16649-2, 2001) for E. coli food analysis. The study was divided into two stages. During the first stage, ten different types of food were analyzed with Colilert®, including pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. Of these, the following were approved: pastry with custard; raw minced pork; "caldo-verde" soup; raw vegetable salad (lettuce and carrots); and solid yogurt. The approved foods fitted better into the tray, the colour of the wells was lighter and the UV reading was easier. In the second stage the foods were artificially contaminated with 2 log/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method and the counts were similar to those obtained with the reference method. In the present study, the Colilert® method revealed neither false-positive nor false-negative results; however, the results were sometimes difficult to read due to the presence of green fluorescence in some wells. Overall, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.

  18. Validation and qualification of surface-applied fibre optic strain sensors using application-independent optical techniques

    International Nuclear Information System (INIS)

    Schukar, Vivien G; Kadoke, Daniel; Kusche, Nadine; Münzenberger, Sven; Gründer, Klaus-Peter; Habel, Wolfgang R

    2012-01-01

    Surface-applied fibre optic strain sensors were investigated using a unique validation facility equipped with application-independent optical reference systems. First, different adhesives for the sensor's application were analysed regarding their material properties. Measurements resulting from conventional measurement techniques, such as thermo-mechanical analysis and dynamic mechanical analysis, were compared with measurements resulting from digital image correlation, which has the advantage of being a non-contact technique. Second, fibre optic strain sensors were applied to test specimens with the selected adhesives. Their strain-transfer mechanism was analysed in comparison with conventional strain gauges. Relative movements between the applied sensor and the test specimen were visualized easily using optical reference methods, digital image correlation and electronic speckle pattern interferometry. Conventional strain gauges showed limited opportunities for an objective strain-transfer analysis because they are also affected by application conditions. (paper)

  19. MULTIVARIATE TECHNIQUES APPLIED TO EVALUATION OF LIGNOCELLULOSIC RESIDUES FOR BIOENERGY PRODUCTION

    Directory of Open Access Journals (Sweden)

    Thiago de Paula Protásio

    2013-12-01

    http://dx.doi.org/10.5902/1980509812361 The evaluation of lignocellulosic wastes for bioenergy production requires considering several characteristics and properties that may be correlated, which demands the use of multivariate analysis techniques that allow the evaluation of the relevant energetic factors. This work aimed to apply cluster analysis and principal components analysis to the selection and evaluation of lignocellulosic wastes for bioenergy production. Eight types of residual biomass were used, for which the elemental composition (C, H, O, N, S), lignin, total extractives and ash contents, basic density, and higher and lower heating values were determined. Both multivariate techniques applied to the evaluation and selection of lignocellulosic wastes were efficient, and similarities were observed between the biomass groups they formed. Through the interpretation of the first principal component it was possible to create a global index for evaluating the viability of energetic uses of the biomass. The interpretation of the second principal component allowed a contrast between nitrogen and sulfur contents and oxygen content.
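
    A sketch of the paper's workflow with scikit-learn and SciPy: standardize the property matrix, project it onto the first two principal components, and group the residues by Ward hierarchical clustering. The residue names and property values below are invented placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical residues x properties matrix:
# columns = C, H, O, N, S, lignin, extractives, ash (%), HHV (MJ/kg)
names = ["coffee husk", "rice husk", "sawdust", "bagasse",
         "maize cob", "bamboo", "eucalyptus bark", "peanut shell"]
X = np.array([
    [46.9, 5.3, 40.1, 2.1, 0.1, 23.1, 10.2,  5.9, 18.4],
    [38.4, 4.8, 36.0, 0.6, 0.1, 21.4,  4.1, 17.8, 15.6],
    [48.9, 6.0, 43.2, 0.2, 0.0, 27.5,  5.3,  0.6, 19.1],
    [45.2, 5.6, 42.0, 0.4, 0.1, 21.0,  6.0,  2.8, 18.0],
    [44.7, 5.7, 43.5, 0.6, 0.1, 16.0,  7.5,  1.9, 17.6],
    [47.1, 5.9, 42.7, 0.4, 0.0, 24.8,  6.9,  1.2, 18.7],
    [43.9, 5.2, 41.1, 0.5, 0.1, 22.3,  8.8,  4.1, 17.2],
    [49.8, 5.9, 39.5, 1.1, 0.1, 30.2,  7.4,  2.4, 19.6],
])

Z = StandardScaler().fit_transform(X)             # autoscale the properties
scores = PCA(n_components=2).fit_transform(Z)     # first two components
groups = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

for name, (pc1, pc2), grp in zip(names, scores, groups):
    print(f"{name:16s} PC1={pc1:6.2f} PC2={pc2:6.2f} cluster={grp}")
```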

  20. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. They introduce molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques in the environmental sciences are presented and reviewed.

  1. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also referred to. (Author)

  2. Functional analysis in modern applied mathematics

    CERN Document Server

    Curtain, Ruth F

    1977-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  3. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS system, and its applications.

  4. Chemical analysis by nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS system, and its applications.

  5. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  6. Applying BI Techniques To Improve Decision Making And Provide Knowledge Based Management

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2015-07-01

    The paper focuses on BI techniques, especially data mining algorithms, that can support and improve the decision-making process, with applications within the financial sector. Considering data mining techniques to be the more efficient approach, we applied several supervised and unsupervised learning algorithms. The case study in which these algorithms have been implemented concerns the activity of a banking institution, with a focus on the management of lending activities.
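
    A minimal supervised-learning sketch in the spirit of the case study: a shallow decision tree classifying synthetic loan records. The features, the generating rule and the data are all invented for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic loan records: [income (k), debt ratio, years as client].
rng = np.random.default_rng(42)
n = 500
X = np.column_stack([rng.normal(50, 15, n),
                     rng.uniform(0, 1, n),
                     rng.integers(0, 20, n)])
# Toy rule plus noise: high debt ratio and low income raise default risk.
p_default = 1 / (1 + np.exp(-(3 * X[:, 1] - 0.04 * X[:, 0] - 0.5)))
y = rng.random(n) < p_default

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```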

  7. Introduction: Conversation Analysis in Applied Linguistics

    Science.gov (United States)

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  8. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  9. Tissue Microarray Analysis Applied to Bone Diagenesis

    OpenAIRE

    Barrios Mello, Rafael; Regis Silva, Maria Regina; Seixas Alves, Maria Teresa; Evison, Martin; Guimarães, Marco Aurélio; Francisco, Rafaella Arrabaça; Dias Astolphi, Rafael; Miazato Iwamura, Edna Sadayo

    2017-01-01

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens....

  10. Applied modal analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Pedersen, H.B.; Kristensen, O.J.D.

    2003-01-01

    In this project modal analysis has been used to determine the natural frequencies, damping and mode shapes of wind turbine blades. Different methods to measure the position and adjust the direction of the measuring points are discussed. Different equipment for mounting the accelerometers ... is investigated by repeated measurement on the same wind turbine blade. Furthermore, the flexibility of the test set-up is investigated by use of accelerometers mounted on the flexible adapter plate during the measurement campaign. One experimental campaign investigated the results obtained from a loaded ... and unloaded wind turbine blade. During this campaign the modal analyses were performed on a blade mounted in a horizontal and a vertical position, respectively. Finally, the results obtained from modal analysis carried out on a wind turbine blade are compared with results obtained from the Stig Øyes blade_EV1...

  11. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis-of-variance models. Additionally, though a vast set of resources may exist on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  12. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  13. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
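
    The quantification step, probability of detection versus probability of false alarm, reduces to sweeping a threshold over detection statistics collected with and without a fault present. The sketch below does this for synthetic Gaussian statistics; the distributions are assumed, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
healthy = rng.normal(0.0, 1.0, 2000)     # detection statistic, no fault
faulty = rng.normal(1.5, 1.0, 2000)      # detection statistic, fault present

# Sweep a threshold: P_FA = P(healthy > t), P_D = P(faulty > t).
thresholds = np.linspace(-3, 5, 81)
p_fa = [(healthy > t).mean() for t in thresholds]
p_d = [(faulty > t).mean() for t in thresholds]

for t, fa, d in zip(thresholds[::20], p_fa[::20], p_d[::20]):
    print(f"threshold {t:5.2f}:  P_FA={fa:.3f}  P_D={d:.3f}")
```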

  14. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    Science.gov (United States)

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
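
    For a mixture, the Beer-Lambert-Bouguer law at several wavelengths becomes a small linear system in the unknown concentrations. The sketch below solves a two-component example; the molar absorptivities and absorbances are illustrative values, not measured data.

```python
import numpy as np

# Beer-Lambert law for a two-component mixture measured at two wavelengths:
#   A(lambda) = l * (eps1(lambda) * c1 + eps2(lambda) * c2)
# With path length l = 1 cm this is a 2x2 linear system in c1, c2.
# Molar absorptivities below are illustrative, not measured data.
eps = np.array([[6_000.0, 1_200.0],      # rows: wavelengths, cols: species
                [  900.0, 4_500.0]])     # units: L mol^-1 cm^-1
A = np.array([0.75, 0.52])               # measured absorbances

c = np.linalg.solve(eps, A)              # concentrations in mol/L
print(f"c1 = {c[0]*1e6:.1f} uM, c2 = {c[1]*1e6:.1f} uM")
```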

  15. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  16. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.…
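
    The two classical indices are straightforward to compute from a 0/1 response matrix. The sketch below generates synthetic responses from a latent-ability model, then computes item facility (proportion correct) and a point-biserial discrimination index, retaining items much as a tailored cloze would; the selection cut-offs are common rules of thumb, not Brown's exact criteria.

```python
import numpy as np

rng = np.random.default_rng(3)
ability = rng.normal(0, 1, 200)                    # latent examinee ability
difficulty = rng.normal(0, 1, 30)                  # latent item difficulty
p = 1 / (1 + np.exp(-(ability[:, None] - difficulty[None, :])))
responses = (rng.random((200, 30)) < p).astype(int)
total = responses.sum(axis=1)

facility = responses.mean(axis=0)                  # item facility (p-value)
# Point-biserial discrimination: correlation of the item score with the
# rest-of-test score (item removed to avoid part-whole inflation).
discrimination = np.array([
    np.corrcoef(responses[:, i], total - responses[:, i])[0, 1]
    for i in range(responses.shape[1])
])

keep = (facility > 0.3) & (facility < 0.7) & (discrimination > 0.2)
print(f"{keep.sum()} of {len(keep)} items retained for the tailored cloze")
```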

  17. Current status of neutron activation analysis and applied nuclear chemistry

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1990-01-01

    A review of recent scientometric studies of citations and publication data shows the present state of NAA and applied nuclear chemistry as compared to other analytical techniques. (author) 9 refs.; 7 tabs

  18. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    International Nuclear Information System (INIS)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-01-01

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  19. Reliability analysis applied to structural tests

    Science.gov (United States)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  20. Dielectric spectroscopy technique applied to study the behaviour of irradiated polymer

    International Nuclear Information System (INIS)

    Saoud, R.; Soualmia, A.; Guerbi, C.A.; Benrekaa, N.

    2006-01-01

    Relaxation spectroscopy provides an excellent method for the study of motional processes in materials and has been widely applied to macromolecules and polymers. The technique is potentially of most interest when applied to irradiated systems. Application to the study of the structure of beam-irradiated Teflon is thus an outstanding opportunity for the dielectric relaxation technique, particularly as this material exhibits clamping problems when subjected to dynamic mechanical relaxation studies. A very wide frequency range is necessary to resolve dipolar effects. In this paper, we discuss some significant results on the behaviour and the structural modification of Teflon exposed to low-energy radiation.

  1. Nuclear techniques (PIXE and RBS) applied to analysis of pre hispanic metals of the Templo Mayor at Tenochtitlan; Tecnicas nucleares (PIXE y RBS) aplicadas al analisis de metales prehispanicos del Templo Mayor de Tenochtitlan

    Energy Technology Data Exchange (ETDEWEB)

    Mendez U, I.; Tenorio, D.; Galvan, J.L. [Instituto Nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2000-07-01

    The objective of this work is to determine, by means of nuclear techniques (PIXE and RBS), the composition and alloy type of diverse Aztec ornaments from the Postclassic period, manufactured principally of copper and gold, such as bells, beads and disks, all belonging to 9 oblations of the Templo Mayor of Tenochtitlan. The historical and archaeological background of the artifacts is presented briefly, as well as the analytical methods, concluding with the results obtained. (Author)

  2. Thermal transient analysis applied to horizontal wells

    Energy Technology Data Exchange (ETDEWEB)

    Duong, A.N. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[ConocoPhillips Canada Resources Corp., Calgary, AB (Canada)

    2008-10-15

    Steam assisted gravity drainage (SAGD) is a thermal recovery process used to recover bitumen and heavy oil. This paper presented a newly developed model to estimate cooling time and formation thermal diffusivity by using a thermal transient analysis along the horizontal wellbore under a steam heating process. This radial conduction heating model provides information on the heat influx distribution along a horizontal wellbore or elongated steam chamber, and is therefore important for determining the effectiveness of the heating process in the start-up phase in SAGD. Net heat flux estimation in the target formation during start-up can be difficult to measure because of uncertainties regarding heat loss in the vertical section; steam quality along the horizontal segment; distribution of steam along the wellbore; operational conditions; and additional effects of convection heating. The newly presented model can be considered analogous to pressure transient analysis of a buildup after a constant pressure drawdown. The model is based on an assumption of an infinite-acting system. This paper also proposed a new concept of a heating ring to measure the heat storage in the heated bitumen at the time of testing. Field observations were used to demonstrate how the model can be used to save heat energy, conserve steam and enhance bitumen recovery. 18 refs., 14 figs., 2 appendices.
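
    Although the paper's specific model is not reproduced here, the flavour of the analogy with pressure-transient analysis can be shown with the classical line-source conduction solution: at late time the temperature rise is linear in ln(t), so a semilog slope recovers the thermal conductivity. All parameter values below are assumed for illustration.

```python
import numpy as np
from scipy.special import exp1

# Line-source solution for radial conduction from a constant-rate heater:
#   dT(r, t) = (q / (4*pi*k)) * E1(r^2 / (4*alpha*t))
# At late time dT is linear in ln(t), so a semilog slope gives q/(4*pi*k).
q = 200.0          # heat rate per unit length, W/m   (assumed)
k = 1.5            # thermal conductivity, W/m/K      (assumed)
alpha = 7e-7       # thermal diffusivity, m^2/s       (assumed)
r = 0.1            # observation radius, m

t = np.logspace(5, 8, 40)                       # seconds
dT = q / (4 * np.pi * k) * exp1(r**2 / (4 * alpha * t))

late = t > 1e6                                  # late-time window
slope = np.polyfit(np.log(t[late]), dT[late], 1)[0]
print(f"recovered k = {q / (4 * np.pi * slope):.2f} W/m/K (input {k})")
```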

  3. Database 'catalogue of techniques applied to materials and products of nuclear engineering'

    International Nuclear Information System (INIS)

    Lebedeva, E.E.; Golovanov, V.N.; Podkopayeva, I.A.; Temnoyeva, T.A.

    2002-01-01

    The database 'Catalogue of techniques applied to materials and products of nuclear engineering' (IS MERI) was developed to provide informational support for SSC RF RIAR and other enterprises in scientific investigations. This database contains information on the techniques used at RF Minatom enterprises for reactor material properties investigation. The main purpose of this system consists in the assessment of the current status of the reactor material science experimental base for the further planning of experimental activities and methodical support improvement. (author)

  4. Photometric analysis applied in determining facial type

    Directory of Open Access Journals (Sweden)

    Luciana Flaquer Martins

    2012-10-01

    INTRODUCTION: In orthodontics, determining the facial type is a key element in the prescription of a correct diagnosis. In the early days of our specialty, observation and measurement of craniofacial structures were done directly on the face, in photographs or on plaster casts. With the development of radiographic methods, cephalometric analysis replaced direct facial analysis. Seeking to validate the analysis of facial soft tissues, this work compares two different methods used to determine the facial type, the anthropometric and the cephalometric methods. METHODS: The sample consisted of sixty-four Brazilian individuals, adults, Caucasian, of both genders, who agreed to participate in this research. All individuals had lateral cephalograms and frontal facial photographs taken. The facial types were determined by the Vert Index (cephalometric) and the Facial Index (photographic). RESULTS: The agreement analysis (Kappa), made for both types of analysis, found an agreement of 76.5%. CONCLUSIONS: We concluded that the Facial Index can be used as an adjunct to orthodontic diagnosis, or as an alternative method for the pre-selection of a sample, so that research subjects do not have to undergo unnecessary tests.

  5. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

    This report presents the results of research and development on: improvement of the neutron irradiation facilities and counting system, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health, and its standardization. For the identification and standardization of analytical methods, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated. Data intercomparison and proficiency tests were also performed. Using airborne particulate matter chosen as an environmental indicator, trace element concentrations of samples collected at urban and rural sites were determined, and statistical calculations and factor analysis were then carried out to investigate emission sources. An international cooperation research project was carried out on the utilization of nuclear techniques.

  6. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

    This report presents the results of research and development on: improvement of the neutron irradiation facilities and counting system, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health, and its standardization. For the identification and standardization of analytical methods, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated. Data intercomparison and proficiency tests were also performed. Using airborne particulate matter chosen as an environmental indicator, trace element concentrations of samples collected at urban and rural sites were determined, and statistical calculations and factor analysis were then carried out to investigate emission sources. An international cooperation research project was carried out on the utilization of nuclear techniques

  7. PIXE analysis applied to characterized water samples

    International Nuclear Information System (INIS)

    Santos, Maristela S.; Carneiro, Luana Gomes; Medeiros, Geiza; Sampaio, Camilla; Martorell, Ana Beatriz Targino; Gouvea, Stella; Cunha, Kenya Moore Dias da

    2011-01-01

    Araxa, in Brazil, is a naturally high background area located in the State of Minas Gerais with a population of about 93,672 people. Araxa is a historical city famous for its mineral water sources and the mud from the Termas de Araxa spa, which have been used for therapeutic and recreational purposes. Other important aspects of the city's economy are the mining and metallurgical industries. In the Araxa area is located the largest deposit of pyrochlore, a niobium mineral, and also a deposit of apatite, a phosphate mineral, both containing Th and U associated with the crystal lattice. The minerals are obtained from open pit mines and processed in industrial plants also located in Araxa, which process the pyrochlore and apatite to obtain Fe-Nb alloy and phosphate concentrate, respectively. Studies were developed in this area to assess the occupational risk to workers from exposure to dust particles during routine work; however, very few studies evaluated the water contamination outside the mines in order to determine the concentrations of metals (stable elements) in water and also the concentrations of radionuclides in water. This paper presents the preliminary results of a study to identify and determine the concentrations of metals (stable elements) and radionuclides in rivers around the city. The water from these rivers is used as drinking water and irrigation water. The water samples were collected in different rivers around the city of Araxa and analyzed using the PIXE technique. A proton beam of 2 MeV obtained from the Van de Graaff electrostatic accelerator was used to induce the characteristic X-rays. S, K, Ca, Cr, Mn, Fe, Ni, Zn, Ba, Pb and U were identified in the mass spectrum of the samples. The elemental mass concentrations were compared using a non-parametric statistical test. The results of the statistical test showed that the elemental mass concentrations did not present the same distribution. These results indicated
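
    The kind of non-parametric comparison mentioned can be illustrated with a Kruskal-Wallis test in SciPy; the three sets of concentrations below are invented, standing in for elemental mass concentrations measured at different rivers.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(5)
# Invented Fe mass concentrations (ug/L) from three sampling sites.
site_a = rng.lognormal(3.0, 0.4, 12)
site_b = rng.lognormal(3.4, 0.4, 12)
site_c = rng.lognormal(3.1, 0.4, 12)

h, p = kruskal(site_a, site_b, site_c)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")
if p < 0.05:
    print("concentration distributions differ between sites")
```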

  8. Toward applied behavior analysis of life aloft

    Science.gov (United States)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  9. Probabilistic safety analysis applied to RBMK reactors

    International Nuclear Information System (INIS)

    Gerez Martin, L.; Fernandez Ramos, P.

    1995-01-01

    The project financed by the European Union, ''Revision of RBMK Reactor Safety'', was divided into nine Topic Groups dealing with different aspects of safety. The area covered by Topic Group 9 (TG9) was Probabilistic Safety Analysis. TG9 touched on some of the problems discussed by other groups, although in terms of the systematic quantification of the impact of design characteristics and RBMK reactor operating practices on the risk of core damage. On account of the reduced time scale and the resources available for the project, the analysis was made using a simplified method based on the results of PSAs conducted in Western countries and on the judgement of the group members. The simplified method is based on the concepts of Qualification, Redundancy and Automatic Actuation of the systems considered. PSA experience shows that systems complying with the above-mentioned concepts have a failure probability of 1.0E-3 when redundancy is simple, i.e. two similar equipment items capable of carrying out the same function. In general terms, this value can be considered to be dominated by potential common cause failures. The value considered above changes according to factors that have a positive effect upon it, such as an additional redundancy with a different equipment item (e.g. a turbo pump and a motor pump), individual trains with good separation, etc., or a negative effect, such as the absence of suitable periodical tests, the need for operators to perform manual operations, etc. Similarly, possible actions required by the operator during accident sequences are assigned failure probability values between 1 and 1.0E-4, according to the complexity of the action (including local actions to be performed outside the control room) and the time available
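
    The group's rule of thumb, a simple redundant pair scoring about 1.0E-3 dominated by common cause failures, can be reproduced with the classical beta-factor model, as in the sketch below; the single-train probability and the beta value are illustrative assumptions.

```python
def redundant_pair_failure(p_single, beta=0.1):
    """Failure probability of a 1-out-of-2 redundant pair with a
    beta-factor common-cause contribution (classical beta-factor model)."""
    independent = ((1 - beta) * p_single) ** 2   # both trains fail independently
    common_cause = beta * p_single               # shared-cause failure
    return independent + common_cause

p = 1e-2                                         # assumed single-train demand failure
print(f"naive p^2       : {p**2:.1e}")
print(f"with beta = 0.1 : {redundant_pair_failure(p, 0.1):.1e}")
```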

  10. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    The proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental applications carried out at JAERI Takasaki. (author)

  11. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values be examined. (author)
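
    One standard sampling-based technique of the kind reviewed is a rank-correlation screen: sample the input parameters, run the model, and rank-correlate each input with the output. The sketch below applies Spearman correlation to a toy dose model that merely stands in for SYVAC; the parameter names, distributions and model form are all invented.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(11)
n = 1000
leach_rate = rng.lognormal(-2.0, 0.5, n)          # toy input parameters
travel_time = rng.lognormal(4.0, 1.0, n)
retardation = rng.uniform(1.0, 100.0, n)

# Toy dose model standing in for SYVAC: faster leaching and shorter,
# less-retarded transport give higher dose.
dose = leach_rate * np.exp(-travel_time * retardation / 5e3)

for name, x in [("leach_rate", leach_rate),
                ("travel_time", travel_time),
                ("retardation", retardation)]:
    rho, _ = spearmanr(x, dose)
    print(f"{name:12s} Spearman rho = {rho:+.2f}")
```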

  12. Strategies and techniques of communication and public relations applied to non-profit sector

    Directory of Open Access Journals (Sweden)

    Ioana – Julieta Josan

    2010-05-01

    The aim of this paper is to summarize the strategies and techniques of communication and public relations applied to the non-profit sector. The approach of the paper is to identify the most appropriate strategies and techniques that the non-profit sector can use to accomplish its objectives, to highlight specific differences between the strategies and techniques of the profit and non-profit sectors, and to identify potential communication and public relations actions that increase visibility among the target audience, create brand awareness and turn target perceptions of the non-profit sector into positive brand sentiment.

  13. Digital photoelastic analysis applied to implant dentistry

    Science.gov (United States)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of the connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model making procedure, to closely mimic all the anatomical features of the human mandible, is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest could be analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, the five-step method is used and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could provide important information for improving the structural stability of implant systems. Analysis is carried out for the implant in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.

  14. Element selective detection of molecular species applying chromatographic techniques and diode laser atomic absorption spectrometry.

    Science.gov (United States)

    Kunze, K; Zybin, A; Koch, J; Franzke, J; Miclea, M; Niemax, K

    2004-12-01

    Tunable diode laser atomic absorption spectroscopy (DLAAS) combined with separation techniques and atomization in plasmas and flames is presented as a powerful method for analysis of molecular species. The analytical figures of merit of the technique are demonstrated by the measurement of Cr(VI) and Mn compounds, as well as molecular species including halogen atoms, hydrogen, carbon and sulfur.

  15. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  16. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall view. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrices, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizers, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  17. English Language Teachers' Perceptions on Knowing and Applying Contemporary Language Teaching Techniques

    Science.gov (United States)

    Sucuoglu, Esen

    2017-01-01

    The aim of this study is to determine the perceptions of English language teachers teaching at a preparatory school in relation to their knowing and applying contemporary language teaching techniques in their lessons. An investigation was conducted of 21 English language teachers at a preparatory school in North Cyprus. The SPSS statistical…

  18. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis
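
    The kinematic marker update the abstract describes reduces to moving each massless marker with the local fluid velocity interpolated from the Eulerian mesh. The sketch below advects markers through a toy solid-body-rotation field with bilinear interpolation; the mesh, velocity field and time step are invented for illustration and are not the reactor-subchannel model itself.

```python
import numpy as np

def advect_markers(markers, u, v, dx, dt):
    """Move massless markers with bilinearly interpolated mesh velocity."""
    out = []
    for x, y in markers:
        i, j = int(x / dx), int(y / dx)           # cell indices
        fx, fy = x / dx - i, y / dx - j           # fractions inside the cell
        ux = ((1 - fx) * (1 - fy) * u[j, i] + fx * (1 - fy) * u[j, i + 1]
              + (1 - fx) * fy * u[j + 1, i] + fx * fy * u[j + 1, i + 1])
        vy = ((1 - fx) * (1 - fy) * v[j, i] + fx * (1 - fy) * v[j, i + 1]
              + (1 - fx) * fy * v[j + 1, i] + fx * fy * v[j + 1, i + 1])
        out.append((x + ux * dt, y + vy * dt))    # x_new = x + u(x) * dt
    return out

# Toy solid-body-rotation velocity field on a 17x17 node mesh.
n, dx, dt = 17, 1.0 / 16, 0.01
ys, xs = np.mgrid[0:n, 0:n] * dx
u, v = -(ys - 0.5), (xs - 0.5)
markers = [(0.5 + 0.2 * np.cos(a), 0.5 + 0.2 * np.sin(a))
           for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)]
for _ in range(100):
    markers = advect_markers(markers, u, v, dx, dt)
print("first marker after 100 steps:", tuple(round(c, 3) for c in markers[0]))
```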

  19. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this predisposed knowledge, the reader will coast through the practical introduction and move on to signal analysis techniques, commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical

  20. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    Science.gov (United States)

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to their in-house techniques, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Hybrid chemical and nondestructive-analysis technique

    International Nuclear Information System (INIS)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities.

  2. Learning mediastinoscopy: the need for education, experience and modern techniques--interdependency of the applied technique and surgeon's training level.

    Science.gov (United States)

    Walles, Thorsten; Friedel, Godehard; Stegherr, Tobias; Steger, Volker

    2013-04-01

    Mediastinoscopy represents the gold standard for invasive mediastinal staging. While learning and teaching the surgical technique are challenging due to the limited accessibility of the operation field, both benefited from the implementation of video-assisted techniques. However, it has not yet been established whether video-assisted mediastinoscopy improves mediastinal staging in itself. Retrospective single-centre cohort analysis of 657 mediastinoscopies performed at a specialized tertiary care thoracic surgery unit from 1994 to 2006. The number of specimens obtained per procedure and per lymph node station (2, 4, 7, 8 for mediastinoscopy and 2-9 for open lymphadenectomy), the number of lymph node stations examined, and the sensitivity and negative predictive value were calculated, with a focus on the technique employed (video-assisted vs standard technique) and the surgeon's experience. Overall sensitivity was 60%, accuracy was 90% and negative predictive value 88%. With the conventional technique, experience alone improved sensitivity from 49 to 57%, most markedly at the right paratracheal region (from 62 to 82%). With the video-assisted technique, experienced surgeons raised sensitivity from 57 to 79%, in contrast to inexperienced surgeons, who lowered sensitivity from 49 to 33%. We found significant differences concerning (i) the total number of specimens taken, (ii) the number of lymph node stations examined, (iii) the number of specimens taken per lymph node station and (iv) true positive mediastinoscopies. The video-assisted technique can significantly improve the results of mediastinoscopy. A thorough education on the modern video-assisted technique is mandatory for thoracic surgeons until they can fully exhaust its potential.
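
    For reference, the reported figures are the standard ratios from a 2x2 diagnostic table; a minimal sketch (generic definitions with hypothetical counts, not the authors' code):

        def staging_metrics(tp, fp, tn, fn):
            """Diagnostic-performance ratios from true/false positive/negative counts."""
            sensitivity = tp / (tp + fn)                   # diseased stations detected
            npv = tn / (tn + fn)                           # trust in a negative result
            accuracy = (tp + tn) / (tp + fp + tn + fn)
            return sensitivity, npv, accuracy

        # hypothetical example: 60 true positives, 40 missed, 300 true negatives
        print(staging_metrics(tp=60, fp=0, tn=300, fn=40))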

  3. Improvement technique of sensitized HAZ by GTAW cladding applied to a BWR power plant

    International Nuclear Information System (INIS)

    Tujimura, Hiroshi; Tamai, Yasumasa; Furukawa, Hideyasu; Kurosawa, Kouichi; Chiba, Isao; Nomura, Keiichi.

    1995-01-01

    An SCC (Stress Corrosion Cracking)-resistant technique, in which a sleeve installed by expansion is melted by the GTAW process without filler metal, with outside water cooling, was developed. The technique was applied to the ICM (In-Core Monitor) housings of a BWR power plant in 1993. The ICM housings, whose material is type 304 stainless steel, are sensitized and carry high tensile residual stresses from welding to the RPV (Reactor Pressure Vessel). As a result, the ICM housings have a potential for SCC initiation, and an improvement technique resistant to SCC was needed. The technique can improve the chemical composition of the housing inside and the residual stresses of the housing outside at the same time. Sensitization of the housing inner surface area is eliminated by replacing it with a low-carbon clad of proper ferrite microstructure. The high tensile residual stresses of the housing outside surface area are shifted to the compressive side; compressive stresses on the outside surface are induced by the thermal stresses caused by inside cladding with outside water cooling. The clad is required to be a low-carbon metal with proper ferrite content and must not leave a newly sensitized HAZ (Heat Affected Zone) on the surface after cladding. The effect of the technique was qualified by SCC tests, chemical composition checks, ferrite content measurements, residual stress measurements, etc. All equipment for remote application was developed and qualified, too. The technique was successfully applied to a BWR plant after sufficient training.

  4. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guide book for applied research on environmental monitoring using Instrumental Neutron Activation Analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter, analytical methodologies, data evaluation and interpretation, and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  5. Statistical Techniques Used in Three Applied Linguistics Journals: "Language Learning,""Applied Linguistics" and "TESOL Quarterly," 1980-1986: Implications for Readers and Researchers.

    Science.gov (United States)

    Teleni, Vicki; Baldauf, Richard B., Jr.

    A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…

  6. The impact of applying product-modelling techniques in configurator projects

    DEFF Research Database (Denmark)

    Hvam, Lars; Kristjansdottir, Katrin; Shafiee, Sara

    2018-01-01

    This paper aims to increase understanding of the impact of using product-modelling techniques to structure and formalise knowledge in configurator projects. Companies that provide customised products increasingly apply configurators in support of sales and design activities, reaping benefits that include shorter lead times, improved quality of specifications and products, and lower overall product costs. The design and implementation of configurators are a challenging task that calls for scientifically based modelling techniques to support the formal representation of configurator knowledge. Even... the phenomenon model and information model are considered visually, (2) non-UML-based modelling techniques, in which only the phenomenon model is considered, and (3) non-formal modelling techniques. This study analyses the impact on companies of increased availability of product knowledge and improved control...

  7. Applied potential tomography. A new noninvasive technique for measuring gastric emptying

    International Nuclear Information System (INIS)

    Avill, R.; Mangnall, Y.F.; Bird, N.C.; Brown, B.H.; Barber, D.C.; Seagar, A.D.; Johnson, A.G.; Read, N.W.

    1987-01-01

    Applied potential tomography is a new, noninvasive technique that yields sequential images of the resistivity of gastric contents after subjects have ingested a liquid or semisolid meal. This study validates the technique as a means of measuring gastric emptying. Experiments in vitro showed an excellent correlation between measurements of resistivity and either the square of the radius of a glass rod or the volume of water in a spherical balloon when both were placed in an oval tank containing saline. Altering the lateral position of the rod in the tank did not alter the values obtained. Images of abdominal resistivity were also directly correlated with the volume of air in a gastric balloon. Profiles of gastric emptying of liquid meals obtained using applied potential tomography were very similar to those obtained using scintigraphy or dye dilution techniques, provided that acid secretion was inhibited by cimetidine. Profiles of emptying of a mashed potato meal using applied potential tomography were also very similar to those obtained by scintigraphy. Measurements of the emptying of a liquid meal from the stomach were reproducible if acid secretion was inhibited by cimetidine. Thus, applied potential tomography is an accurate and reproducible method of measuring gastric emptying of liquids and particulate food. It is inexpensive, well tolerated, easy to use, and ideally suited for multiple studies in patients, even those who are pregnant.

  8. Applied potential tomography. A new noninvasive technique for measuring gastric emptying

    Energy Technology Data Exchange (ETDEWEB)

    Avill, R.; Mangnall, Y.F.; Bird, N.C.; Brown, B.H.; Barber, D.C.; Seagar, A.D.; Johnson, A.G.; Read, N.W.

    1987-04-01

    Applied potential tomography is a new, noninvasive technique that yields sequential images of the resistivity of gastric contents after subjects have ingested a liquid or semisolid meal. This study validates the technique as a means of measuring gastric emptying. Experiments in vitro showed an excellent correlation between measurements of resistivity and either the square of the radius of a glass rod or the volume of water in a spherical balloon when both were placed in an oval tank containing saline. Altering the lateral position of the rod in the tank did not alter the values obtained. Images of abdominal resistivity were also directly correlated with the volume of air in a gastric balloon. Profiles of gastric emptying of liquid meals obtained using applied potential tomography were very similar to those obtained using scintigraphy or dye dilution techniques, provided that acid secretion was inhibited by cimetidine. Profiles of emptying of a mashed potato meal using applied potential tomography were also very similar to those obtained by scintigraphy. Measurements of the emptying of a liquid meal from the stomach were reproducible if acid secretion was inhibited by cimetidine. Thus, applied potential tomography is an accurate and reproducible method of measuring gastric emptying of liquids and particulate food. It is inexpensive, well tolerated, easy to use, and ideally suited for multiple studies in patients, even those who are pregnant.

  9. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the gamma absorption technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program, and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement.
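
    The underlying relation is the Beer-Lambert law, I = I0 exp(-(mu/rho) rho t). A minimal sketch of the calibration idea (the numbers and the linear fit below are assumptions for illustration, not values from the paper):

        import numpy as np

        # Hypothetical calibration data: measured mass attenuation coefficients
        # (cm^2/g, near the Au K-edge) for alloys of known gold mass fraction.
        mu_rho = np.array([3.1, 3.9, 4.5, 5.1, 5.4])
        au_fraction = np.array([0.375, 0.585, 0.750, 0.917, 1.000])

        coeffs = np.polyfit(mu_rho, au_fraction, 1)    # linear calibration curve

        def gold_fraction(i0, i, rho_t):
            """Invert I = I0*exp(-(mu/rho)*rho*t), then apply the calibration.

            rho_t is the areal density of the sample (g/cm^2).
            """
            mu_rho_measured = np.log(i0 / i) / rho_t
            return np.polyval(coeffs, mu_rho_measured)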

  10. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates edge detection, Markov random fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. Edge detection is first applied to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained by K-means clustering and the minimum-distance rule. The region process is then modelled by an MRF to obtain an image containing regions of different intensity. Gradient values are calculated and the watershed technique is applied. The DIS value calculated for each pixel defines all the edges (weak or strong) in the image, yielding the DIS map; this serves as prior knowledge of possible region boundaries for the next step (MRF), which produces an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighbouring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated; the final edge map is obtained by a merging process based on averaged intensity mean values. Common edge detectors are run on the MRF-segmented image and the results are compared. The final segmentation and edge detection result is one closed boundary per actual region in the image.
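
    A hedged sketch of the gradient-plus-watershed portion of such a pipeline, using scikit-image (the MRF and merging stages of the paper are not reproduced here; the percentile-based marker seeding is an assumption):

        import numpy as np
        from skimage import filters, segmentation

        def segment(image):
            """Marker-driven watershed on an edge-strength map."""
            gradient = filters.sobel(image)                 # DIS-like edge-strength map
            markers = np.zeros(image.shape, dtype=int)
            markers[image < np.percentile(image, 10)] = 1   # confident background seeds
            markers[image > np.percentile(image, 90)] = 2   # confident foreground seeds
            labels = segmentation.watershed(gradient, markers)
            return labels, gradient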

  11. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    Science.gov (United States)

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  12. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  13. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

    Directory of Open Access Journals (Sweden)

    L Potgieter

    2012-12-01

    Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of, and interaction between, normal and sterile E. saccharina moths in a temporally variable but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified, and according to which E. saccharina infestation levels and the associated sugarcane damage may be measured. Although many models describing the sterile insect technique have been formulated in the past, few describe the technique for lepidopteran species with more than one life stage and where F1 sterility is relevant. In addition, none of these models considers the technique when fully sterile females and partially sterile males are being released. The model formulated here is also the first to describe the technique applied specifically to E. saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
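
    The flavour of such difference-equation models can be conveyed with a deliberately simplified one-stage sketch (a textbook-style toy, not the authors' multi-stage F1-sterility model; the logistic growth law and parameter values are assumptions):

        def simulate_sit(n0, s_release, r=1.8, k=1e6, steps=30):
            """Toy difference-equation model of the sterile insect technique.

            n0        : initial fertile moth population
            s_release : sterile insects released each generation
            r, k      : growth rate and carrying capacity of the logistic term
            """
            n, history = float(n0), []
            for _ in range(steps):
                fertile_matings = n / (n + s_release)    # chance a mate is fertile
                n = r * n * fertile_matings * max(0.0, 1.0 - n / k)
                history.append(n)
            return history

        # releases an order of magnitude above the wild population drive it down
        print(simulate_sit(n0=1e5, s_release=1e6)[-1])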

  14. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  15. Renormalization techniques applied to the study of density of states in disordered systems

    International Nuclear Information System (INIS)

    Ramirez Ibanez, J.

    1985-01-01

    A general scheme for real-space renormalization of formal scattering theory is presented and applied to the calculation of the density of states (DOS) in some finite-width systems. The technique is extended, in a self-consistent way, to the treatment of disordered and partially ordered chains. Numerical results for moments and DOS are presented in comparison with previous calculations. In addition, a self-consistent theory for the magnetic-order problem in a Hubbard chain is derived and a parametric transition is observed. Localization properties of the electronic states in disordered chains are studied through various decimation averaging techniques and numerical simulations. (author) [pt

  16. Applying homotopy analysis method for solving differential-difference equation

    International Nuclear Information System (INIS)

    Wang Zhen; Zou Li; Zhang Hongqing

    2007-01-01

    In this Letter, we apply the homotopy analysis method to solving differential-difference equations. A simple but typical example is used to illustrate the validity and the great potential of the generalized homotopy analysis method in solving differential-difference equations. Comparisons are made between the results of the proposed method and exact solutions. The results show that the homotopy analysis method is an attractive method for solving differential-difference equations.
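
    The construction common to homotopy-analysis treatments is the zeroth-order deformation equation. In the standard notation (q the embedding parameter, \hbar the convergence-control parameter, \mathcal{L} an auxiliary linear operator, \mathcal{N} the operator of the equation being solved), the general form, given here as background rather than as the Letter's specific derivation, is

        (1 - q)\,\mathcal{L}\bigl[\phi(t;q) - u_0(t)\bigr]
            = q\,\hbar\,\mathcal{N}\bigl[\phi(t;q)\bigr], \qquad q \in [0,1],

        u(t) = u_0(t) + \sum_{m=1}^{\infty} u_m(t), \qquad
        u_m(t) = \frac{1}{m!}\,\frac{\partial^m \phi(t;q)}{\partial q^m}\bigg|_{q=0}.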

  17. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    OpenAIRE

    Amany AlShawi

    2016-01-01

    The popularity of cloud computing is increasing steadily. The purpose of this research was to enhance the security of the cloud using techniques such as data mining, with specific reference to the single cache system. From the findings of the research, it was observed that security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers...
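
    A minimal sketch of the frequent-itemset pass at the core of Apriori (a generic implementation of the published algorithm; treating cached-object requests as "transactions" is an assumed framing, not a detail from the paper):

        from itertools import combinations

        def apriori(transactions, min_support):
            """Return frequent itemsets (frozensets) mapped to their support counts."""
            transactions = [set(t) for t in transactions]
            counts = {}
            for t in transactions:                      # count 1-itemsets
                for item in t:
                    key = frozenset([item])
                    counts[key] = counts.get(key, 0) + 1
            frequent = {s: c for s, c in counts.items() if c >= min_support}
            result, k = dict(frequent), 2
            while frequent:
                items = sorted(set().union(*frequent))  # items still in play
                # candidate k-itemsets whose (k-1)-subsets are all frequent
                candidates = [frozenset(c) for c in combinations(items, k)
                              if all(frozenset(s) in frequent
                                     for s in combinations(c, k - 1))]
                counts = {c: sum(1 for t in transactions if c <= t)
                          for c in candidates}
                frequent = {s: c for s, c in counts.items() if c >= min_support}
                result.update(frequent)
                k += 1
            return result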

  18. Using wavelet denoising and mathematical morphology in the segmentation technique applied to blood cells images

    OpenAIRE

    Boix García, Macarena; Cantó Colomina, Begoña

    2013-01-01

    Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with this method. To that end, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate the noise and prepare the image for suitable segmentation. In wavelet...
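
    A hedged sketch of that combination with PyWavelets and SciPy (the db4 wavelet, the universal threshold and the mean-based mask are assumptions, not the authors' settings):

        import numpy as np
        import pywt
        from scipy import ndimage as ndi

        def denoise_and_mask(image, wavelet="db4", level=2):
            """Wavelet soft-thresholding followed by a morphological clean-up."""
            coeffs = pywt.wavedec2(image, wavelet, level=level)
            # noise level from the finest diagonal detail (median absolute deviation)
            sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(image.size))     # universal threshold
            new_coeffs = [coeffs[0]] + [
                tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
                for detail in coeffs[1:]
            ]
            denoised = pywt.waverec2(new_coeffs, wavelet)
            mask = denoised > denoised.mean()                 # crude cell mask
            mask = ndi.binary_opening(mask, iterations=2)     # remove small specks
            return denoised, mask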

  19. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    Energy Technology Data Exchange (ETDEWEB)

    Credille, Jennifer [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States); Owens, Elizabeth [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States)

    2017-10-11

    This capstone introduces Lean concepts into an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements as applied to an industrial atmosphere. However, this paper demonstrates that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System, an innovative concept designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, ensuring the process is thoroughly dissected; it can be applied to any process in any work environment.

  20. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Owing to the reproducibility of data, precision, the relatively low cost of the appropriate analysis, the simplicity of the determination, and the possibility of directly combining these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and to emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Development of technique to apply induction heating stress improvement to recirculation inlet nozzle

    International Nuclear Information System (INIS)

    Chiba, Kunihiko; Nihei, Kenichi; Ootaka, Minoru

    2009-01-01

    Stress corrosion cracking (SCC) has been found in the primary loop recirculation (PLR) systems of boiling water reactors (BWR). Residual stress in the welding heat-affected zone is one of the factors in SCC, and residual stress improvement is one of the most effective methods of preventing it. Induction heating stress improvement (IHSI) is one technique for reducing residual stress. However, it is difficult to apply IHSI to places such as the recirculation inlet nozzle where the flow stagnates. In the present study, a technique to apply IHSI to the recirculation inlet nozzle was developed using a water jet blown into the crevice between the nozzle safe end and the thermal sleeve. (author)

  2. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface are of great importance for the convenience and safety of driving, so investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: accurate measurement of road pavement, and road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysis of road texture characteristics and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  3. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to the study of biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights into how robust the biological responses are with respect to changes in biological parameters, and into which model inputs are the key factors affecting the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations in the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models, and the caveats in the interpretation of sensitivity analysis results.
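
    As a concrete reminder of the local variant described above (a generic finite-difference sketch, not a method from the review; the relative step size is an assumption and parameters are assumed nonzero):

        import numpy as np

        def local_sensitivities(model, params, rel_step=1e-3):
            """Normalized local sensitivities S_i = (p_i / y) * dy/dp_i.

            `model` maps a parameter vector to a scalar output (e.g. a steady-state
            concentration); derivatives use central finite differences.
            """
            p = np.asarray(params, dtype=float)
            y0 = model(p)
            sens = np.empty_like(p)
            for i in range(p.size):
                h = rel_step * p[i]
                up, down = p.copy(), p.copy()
                up[i] += h
                down[i] -= h
                sens[i] = (model(up) - model(down)) / (2 * h) * p[i] / y0
            return sens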

  4. Evaluation of Economic Merger Control Techniques Applied to the European Electricity Sector

    International Nuclear Information System (INIS)

    Vandezande, Leen; Meeus, Leonardo; Delvaux, Bram; Van Calster, Geert; Belmans, Ronnie

    2006-01-01

    With European electricity markets not yet functioning on a competitive basis and consolidation increasing, the European Commission has said it intends to apply competition law more intensively in the electricity sector. Yet the economic techniques and theories used in EC merger control fail to take sufficient account of some specific features of electricity markets. The authors offer suggestions to enhance their reliability and applicability in the electricity sector. (author)

  5. Just-in-Time techniques as applied to hazardous materials management

    OpenAIRE

    Spicer, John S.

    1996-01-01

    Approved for public release; distribution is unlimited This study investigates the feasibility of integrating JIT techniques in the context of hazardous materials management. This study provides a description of JIT, a description of environmental compliance issues and the outgrowth of related HAZMAT policies, and a broad perspective on strategies for applying JIT to HAZMAT management. http://archive.org/details/justintimetechn00spic Lieutenant Commander, United States Navy

  6. Applying data-mining techniques in honeypot analysis

    CSIR Research Space (South Africa)

    Veerasamy, N

    2006-07-01

    Full Text Available A honeypot will be set up, which will be the medium of data collection. A honeypot setup requires substantial design and an understanding of network technologies, as well as the necessary software and different configuration options. The machine, IDS and the logging capability... of the IDS, and additional machines to form a network, will also have to be incorporated. A preliminary conceptual representation of the setup is shown in Figure 3 (Figure 3: Preliminary Conceptual Honeypot Setup). A honeypot machine, IDS, logging machine...

  7. The digital geometric phase technique applied to the deformation evaluation of MEMS devices

    International Nuclear Information System (INIS)

    Liu, Z W; Xie, H M; Gu, C Z; Meng, Y G

    2009-01-01

    Quantitative evaluation of the structural deformation of microfabricated electromechanical systems is important for the design and functional control of microsystems. In this investigation, a novel digital geometric phase technique was developed to meet the deformation evaluation requirements of microelectromechanical systems (MEMS). The technique is performed on the basis of regular artificial lattices instead of a natural atomic lattice. Regular artificial lattices with a pitch ranging from micrometres to nanometres are fabricated directly on the measured surface of MEMS devices using a focused ion beam (FIB). Phase information can be obtained from the Bragg-filtered images after fast Fourier transform (FFT) and inverse fast Fourier transform (IFFT) of the scanning electron microscope (SEM) images. The in-plane displacement field and the local strain field related to the phase information are then evaluated. The results show that the technique can be applied to deformation measurement with nanometre sensitivity and to stiction force estimation of MEMS devices.
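
    A compact numpy sketch of the FFT / Bragg-filter / IFFT phase-extraction chain (a generic geometric-phase-analysis recipe, not the authors' code; the Gaussian mask and the omission of phase unwrapping are simplifications):

        import numpy as np

        def gpa_phase(image, g, sigma):
            """Geometric-phase map from one Bragg spot of a periodic lattice image.

            g     : (gx, gy) reciprocal-lattice vector, in cycles/pixel
            sigma : Gaussian mask radius around the spot, in cycles/pixel
            """
            ny, nx = image.shape
            fy = np.fft.fftfreq(ny)[:, None]
            fx = np.fft.fftfreq(nx)[None, :]
            mask = np.exp(-((fx - g[0])**2 + (fy - g[1])**2) / (2 * sigma**2))
            filtered = np.fft.ifft2(np.fft.fft2(image) * mask)
            y, x = np.mgrid[0:ny, 0:nx]
            # remove the carrier 2*pi*(g.r), leaving the geometric phase (wrapped)
            phase = np.angle(filtered * np.exp(-2j * np.pi * (g[0] * x + g[1] * y)))
            return phase    # displacement along g: u_g = -phase / (2*pi*|g|)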

  8. Evaluation of irradiation damage effect by applying electric properties based techniques

    International Nuclear Information System (INIS)

    Acosta, B.; Sevini, F.

    2004-01-01

    The most important effect of degradation by radiation is the decrease in ductility of the reactor pressure vessel (RPV) ferritic steels. The main way to determine the mechanical behaviour of RPV steels is tensile and impact testing, from which the ductile-to-brittle transition temperature (DBTT) and its increase due to neutron irradiation can be calculated. These tests are destructive and are regularly applied to surveillance specimens to assess the integrity of the RPV. The possibility of applying validated non-destructive ageing-monitoring techniques would, however, facilitate the surveillance of the materials that form the reactor vessel. The JRC-IE has developed two devices, based on the measurement of electrical properties, to assess non-destructively the embrittlement state of materials. The first technique, called Seebeck and Thomson Effects on Aged Material (STEAM), is based on the measurement of the Seebeck coefficient, which is characteristic of the material and related to the microstructural changes induced by irradiation embrittlement. With the same aim, the second technique, named Resistivity Effects on Aged Material (REAM), instead measures the resistivity of the material. The purpose of this research is to correlate the results of the impact tests and the STEAM and REAM measurements with the change in mechanical properties due to neutron irradiation. These results will make it possible to improve such techniques, based on the measurement of material electrical properties, for application to irradiation embrittlement assessment.

  9. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectra. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  10. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become routine. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common-mode failures, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the failure modes of the system.
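
    The quantitative core of such an analysis is small enough to sketch; assuming independent basic events (a standard textbook evaluation, not tied to any particular code discussed here):

        def failure_probability(node, basic):
            """Top-event probability of a fault tree with independent basic events.

            node  : ("AND", [children]) | ("OR", [children]) | ("basic", name)
            basic : dict of basic-event failure probabilities
            """
            kind, arg = node
            if kind == "basic":
                return basic[arg]
            child_p = [failure_probability(c, basic) for c in arg]
            if kind == "AND":                 # all inputs must fail
                p = 1.0
                for q in child_p:
                    p *= q
                return p
            p = 1.0                           # OR: fails unless every input survives
            for q in child_p:
                p *= 1.0 - q
            return 1.0 - p

        # hypothetical example: pump failure AND (valve failure OR signal failure)
        tree = ("AND", [("basic", "pump"),
                        ("OR", [("basic", "valve"), ("basic", "signal")])])
        print(failure_probability(tree, {"pump": 1e-3, "valve": 2e-2, "signal": 5e-3}))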

  11. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  12. Dimensional Analysis with space discrimination applied to Fickian diffusion phenomena

    International Nuclear Information System (INIS)

    Diaz Sanchidrian, C.; Castans, M.

    1989-01-01

    Dimensional analysis with space discrimination is applied to Fickian diffusion phenomena in order to transform the governing partial differential equations into ordinary ones, and also to obtain Fick's second law in dimensionless form. (Author)
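
    For reference, the standard one-dimensional nondimensionalization of Fick's second law (an illustration of the kind of reduction meant; the single length scale L deliberately ignores the space discrimination that is the point of the paper) reads

        \frac{\partial c}{\partial t} = D\,\frac{\partial^2 c}{\partial x^2}
        \quad\longrightarrow\quad
        \frac{\partial \Theta}{\partial \tau} = \frac{\partial^2 \Theta}{\partial \xi^2},
        \qquad \xi = \frac{x}{L}, \quad \tau = \frac{D t}{L^2}, \quad \Theta = \frac{c}{c_0}.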

  13. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling them with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique: different combinations of stationary and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  14. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  15. Super Resolution and Interference Suppression Technique applied to SHARAD Radar Data

    Science.gov (United States)

    Raguso, M. C.; Mastrogiuseppe, M.; Seu, R.; Piazzo, L.

    2017-12-01

    We will present a super-resolution and interference-suppression technique applied to data acquired by the SHAllow RADar (SHARAD) on board NASA's 2005 Mars Reconnaissance Orbiter (MRO) mission, currently operating around Mars [1]. The algorithms improve the range resolution roughly by a factor of 3 and the signal-to-noise ratio (SNR) by several decibels. Range compression algorithms usually adopt conventional Fourier transform techniques, whose resolution is limited by the transmitted signal bandwidth, analogous to the Rayleigh criterion in optics. In this work, we investigate a super-resolution method based on autoregressive models and linear prediction techniques [2]. Starting from the estimation of the linear prediction coefficients from the spectral data, the algorithm performs radar bandwidth extrapolation (BWE), thereby improving the range resolution of the pulse-compressed coherent radar data. Moreover, EMIs (ElectroMagnetic Interferences) are detected and the spectrum is interpolated in order to reconstruct an interference-free spectrum, thereby improving the SNR. The algorithm can be applied to the single complex look image after synthetic aperture (SAR) processing. We apply the proposed algorithm to simulated as well as real radar data. We will demonstrate the effective enhancement of vertical resolution with respect to the classical spectral estimator, and show that the imaging of the subsurface layered structures observed in radargrams is improved, allowing additional insights for the scientific community in the interpretation of SHARAD radar data, which will help to further our understanding of the formation and evolution of known geological features on Mars. References: [1] Seu et al., 2007, Science, 317, 1715-1718. [2] K.M. Cuomo, "A Bandwidth Extrapolation Technique for Improved Range Resolution of Coherent Radar Data", Project Report CJP-60, Revision 1, MIT Lincoln Laboratory (4 Dec. 1992).
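
    A hedged sketch of the linear-prediction idea behind BWE (a generic autoregressive extrapolation, not the SHARAD processing chain; the least-squares coefficient fit and the one-sided extension are simplifications):

        import numpy as np

        def extrapolate_spectrum(spec, order, n_extra):
            """Extend a complex in-band spectrum past the band edge with an AR model.

            spec    : 1-D complex array of in-band spectral samples
            order   : AR (linear prediction) model order
            n_extra : number of samples to append at the upper edge
            """
            # least-squares fit of spec[n] ~ sum_k a[k] * spec[n-1-k]
            rows = np.array([spec[n - 1::-1][:order]
                             for n in range(order, len(spec))])
            a, *_ = np.linalg.lstsq(rows, spec[order:], rcond=None)
            out = list(spec)
            for _ in range(n_extra):
                out.append(np.dot(a, out[-1:-order - 1:-1]))
            return np.asarray(out)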

  16. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management as applicable to the IT universe, starting from the classical theory associated with project management techniques. It applies theoretical analysis to the context of information technology in enterprises, as well as the classic literature of traditional project management, focusing on its application to business information technology. From the literature review developed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop Information Technology projects. The methodology adopted was a multiple case study design. Empirical evidence suggests that the concept of success found in the classical project management literature fits the management environment of IT projects. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects, which depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, and ultimately results in its incorporation into the company culture.

  17. Applying importance-performance analysis to patient safety culture.

    Science.gov (United States)

    Lee, Yii-Ching; Wu, Hsin-Hung; Hsieh, Wan-Lin; Weng, Shao-Jen; Hsieh, Liang-Po; Huang, Chih-Hsuan

    2015-01-01

    Sexton et al.'s (2006) safety attitudes questionnaire (SAQ) has been widely used to assess staff attitudes towards patient safety in healthcare organizations. However, to date few studies have discussed perceptions of patient safety from both hospital staff and upper management. The purpose of this paper is to improve and develop better strategies regarding patient safety in healthcare organizations. The Chinese version of the SAQ, based on the Taiwan Joint Commission on Hospital Accreditation, is used to evaluate the perceptions of hospital staff. The current study then applies the importance-performance analysis technique to identify the major strengths and weaknesses of the safety culture. The results show that teamwork climate, safety climate, job satisfaction, stress recognition and working conditions are major strengths and should be maintained in order to provide a better patient safety culture. On the contrary, perceptions of management and hospital handoffs and transitions are important weaknesses and should be improved immediately. Research limitations/implications - The research is restricted in generalizability. The assessment of hospital staff in patient safety culture covered physicians and registered nurses; it would be interesting to further evaluate other staff's (e.g. technicians, pharmacists and others) opinions regarding patient safety culture in the hospital. Few studies have clearly evaluated the perceptions of healthcare organization management regarding patient safety culture. By investigating the key characteristics (either strengths or weaknesses) that healthcare organizations should focus on, healthcare managers can take more effective actions to improve the level of patient safety.
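
    The mechanics of importance-performance analysis are simple enough to sketch (a generic quadrant assignment with the usual labels; using the grand means as the crosshairs is the conventional, assumed choice):

        def ipa_quadrants(scores):
            """Assign each attribute to an importance-performance quadrant.

            scores : dict mapping attribute -> (importance, performance)
            """
            imp_mean = sum(i for i, _ in scores.values()) / len(scores)
            perf_mean = sum(p for _, p in scores.values()) / len(scores)
            labels = {}
            for attr, (imp, perf) in scores.items():
                if imp >= imp_mean:
                    labels[attr] = ("keep up the good work" if perf >= perf_mean
                                    else "concentrate here")   # major weakness
                else:
                    labels[attr] = ("possible overkill" if perf >= perf_mean
                                    else "low priority")
            return labels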

  18. Wire-mesh and ultrasound techniques applied for the characterization of gas-liquid slug flow

    Energy Technology Data Exchange (ETDEWEB)

    Ofuchi, Cesar Y.; Sieczkowski, Wytila Chagas; Neves Junior, Flavio; Arruda, Lucia V.R.; Morales, Rigoberto E.M.; Amaral, Carlos E.F.; Silva, Marco J. da [Federal University of Technology of Parana, Curitiba, PR (Brazil)], e-mails: ofuchi@utfpr.edu.br, wytila@utfpr.edu.br, neves@utfpr.edu.br, lvrarruda@utfpr.edu.br, rmorales@utfpr.edu.br, camaral@utfpr.edu.br, mdasilva@utfpr.edu.br

    2010-07-01

    Gas-liquid two-phase flows are found in a broad range of industrial applications, such as the chemical, petrochemical and nuclear industries, and quite often determine the efficiency and safety of processes and plants. Several experimental techniques have been proposed and applied to measure and quantify two-phase flows. In this experimental study, the wire-mesh sensor and an ultrasound technique are used and comparatively evaluated to study two-phase slug flows in horizontal pipes. The wire-mesh is an imaging technique and thus appropriate for scientific studies, while the ultrasound-based technique is robust and non-intrusive and hence well suited for industrial applications. From the measured raw data it is possible to extract specific slug-flow parameters of interest, such as mean void fraction and characteristic frequency. The experiments were performed in the Thermal Sciences Laboratory (LACIT) at UTFPR, Brazil, where an experimental two-phase flow loop is available. The flow loop comprises a horizontal acrylic pipe of 26 mm diameter and 9 m length. Water and air were used to produce the two-phase flow under controlled conditions. The results show good agreement between the techniques. (author)
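
    The two quoted parameters can be read off a cross-section-averaged void-fraction time series; a minimal sketch (a generic estimate via the spectral peak, not the authors' processing):

        import numpy as np

        def slug_parameters(void_series, fs):
            """Mean void fraction and dominant (slug) frequency from a time series.

            void_series : cross-section-averaged void fraction per frame
            fs          : sampling rate in Hz
            """
            alpha = np.asarray(void_series, dtype=float)
            mean_void = alpha.mean()
            spectrum = np.abs(np.fft.rfft(alpha - mean_void))   # drop the DC term
            freqs = np.fft.rfftfreq(alpha.size, d=1.0 / fs)
            return mean_void, freqs[spectrum.argmax()]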

  19. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) the human view direction is measured at TV frame rate while the subject's head is freely movable; (2) industrial parts hanging on a moving conveyor are classified prior to spray painting by robot; (3) in automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  20. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  1. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  2. The correlated k-distribution technique as applied to the AVHRR channels

    Science.gov (United States)

    Kratz, David P.

    1995-01-01

    Correlated k-distributions have been created to account for the molecular absorption found in the spectral ranges of the five Advanced Very High Resolution Radiometer (AVHRR) satellite channels. The production of the k-distributions was based upon an exponential-sum fitting of transmissions (ESFT) technique applied to reference line-by-line absorptance calculations. To account for the overlap of spectral features from different molecular species, the present routines make use of the multiplicative transmissivity property, which allows considerable flexibility, especially when altering the relative mixing ratios of the various molecular species. To determine the accuracy of the correlated k-distribution technique relative to the line-by-line procedure, atmospheric flux and heating rate calculations were run for a wide variety of atmospheric conditions. For the atmospheric conditions considered, the correlated k-distribution technique yielded results within about 0.5%, both for the cases where the satellite spectral response functions were applied and where they were not. The correlated k-distribution's principal advantage is that it can be incorporated directly into multiple-scattering routines that consider scattering as well as absorption by clouds and aerosol particles.
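
    The ESFT representation underlying the method, and the multiplication property used for overlap, take the standard forms (general definitions, not the paper's fitted coefficients):

        \bar{T}(u) \;\approx\; \sum_{i=1}^{N} a_i\, e^{-k_i u},
        \qquad a_i > 0, \quad \sum_{i=1}^{N} a_i = 1,

        \bar{T}_{AB}(u_A, u_B) \;=\; \bar{T}_A(u_A)\,\bar{T}_B(u_B),

    where u is the absorber amount along the path; the second relation lets a band with two overlapping species be treated as the product of the individual band transmissions.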

  3. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis, electron bombardment of atoms or molecules (gas-ion source) and thermal ionization (thermoionic source), are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas-source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is between 10^5 and 10^6 times greater. This proves that almost the entire sample is not needed for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique, referred to as ''microfluorination'', corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  4. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
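
    A minimal numpy sketch of the normalized-contrast extraction and the half-max width rule (generic definitions consistent with the description above, not the implementation described in the paper; the pixel choices are assumptions):

        import numpy as np

        def contrast_evolution(video, roi, ref):
            """Normalized contrast vs time from a flash-thermography sequence.

            video : (time, y, x) array of IR intensity or surface temperature
            roi   : (y, x) pixel over the suspected anomaly
            ref   : (y, x) pixel over nearby sound material
            """
            t_roi = video[:, roi[0], roi[1]].astype(float)
            t_ref = video[:, ref[0], ref[1]].astype(float)
            return (t_roi - t_ref) / t_ref              # one contrast value per frame

        def half_max_width(profile):
            """Anomaly width from a spatial contrast profile via the half-max rule."""
            profile = np.asarray(profile, dtype=float)
            above = np.where(profile >= profile.max() / 2.0)[0]
            return above[-1] - above[0] + 1             # width in pixels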

  5. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Since 1989, a technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for the analysis of signals. A review of the use of this technique in different fields of elemental analysis is presented.

  6. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described. The vision instruments for food analysis as well as datasets of the food items... used in this thesis are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm... (SSPCA) and DCT-based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods, together with some other state-of-the-art statistical and mathematical analysis techniques, are applied to datasets of different food items: meat, dairies, fruits...

  7. The Significance of Regional Analysis in Applied Geography.

    Science.gov (United States)

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  8. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening the universities and national research centres to their local, national and international environments. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand, and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of the issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  9. Mathematical Model and Artificial Intelligent Techniques Applied to a Milk Industry through DSM

    Science.gov (United States)

    Babu, P. Ravi; Divya, V. P. Sree

    2011-08-01

    The resources for electrical energy are depleting and hence the gap between supply and demand is continuously increasing. Under such circumstances, the option left is optimal utilization of the available energy resources. The main objective of this chapter is to discuss peak load management and how to overcome the problems associated with it in processing industries such as the milk industry with the help of DSM techniques. The chapter presents a generalized mathematical model for minimizing the total operating cost of the industry subject to the constraints. The work presented in this chapter also reports the results of applying Neural Network, Fuzzy Logic and Demand Side Management (DSM) techniques to a medium-scale milk-industry consumer in India, achieving an improvement in load factor, a reduction in Maximum Demand (MD) and savings in the consumer's energy bill.
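
    A hedged sketch of the kind of cost-minimisation model described above (not the authors' actual formulation): a linear program that schedules a shiftable load across tariff periods subject to a total-energy requirement and a maximum-demand cap. The tariffs, energy requirement and cap are invented.

    ```python
    # Load scheduling as a linear program -- illustrative numbers only.
    import numpy as np
    from scipy.optimize import linprog

    tariff = np.array([4.0, 6.5, 9.0, 5.0])   # hypothetical cost per kWh, per period
    total_energy = 1200.0                      # kWh the process must consume
    md_cap = 400.0                             # maximum demand cap per period, kWh

    res = linprog(
        c=tariff,                              # minimise total energy cost
        A_eq=np.ones((1, 4)), b_eq=[total_energy],
        bounds=[(0, md_cap)] * 4,
    )
    print(res.x, res.fun)                      # optimal allocation and its cost
    ```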

  10. Applied methods and techniques for mechatronic systems modelling, identification and control

    CERN Document Server

    Zhu, Quanmin; Cheng, Lei; Wang, Yongji; Zhao, Dongya

    2014-01-01

    Applied Methods and Techniques for Mechatronic Systems brings together the relevant studies in mechatronic systems with the latest research from interdisciplinary theoretical studies, computational algorithm development and exemplary applications. Readers can easily tailor the techniques in this book to accommodate their ad hoc applications. The clear structure of each paper, background - motivation - quantitative development (equations) - case studies/illustration/tutorial (curve, table, etc.) is also helpful. It is mainly aimed at graduate students, professors and academic researchers in related fields, but it will also be helpful to engineers and scientists from industry. Lei Liu is a lecturer at Huazhong University of Science and Technology (HUST), China; Quanmin Zhu is a professor at University of the West of England, UK; Lei Cheng is an associate professor at Wuhan University of Science and Technology, China; Yongji Wang is a professor at HUST; Dongya Zhao is an associate professor at China University o...

  11. Software engineering techniques applied to agricultural systems an object-oriented and UML approach

    CERN Document Server

    Papajorgji, Petraq J

    2014-01-01

    Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequence and collaboration diagrams are used to explain the dynamic and static aspects of the software system. This second edition includes: a new chapter on Object Constraint Language (OCL), a new section dedicated to the Model-View-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools – the Virtual Enterprise and Olivia Nova, and a new chapter with exercises on conceptual modeling. It may be highly useful to undergraduate and graduate students as t...

  12. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    Directory of Open Access Journals (Sweden)

    Amany AlShawi

    2016-01-01

    Full Text Available The popularity of cloud computing is steadily increasing. The purpose of this research was to enhance the security of the cloud using techniques such as data mining, with specific reference to the single cache system. From the findings of the research, it was observed that security in the cloud could be enhanced with the single cache system. For future work, an Apriori algorithm can be applied to the single cache system, as sketched below. This can be adopted by all cloud providers, vendors, data distributors, and others. Further, data objects entered into the single cache system can be extended into 12 components. Database and SPSS modelers can be used to implement the same.
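
    Since the abstract names the Apriori algorithm only as future work, the following is a generic, self-contained Apriori sketch for mining frequent access patterns from hypothetical cache transactions, not an implementation from the paper.

    ```python
    # Apriori frequent-itemset mining -- minimal sketch on invented data.
    transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
    min_support = 3
    items = sorted({i for t in transactions for i in t})

    frequent, k = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        k += 1
        # Join step: unions of frequent itemsets that form k-itemsets.
        candidates = list({a | b for a in level for b in level if len(a | b) == k})
    print(frequent)   # each frequent itemset with its support count
    ```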

  13. Markov chain Monte Carlo techniques applied to parton distribution functions determination: Proof of concept

    Science.gov (United States)

    Gbedo, Yémalin Gabin; Mangin-Brinet, Mariane

    2017-07-01

    We present a new procedure to determine parton distribution functions (PDFs), based on Markov chain Monte Carlo (MCMC) methods. The aim of this paper is to show that we can replace the standard χ2 minimization by procedures grounded on statistical methods, and on Bayesian inference in particular, thus offering additional insight into the rich field of PDF determination. After a basic introduction to these techniques, we introduce the algorithm we have chosen to implement, namely Hybrid (or Hamiltonian) Monte Carlo. This algorithm, initially developed for lattice QCD, turns out to be very interesting when applied to PDF determination by global analyses; we show that it allows us to circumvent the difficulties due to the high dimensionality of the problem, in particular concerning the acceptance rate. A first feasibility study is performed and presented, which indicates that Markov chain Monte Carlo can successfully be applied to the extraction of PDFs and of their uncertainties.
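
    For orientation, the sketch below implements the core of Hybrid (Hamiltonian) Monte Carlo, leapfrog integration followed by a Metropolis accept/reject, on a toy Gaussian target; in the paper's setting the target would be built from the global-analysis χ2. Step size, trajectory length and the target are all illustrative.

    ```python
    # Hamiltonian Monte Carlo on a toy target -- a minimal sketch.
    import numpy as np

    def hmc(logp, grad, x0, n_samples=1000, eps=0.1, n_leap=20, seed=1):
        rng = np.random.default_rng(seed)
        x, samples = np.asarray(x0, float).copy(), []
        for _ in range(n_samples):
            p = rng.normal(size=x.shape)            # fresh momenta
            x_new, p_new = x.copy(), p.copy()
            for _ in range(n_leap):                 # leapfrog integration
                p_new += 0.5 * eps * grad(x_new)
                x_new += eps * p_new
                p_new += 0.5 * eps * grad(x_new)
            # Metropolis test on the Hamiltonian H = -logp(x) + |p|^2 / 2.
            dH = (-logp(x_new) + 0.5 * p_new @ p_new) - (-logp(x) + 0.5 * p @ p)
            if rng.random() < np.exp(-dH):
                x = x_new
            samples.append(x.copy())
        return np.array(samples)

    # Toy target: standard 2-D Gaussian, logp(x) = -|x|^2 / 2 up to a constant.
    chain = hmc(lambda x: -0.5 * x @ x, lambda x: -x, x0=np.zeros(2))
    print(chain.mean(axis=0), chain.std(axis=0))    # roughly 0 and 1
    ```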

  14. Removal of benzaldehyde from a water/ethanol mixture by applying scavenging techniques

    DEFF Research Database (Denmark)

    Mitic, Aleksandar; Skov, Thomas; Gernaey, Krist V.

    2017-01-01

    The presence of carbonyl compounds is very common in the food industry. Such compounds are by nature reactive, and thus many products involve aldehydes/ketones in their synthetic routes. On the other hand, the high reactivity of carbonyl compounds can also lead to the formation of undesired compounds......, such as genotoxic impurities. It can therefore be important to remove carbonyl compounds by implementing suitable removal techniques, with the aim of protecting final product quality. This work focuses on benzaldehyde as a model component, studying its removal from a water/ethanol mixture by applying different...

  15. Signed directed social network analysis applied to group conflict

    DEFF Research Database (Denmark)

    Zheng, Quan; Skillicorn, David; Walther, Olivier

    2015-01-01

    Real-world social networks contain relationships of multiple different types, but this richness is often ignored in graph-theoretic modelling. We show how two recently developed spectral embedding techniques, for directed graphs (relationships are asymmetric) and for signed graphs (relationships...... are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions can...
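
    As a simplified stand-in for the authors' combined directed-plus-signed construction, the sketch below embeds a small signed (undirected) network using eigenvectors of the signed Laplacian L = D_abs - A; the tiny adjacency matrix is invented.

    ```python
    # Spectral embedding of a signed graph -- illustrative only.
    import numpy as np

    A = np.array([[ 0,  1, -1],
                  [ 1,  0, -1],
                  [-1, -1,  0]], dtype=float)   # +1 ally, -1 adversary ties
    D = np.diag(np.abs(A).sum(axis=1))          # degrees from |A|
    L = D - A                                   # signed Laplacian

    eigvals, eigvecs = np.linalg.eigh(L)
    embedding = eigvecs[:, :2]                  # two smallest-eigenvalue coordinates
    print(embedding)                            # allies land close, adversaries far apart
    ```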

  16. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam impinging on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm, as sketched below. Its efficacy has been tested on both simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
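
    The core of the transfer-function approach can be sketched as frequency-domain deconvolution: divide the FFT of the measured rear-side signal by the transfer function and invert. The toy low-pass transfer function and the regularisation constant below are assumptions, not the paper's calibrated model.

    ```python
    # Flux reconstruction by FFT deconvolution -- a schematic sketch.
    import numpy as np

    t = np.linspace(0, 1, 1024, endpoint=False)
    freq = np.fft.fftfreq(t.size, d=t[1])
    H = 1.0 / (1.0 + 2j * np.pi * freq * 0.05)            # toy low-pass transfer function

    flux_true = ((t > 0.2) & (t < 0.4)).astype(float)     # square energy pulse
    temperature = np.fft.ifft(np.fft.fft(flux_true) * H).real

    eps = 1e-3                                            # regularisation against noise blow-up
    flux_rec = np.fft.ifft(
        np.fft.fft(temperature) * np.conj(H) / (np.abs(H) ** 2 + eps)
    ).real
    ```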

  17. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. An easier approach was taken by using the MathCAD program as a programming tool, which is nevertheless powerful enough to perform the calculations, plotting and file transfer. (Author)
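
    A basic numerical inverse Abel transform in the spirit of the description above (uniform radial grid, trapezoidal integration with the singular point skipped); the disc test profile is synthetic, and the discretisation is illustrative rather than the authors' MathCAD routine.

    ```python
    # Inverse Abel transform on a uniform grid -- minimal sketch.
    import numpy as np

    def abel_invert(F, dr):
        # f(r) = -(1/pi) * integral_r^R F'(y) / sqrt(y^2 - r^2) dy
        dF = np.gradient(F, dr)
        n = len(F)
        f = np.zeros(n)
        for i in range(n):
            y = np.arange(i + 1, n) * dr
            if y.size < 2:
                continue
            g = dF[i + 1:] / np.sqrt(y ** 2 - (i * dr) ** 2)
            f[i] = -np.sum(0.5 * (g[1:] + g[:-1])) * dr / np.pi
        return f

    # Sanity check: a uniform disc of radius 1 has F(y) = 2*sqrt(1 - y^2)
    # and should invert to f(r) = 1 inside the disc.
    r = np.linspace(0, 1, 200)
    F = 2 * np.sqrt(np.clip(1 - r ** 2, 0.0, None))
    print(abel_invert(F, r[1] - r[0])[:5])   # values close to 1 near the axis
    ```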

  18. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
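
    Schematically, the parameter-variation loop looks like the following, with a trivial stand-in for the real unfold algorithm and invented channel signals and errors:

    ```python
    # Monte Carlo error propagation through an unfold -- schematic only.
    import numpy as np

    rng = np.random.default_rng(7)
    voltages = np.full(18, 1.0)          # nominal signals from the 18 channels
    sigma = 0.05 * voltages              # combined one-sigma error per channel

    def unfold(v):                       # placeholder for the real unfold algorithm
        return v.sum()                   # "flux" = simple channel sum here

    trials = np.array([unfold(rng.normal(voltages, sigma)) for _ in range(1000)])
    flux, flux_err = trials.mean(), trials.std()   # flux and its error bar
    ```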

  19. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  20. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  1. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  2. Heisenberg principle applied to the analysis of speckle interferometry fringes

    Science.gov (United States)

    Sciammarella, C. A.; Sciammarella, F. M.

    2003-11-01

    Optical techniques that are used to measure displacements utilize a carrier. When a load is applied the displacement field modulates the carrier. The accuracy of the information that can be recovered from the modulated carrier is limited by a number of factors. In this paper, these factors are analyzed and conclusions concerning the limitations in information recovery are illustrated with examples taken from experimental data.

  3. Metal oxide collectors for storing matter technique applied in secondary ion mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Miśnik, Maciej [Institute of Tele and Radio Technology, ul. Ratuszowa 11, 03-450 Warszawa (Poland); Gdańsk University of Technology (Poland); Konarski, Piotr [Institute of Tele and Radio Technology, ul. Ratuszowa 11, 03-450 Warszawa (Poland); Zawada, Aleksander [Institute of Tele and Radio Technology, ul. Ratuszowa 11, 03-450 Warszawa (Poland); Military University of Technology, Warszawa (Poland)

    2016-03-15

    We present results on the use of metal and metal oxide substrates that serve as collectors in 'storing matter', a quantitative technique of secondary ion mass spectrometry (SIMS). This technique allows the two basic processes of secondary ion formation in SIMS to be separated: the process of ion sputtering is separated from the process of ionisation. The technique involves sputtering the analysed sample and storing the sputtered material, with sub-monolayer coverage, onto a collector surface. Such deposits can then be analysed by SIMS, and as a result the so-called 'matrix effects' are significantly reduced. We deposit the sputtered material onto Ti and Cu substrates and also onto metal oxide substrates such as molybdenum, titanium, tin and indium oxides. The sputtering is carried out within the same vacuum chamber where the SIMS analysis of the collected material is performed. For sputtering and SIMS analysis of the deposited material we use a 5 keV Ar{sup +} beam of 500 nA. The presented results are obtained with the use of stationary collectors. Here we present a case study of chromium. The results show that the molybdenum and titanium oxide substrates used as collectors increase the useful yield by two orders of magnitude with respect to pure elemental collectors such as Cu and Ti. Here we define useful yield as the ratio of the number of secondary ions detected during SIMS analysis to the number of atoms sputtered during the deposition process.

  4. Zero order and signal processing spectrophotometric techniques applied for resolving interference of metronidazole with ciprofloxacin in their pharmaceutical dosage form.

    Science.gov (United States)

    Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed

    2016-02-05

    Four rapid, simple, accurate and precise spectrophotometric methods were used for the determination of ciprofloxacin in the presence of metronidazole as an interferent. The methods under study are area under the curve and simultaneous equations, in addition to signal processing techniques for manipulating ratio spectra, namely Savitzky-Golay filters and the continuous wavelet transform. All the methods were validated according to the ICH guidelines, where accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of ciprofloxacin in quality-control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
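
    As a hedged illustration of one of the ratio-spectra manipulations named above, the sketch below smooths and differentiates a synthetic ratio spectrum with a Savitzky-Golay filter via SciPy; the spectra, window length and polynomial order are invented.

    ```python
    # Savitzky-Golay treatment of a ratio spectrum -- toy data throughout.
    import numpy as np
    from scipy.signal import savgol_filter

    wl = np.linspace(240, 320, 400)                        # wavelength grid, nm
    cipro = np.exp(-((wl - 275) ** 2) / 60.0)              # ciprofloxacin-like band
    metro = 0.6 * np.exp(-((wl - 290) ** 2) / 90.0)        # metronidazole-like band
    mixture = cipro + metro

    ratio = mixture / (metro + 1e-9)                       # ratio spectrum
    deriv = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)
    ```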

  5. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable for searches for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires lowering the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  6. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is improved plant equipment monitoring capability. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude and angle-demodulation circuitry has permitted remote status monitoring of several types of medium and high-power gas compressors in US DOE facilities, driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.
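
    A minimal sketch of the demodulation idea: extract the amplitude envelope of a synthetic motor current with the analytic signal (Hilbert transform) and read off the load-modulation frequency. The 60 Hz carrier and 7 Hz modulation are fabricated, and this is not the patented ORNL detector.

    ```python
    # Amplitude demodulation of a motor current signature -- toy signal.
    import numpy as np
    from scipy.signal import hilbert

    fs = 5000.0
    t = np.arange(0, 2, 1 / fs)
    current = (1 + 0.05 * np.sin(2 * np.pi * 7 * t)) * np.sin(2 * np.pi * 60 * t)

    envelope = np.abs(hilbert(current))          # recovers the load modulation
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    f = np.fft.rfftfreq(envelope.size, 1 / fs)
    print(f[spectrum.argmax()])                  # ~7 Hz modulation frequency
    ```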

  7. Recent developments and evaluation of selected geochemical techniques applied to uranium exploration

    International Nuclear Information System (INIS)

    Wenrich-Verbeek, K.J.; Cadigan, R.A.; Felmlee, J.K.; Reimer, G.M.; Spirakis, C.S.

    1976-01-01

    Various geochemical techniques for uranium exploration are currently under study by the geochemical techniques team of the Branch of Uranium and Thorium Resources, US Geological Survey. Radium-226 and its parent uranium-238 occur in mineral spring water largely independently of the geochemistry of the solutions and thus are potential indicators of uranium in source rocks. Many radioactive springs, hot or cold, are believed to be related to hydrothermal systems which contain uranium at depth. Radium, when present in the water, is co-precipitated in iron and/or manganese oxides and hydroxides or in barium sulphate associated with calcium carbonate spring deposits. Studies of surface water samples have resulted in improved standardized sample treatment and collection procedures. Stream discharge has been shown to have a significant effect on uranium concentration, while conductivity shows promise as a "pathfinder" for uranium. Turbid samples behave differently and consequently must be treated with more caution than samples from clear streams. Both water and stream sediments should be sampled concurrently, as anomalous uranium concentrations may occur in only one of these media and would be overlooked if only one, the wrong one, were analysed. The fission-track technique has been applied to uranium determinations in the above water studies. The advantages of the designed sample collecting system are that only a small quantity, typically one drop, of water is required and sample manipulation is minimized, thereby reducing contamination risks. The fission-track analytical technique is effective at the uranium concentration levels commonly found in natural waters (0.01-5.0 μg/litre). Landsat data were used to detect alteration associated with uranium deposits. Altered areas were detected but were not uniquely defined. Nevertheless, computer processing of Landsat data did suggest a smaller target size for further evaluation and thus is useful as an exploration tool.

  8. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  9. Animal research in the Journal of Applied Behavior Analysis.

    Science.gov (United States)

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.

  10. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  11. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  12. B. F. Skinner's Contributions to Applied Behavior Analysis

    Science.gov (United States)

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  13. Applied Behavior Analysis: Current Myths in Public Education

    Science.gov (United States)

    Fielding, Cheryl; Lowdermilk, John; Lanier, Lauren L.; Fannin, Abigail G.; Schkade, Jennifer L.; Rose, Chad A.; Simpson, Cynthia G.

    2013-01-01

    The effective use of behavior management strategies and related policies continues to be a debated issue in public education. Despite overwhelming evidence espousing the benefits of the implementation of procedures derived from principles based on the science of applied behavior analysis (ABA), educators often indicate many common misconceptions…

  14. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    Science.gov (United States)

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice have incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  15. Neutron activation analysis applied to energy and environment

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1975-01-01

    Neutron activation analysis was applied to a number of problems concerned with energy production and the environment. Burning of fossil fuel, the search for new sources of uranium, possible presence of toxic elements in food and water, and the relationship of trace elements to cardiovascular disease are some of the problems in which neutron activation was used. (auth)

  16. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis.
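
    In outline, corridor population estimation reduces to buffering a route geometry and aggregating the population of blocks that fall inside. The sketch below assumes Shapely, with invented coordinates and populations; real work would draw both from TIGER coverages.

    ```python
    # Corridor population estimation -- a toy sketch with Shapely.
    from shapely.geometry import LineString, Point

    route = LineString([(0, 0), (10, 0), (20, 5)])            # transport corridor
    blocks = [(Point(1, 0.3), 120), (Point(9, 2.0), 450),     # (centroid, population)
              (Point(15, 2.5), 300), (Point(18, 9.0), 80)]

    for half_width in (0.5, 2.0, 5.0):                        # corridor half-widths
        zone = route.buffer(half_width)
        population = sum(p for pt, p in blocks if zone.contains(pt))
        print(f"half-width {half_width}: population {population}")
    ```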

  17. Determination of hydrogen diffusivity and permeability in W near room temperature applying a tritium tracer technique

    International Nuclear Information System (INIS)

    Ikeda, T.; Otsuka, T.; Tanabe, T.

    2011-01-01

    Tungsten is a primary candidate for plasma-facing material in ITER and beyond, owing to its good thermal properties and low erosion. However, hydrogen solubility and diffusivity near ITER operation temperatures (below 500 K) have scarcely been studied, mainly because tungsten's low hydrogen solubility and diffusivity at lower temperatures make the detection of hydrogen quite difficult. We have observed hydrogen plasma-driven permeation (PDP) through nickel and tungsten near room temperature by applying a tritium tracer technique, which is extremely sensitive for detecting tritium diluted in hydrogen. The apparent diffusion coefficients for PDP were determined from permeation lag times for the first time; those for nickel and tungsten were similar to, or a few times larger than, those for gas-driven permeation (GDP). The permeation rates for PDP in nickel and tungsten were about 20 and 5 times larger, respectively, than those for GDP normalized to the same gas pressure.
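
    The lag-time determination rests on the classic time-lag relation D = L^2 / (6 * t_lag) for a membrane of thickness L; the numbers below are purely illustrative, not values from the paper.

    ```python
    # Diffusion coefficient from a permeation lag time -- illustrative numbers.
    L = 0.5e-3       # membrane thickness, m
    t_lag = 3.6e4    # permeation lag time, s
    D = L ** 2 / (6 * t_lag)
    print(f"apparent diffusion coefficient: {D:.2e} m^2/s")
    ```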

  18. Laser-Doppler anemometry technique applied to two-phase dispersed flows in a rectangular channel

    International Nuclear Information System (INIS)

    Lee, S.L.; Srinivasan, J.

    1979-01-01

    A new optical technique using laser-Doppler anemometry has been applied to the local measurement of turbulent upward flow of a dilute water-droplet/air two-phase dispersion in a vertical rectangular channel. Over 20,000 droplet signals were individually examined from each of ten transversely placed measuring points, the closest of which was 250 μm from the channel wall. Two flows of different patterns under different imposed flow conditions were investigated, one with and one without a liquid film formed on the channel wall. Reported are the size and number density distributions and the axial and lateral velocity distributions for the droplets, as well as the axial and lateral velocity distributions for the air.

  19. Technique of uranium exploration in tropical rain forests as applied in Sumatra and other tropical areas

    International Nuclear Information System (INIS)

    Hahn, L.

    1983-01-01

    The technique of uranium prospecting in areas covered by tropical rain forest is discussed, using a uranium exploration campaign conducted from 1976 to 1978 in western Sumatra as an example. A regional reconnaissance survey using stream sediment samples combined with radiometric field measurements proved ideal for covering very large areas. A mobile field laboratory was used for the geochemical survey. Helicopter support in difficult terrain was found to be very efficient and economical. A field procedure for detecting low uranium concentrations in stream water samples is described. This method has been successfully applied in Sarawak. To distinguish meaningful uranium anomalies in water from those with no significance for prospecting, the correlations between U content and conductivity of the water and between U content and Ca and HCO3 content must be considered. This method has been used successfully in a geochemical survey in Thailand. (author)

  20. A systematic review of applying modern software engineering techniques to developing robotic systems

    Directory of Open Access Journals (Sweden)

    Claudia Pons

    2012-01-01

    Full Text Available Robots have become collaborators in our daily life. While robotic systems grow more and more complex, the need to engineer their software development grows as well. The traditional approaches used in developing these software systems are reaching their limits; currently used methodologies and tools fall short of addressing the needs of such complex software development. Separating robotics knowledge from short-cycled implementation technologies is essential to foster reuse and maintenance. This paper presents a systematic literature review (SLR) of the current use of modern software engineering techniques for developing robotic software systems and their actual automation level. The survey was aimed at summarizing existing evidence concerning the application of such technologies to the field of robotic systems, identifying gaps in current research, suggesting areas for further investigation, and providing a background for positioning new research activities.

  1. Vibration monitoring/diagnostic techniques, as applied to reactor coolant pumps

    International Nuclear Information System (INIS)

    Sculthorpe, B.R.; Johnson, K.M.

    1986-01-01

    With the increased awareness of reactor coolant pump (RCP) cracked shafts brought about by the catastrophic shaft failure at Crystal River Unit 3, Florida Power and Light Company, in conjunction with Bently Nevada Corporation, undertook a test program at St. Lucie Nuclear Unit 2 to confirm the integrity of all four RCP pump shafts. Reactor coolant pumps play a major role in the operation of nuclear-powered generation facilities. The time required to disassemble and physically inspect a single RCP shaft would be lengthy and costly to the utility and its customers, and could cause unnecessary man-rem exposure to plant personnel. When properly applied, vibration instrumentation can increase unit availability and reliability, as well as provide enhanced diagnostic capability. This paper reviews monitoring benefits and diagnostic techniques applicable to RCPs and their motor drives.

  2. Estimates of error introduced when one-dimensional inverse heat transfer techniques are applied to multi-dimensional problems

    International Nuclear Information System (INIS)

    Lopez, C.; Koski, J.A.; Razani, A.

    2000-01-01

    A study of the errors introduced when one-dimensional inverse heat conduction techniques are applied to problems involving two-dimensional heat transfer effects was performed. The geometry used for the study was a cylinder with dimensions similar to a typical container used for the transportation of radioactive materials. The finite element analysis code MSC P/Thermal was used to generate synthetic test data that was then used as input for an inverse heat conduction code. Four different problems were considered, including one with uniform flux around the outer surface of the cylinder and three with non-uniform flux applied over 360°, 180°, and 90° sections of the outer surface of the cylinder. The Sandia One-Dimensional Direct and Inverse Thermal (SODDIT) code was used to estimate the surface heat flux in all four cases. The error analysis was performed by comparing the results from SODDIT with the heat flux calculated based on the temperature results obtained from P/Thermal. Results showed an increase in the error of the surface heat flux estimates as the applied heat became more localized. For the uniform case, SODDIT provided heat flux estimates with a maximum error of 0.5%, whereas for the non-uniform cases the maximum errors were found to be about 3%, 7%, and 18% for the 360°, 180°, and 90° cases, respectively.

  3. Structural reliability analysis applied to pipeline risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gardiner, M. [GL Industrial Services, Loughborough (United Kingdom); Mendes, Renato F.; Donato, Guilherme V.P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Quantitative Risk Assessment (QRA) of pipelines requires two main components to be provided. These are models of the consequences that follow from some loss of containment incident, and models for the likelihood of such incidents occurring. This paper describes how PETROBRAS have used Structural Reliability Analysis for the second of these, to provide pipeline- and location-specific predictions of failure frequency for a number of pipeline assets. This paper presents an approach to estimating failure rates for liquid and gas pipelines, using Structural Reliability Analysis (SRA) to analyze the credible basic mechanisms of failure such as corrosion and mechanical damage. SRA is a probabilistic limit state method: for a given failure mechanism it quantifies the uncertainty in parameters to mathematical models of the load-resistance state of a structure and then evaluates the probability of load exceeding resistance. SRA can be used to benefit the pipeline risk management process by optimizing in-line inspection schedules, and as part of the design process for new construction in pipeline rights of way that already contain multiple lines. A case study is presented to show how the SRA approach has recently been used on PETROBRAS pipelines and the benefits obtained from it. (author)
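
    In its simplest Monte Carlo form, SRA draws load and resistance from their uncertainty distributions and counts limit-state exceedances. The distributions and parameters below are invented for illustration, not PETROBRAS's calibrated models.

    ```python
    # Monte Carlo structural reliability -- P(load > resistance) on toy inputs.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000
    resistance = rng.lognormal(mean=np.log(50.0), sigma=0.10, size=n)   # e.g. MPa
    load = rng.lognormal(mean=np.log(30.0), sigma=0.25, size=n)

    pf = np.mean(load > resistance)        # probability of limit-state exceedance
    print(f"estimated failure probability: {pf:.2e}")
    ```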

  4. GORE PRECLUDE MVP dura substitute applied as a nonwatertight "underlay" graft for craniotomies: product and technique evaluation.

    Science.gov (United States)

    Chappell, E Thomas; Pare, Laura; Salehpour, Mohammed; Mathews, Marlon; Middlehof, Charles

    2009-01-01

    While watertight closure of the dura is a long-standing tenet of cranial surgery, it is often not possible and sometimes unnecessary. Many graft materials with various attributes and drawbacks have been in use for many years. A novel synthetic dural graft material called GORE PRECLUDE MVP dura substitute (WL Gore & Associates, Inc, Flagstaff, Ariz) (henceforth called "MVP") is designed for use both in traditional watertight dural closure and as a dural "underlay" graft in a nonwatertight fashion. One surface of MVP is engineered to facilitate fibroblast in-growth so that its proximity to the underside of the dura will lead to rapid incorporation, whereas the other surface acts as a barrier to reduce tissue adhesion to the device. A series of 59 human subjects undergoing craniotomy and available for clinical and radiographic follow-up underwent nonwatertight underlay grafting of their durotomy with MVP. This is an assessment of the specific product and technique. No attempt is made to compare this to other products or techniques. The mean follow-up in this group was more than 4 months. All subjects have ultimately experienced excellent outcomes related to use of the graft implanted with the underlay technique. No complications occurred related directly to MVP, but the wound-related complication rate attributed to the underlay technique was higher than expected (17%). However, careful analysis found a high rate of risk factors for wound complications and determined that complications with the underlay technique could be avoided by assuring close approximation of the graft material to the underside of the dura. MVP can be used as an underlay graft in a nonwatertight fashion. However, if used over large voids (relaxed brain or large tumor bed), "tacking" or traditional watertight closure techniques should be used. The underlay application of MVP is best applied over the convexities and is particularly well-suited to duraplasty after hemicraniectomy.

  5. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  6. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes, radioimmunoassay and autoradiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha, beta and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and the production of labelled molecules: gamma emitters (¹²⁵I, ⁵⁷Co); beta emitters; production of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel filtration or molecular exclusion chromatography, high performance liquid chromatography); quality assessment of the labelled compound (labelling efficiency calculation, conservation of immunoreactivity, stability and preservation). (J.S.)

  7. Nuclear analytical techniques applied to the large scale measurements of atmospheric aerosols in the amazon region

    International Nuclear Information System (INIS)

    Gerab, Fabio

    1996-03-01

    This work presents the characterization of atmospheric aerosol collected at different places in the Amazon Basin. We studied both the biogenic emission from the forest and the particulate material emitted to the atmosphere by large-scale man-made burning during the dry season. The samples were collected over a three-year period at two different locations in the Amazon, namely the Alta Floresta (MT) and Serra do Navio (AP) regions, using stacked filter units. These regions represent two different atmospheric compositions: at Serra do Navio the aerosol is dominated by the forest's natural biogenic emission, while at Alta Floresta it has an important contribution from man-made burning during the dry season. At Alta Floresta we also took samples in gold shops to characterize mercury emission to the atmosphere related to gold prospecting activity in Amazonia. Airplanes were used for aerosol sampling during the 1992 and 1993 dry seasons to characterize the atmospheric aerosol produced by man-made burning over large Amazonian areas. The samples were analyzed using several nuclear analytical techniques: Particle Induced X-ray Emission for the quantitative analysis of trace elements with atomic number above 11; Particle Induced Gamma-ray Emission for the quantitative analysis of Na; and a Proton Microprobe for the characterization of individual aerosol particles. A reflectance technique was used for black carbon quantification, gravimetric analysis to determine the total atmospheric aerosol concentration, and Cold Vapor Atomic Absorption Spectroscopy for the quantitative analysis of mercury in the particulate matter from the Alta Floresta gold shops. Ion chromatography was used to quantify the ionic content of aerosols from the fine-mode particulate samples from Serra do Navio. Multivariate statistical analysis was used to identify and characterize the sources of the atmospheric aerosol present in the sampled regions. (author)

  8. Organic acid derivatization techniques applied to petroleum hydrocarbon transformations in subsurface environments

    International Nuclear Information System (INIS)

    Barcelona, M.J.; Lu, J.; Tomczak, D.M.

    1995-01-01

    Evidence for the natural microbial remediation of subsurface fuel contamination should include identification and analysis of transformation or degradation products. In this way, a mass balance between fuel constituents and end products may be approached to monitor cleanup progress. Application of advanced organic acid metabolite derivatization techniques to several known sites of organic compound and fuel mixture contamination provides valuable information on the pathways and progress of microbial transformation. Good correlation between observed metabolites and transformation pathways of aromatic fuel constituents was observed at the sites.

  9. High speed resonant frequency determination applied to field mapping using perturbation techniques

    International Nuclear Information System (INIS)

    Smith, B.H.; Burton, R.J.; Hutcheon, R.M.

    1992-01-01

    Perturbation techniques are commonly used for measuring electric and magnetic field distributions in resonant structures. A field measurement system has been assembled using a Hewlett Packard model 8753C network analyzer interfaced via an HPIB bus to a personal computer to form an accurate, rapid and flexible system for data acquisition, control, and analysis of such measurements. Characterization of long linac structures (up to 3 m) is accomplished in about three minutes, minimizing thermal drift effects. This paper describes the system, its application and its extension to applications such as confirming the presence of weak, off-axis quadrupole fields in an on-axis coupled linac. (Author) 5 figs., 10 refs
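
    The underlying bead-pull relation is Slater's perturbation theorem: for a small dielectric bead at a point where the magnetic field is negligible, the fractional frequency shift is proportional to -E^2, so |E(z)| is proportional to sqrt(-delta_f(z)/f0). The sketch below applies this with fabricated frequency-shift data.

    ```python
    # Relative axial field profile from bead-pull frequency shifts -- toy data.
    import numpy as np

    f0 = 2.998e9                                      # unperturbed frequency, Hz
    z = np.linspace(0, 0.3, 31)                       # bead position along axis, m
    delta_f = -4e3 * np.sin(np.pi * z / 0.3) ** 2     # measured shift vs position, Hz

    E_rel = np.sqrt(np.maximum(-delta_f / f0, 0.0))   # |E(z)| up to a constant
    E_rel /= E_rel.max()                              # normalised field profile
    ```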

  10. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jeffs, S.P., E-mail: s.p.jeffs@swansea.ac.uk [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Lancaster, R.J. [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Garcia, T.E. [IUTA (University Institute of Industrial Technology of Asturias), University of Oviedo, Edificio Departamental Oeste 7.1.17, Campus Universitario, 33203 Gijón (Spain)

    2015-06-11

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep-dominated applications, such as in the power generation industry and in particular nuclear, where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k{sub SP} method, which has been proven to be an effective tool across several material systems. The current work explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.
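
    As a small worked illustration of the Monkman-Grant idea invoked above (rupture life times minimum creep rate is roughly constant), the sketch below fits the constant on invented short-test data and extrapolates a longer life; it is not the paper's k_SP correlation.

    ```python
    # Monkman-Grant extrapolation -- invented creep data.
    import numpy as np

    min_rate = np.array([1e-4, 3e-5, 8e-6])       # minimum creep rates, 1/h
    t_rupture = np.array([95.0, 310.0, 1150.0])   # rupture lives, h

    C_MG = np.mean(min_rate * t_rupture)          # Monkman-Grant constant
    predicted_life = C_MG / 2e-6                  # life at a slower, untested rate
    print(C_MG, predicted_life)
    ```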

  11. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    International Nuclear Information System (INIS)

    Jeffs, S.P.; Lancaster, R.J.; Garcia, T.E.

    2015-01-01

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep-dominated applications, such as in the power generation industry and in particular nuclear, where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.

  12. VIDEOGRAMMETRIC RECONSTRUCTION APPLIED TO VOLCANOLOGY: PERSPECTIVES FOR A NEW MEASUREMENT TECHNIQUE IN VOLCANO MONITORING

    Directory of Open Access Journals (Sweden)

    Emmanuelle Cecchi

    2011-05-01

    This article deals with videogrammetric reconstruction of volcanic structures. As a first step, the method is tested in the laboratory. The objective is to reconstruct small sand and plaster cones, analogous to volcanoes, that deform with time. The initial stage consists of modelling the sensor (internal parameters) and calculating its orientation and position in space, using a multi-view calibration method. In practice two sets of views are taken: a first one around a calibration target and a second one around the studied object. Both sets are combined in the calibration software to simultaneously compute the internal parameters modelling the sensor and the external parameters giving the spatial location of each view around the cone. Following this first stage, an N-view reconstruction process is carried out. The principle is as follows: an initial 3D model of the cone is created and then iteratively deformed to fit the real object. The deformation of the meshed model is based on a texture coherence criterion. At present, this reconstruction method and its precision are being validated at laboratory scale. The objective will then be to follow analogue model deformation over time using successive reconstructions. In the future, the method will be applied to real volcanic structures. Modifications of the initial code will certainly be required; however, excellent reconstruction accuracy and valuable simplicity and flexibility are expected compared to the classic stereophotogrammetric techniques used in volcanology.
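
    The multi-view calibration stage described above can be illustrated with OpenCV. The following is a minimal sketch, not the authors' pipeline: it assumes a checkerboard calibration target and a hypothetical directory of calibration images, and recovers the internal parameters together with one rotation and translation (the external parameters) per view.

    ```python
    # Minimal multi-view calibration sketch using OpenCV; illustrative only, the
    # authors' actual target, software and cone imagery are not reproduced here.
    import glob
    import cv2
    import numpy as np

    pattern = (9, 6)  # inner-corner grid of an assumed checkerboard target
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for fname in glob.glob("calibration/*.png"):   # hypothetical image set
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Internal parameters (camera matrix K, distortion) plus one rotation and
    # translation per view -- the external parameters of each viewpoint.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("RMS reprojection error:", rms)
    print("camera matrix:\n", K)
    ```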

  13. How Can Synchrotron Radiation Techniques Be Applied for Detecting Microstructures in Amorphous Alloys?

    Directory of Open Access Journals (Sweden)

    Gu-Qing Guo

    2015-11-01

    In this work, how synchrotron radiation techniques can be applied to detect the microstructure of metallic glass (MG) is studied. Unit cells are the basic structural units in crystals, whereas it has been suggested that the co-existence of various clusters may be the universal structural feature of MG. It is therefore a challenge to detect the microstructure of MG, even at the short-range scale, by directly using synchrotron radiation techniques such as X-ray diffraction and X-ray absorption methods. Here, a feasible scheme is developed in which state-of-the-art synchrotron radiation-based experiments are combined with simulations to investigate the microstructure of MG. By studying a typical MG composition (Zr70Pd30), it is found that various clusters do co-exist in its microstructure, and icosahedral-like clusters are the prevalent structural units. This is the structural origin of the precipitation of an icosahedral quasicrystalline phase prior to the glass-to-crystal transformation when heating Zr70Pd30 MG.

  14. An acceleration technique for the Gauss-Seidel method applied to symmetric linear systems

    Directory of Open Access Journals (Sweden)

    Jesús Cajigas

    2014-06-01

    A preconditioning technique to improve the convergence of the Gauss-Seidel method applied to symmetric linear systems while preserving symmetry is proposed. The preconditioner is of the form I + K and can be applied an arbitrary number of times. It is shown that under certain conditions, applying the preconditioner a finite number of times reduces the system matrix to a diagonal. A series of numerical experiments using matrices from spatial discretizations of partial differential equations demonstrates that both versions of the preconditioner, the point and the block version, exhibit lower iteration counts than the non-symmetric version.
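
    For context on the method being accelerated, the sketch below implements the plain Gauss-Seidel iteration for a symmetric system in Python. The I + K preconditioner itself is not reproduced, since the abstract does not give the construction of K; the example shows only the baseline iteration whose convergence the preconditioner improves.

    ```python
    import numpy as np

    def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=10000):
        """Solve A x = b by Gauss-Seidel sweeps (A assumed symmetric positive
        definite or diagonally dominant, so the iteration converges)."""
        n = len(b)
        x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
        for k in range(max_iter):
            for i in range(n):
                # Use the already-updated entries x[:i] and the old x[i+1:].
                s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
                x[i] = (b[i] - s) / A[i, i]
            if np.linalg.norm(A @ x - b) < tol:
                return x, k + 1
        return x, max_iter

    # Small symmetric test system (illustrative, not from the paper).
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    b = np.array([1.0, 2.0, 3.0])
    x, iterations = gauss_seidel(A, b)
    print(iterations, "iterations ->", x)
    ```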

  15. Personnel contamination protection techniques applied during the TMI-2 [Three Mile Island Unit 2] cleanup

    International Nuclear Information System (INIS)

    Hildebrand, J.E.

    1988-01-01

    The severe damage to the Three Mile Island Unit 2 (TMI-2) core and the subsequent discharge of reactor coolant to the reactor and auxiliary buildings resulted in extremely hostile radiological environments in the TMI-2 plant. High fission product surface contamination and radiation levels necessitated the implementation of innovative techniques and methods in performing cleanup operations while assuring effective as-low-as-reasonably-achievable (ALARA) practices. The approach utilized by GPU Nuclear throughout the cleanup in applying protective clothing requirements was to consider the overall health risk to the worker, including factors such as cardiopulmonary stress, visual and hearing acuity, and heat stress. In applying protective clothing requirements, trade-off considerations had to be made between preventing skin contamination and possibly overprotecting the worker, thus impacting the worker's ability to perform the intended task at maximum efficiency and in accordance with ALARA principles. The paper discusses the following topics: protective clothing (general use), beta protection, skin contamination, training, the personnel access facility, and heat stress.

  16. The application of radiotracer technique for preconcentration neutron activation analysis

    International Nuclear Information System (INIS)

    Wang Xiaolin; Chen Yinliang; Sun Ying; Fu Yibei

    1995-01-01

    The application of the radiotracer technique to preconcentration neutron activation analysis (Pre-NAA) is studied, and a method for determining the chemical yield of Pre-NAA is developed. This method has been applied to the determination of gold, iridium and rhenium in steel and rock samples, with noble metal contents in the range of 1-20 ng·g⁻¹ of sample. In addition, the difference in accuracy between RNAA and Pre-NAA caused by the determination of chemical yield is also discussed.

  17. Applying the GNSS Volcanic Ash Plume Detection Technique to Consumer Navigation Receivers

    Science.gov (United States)

    Rainville, N.; Palo, S.; Larson, K. M.

    2017-12-01

    Global Navigation Satellite Systems (GNSS) such as the Global Positioning System (GPS) rely on predictably structured, constant-power RF signals to fulfill their primary use for navigation and timing. When the received strength of GNSS signals deviates from the expected baseline, it is typically due to a change in the local environment. This can occur when signal reflections from the ground are modified by changes in snow or soil moisture content, as well as by attenuation of the signal by volcanic ash. This effect allows GNSS signals to be used as a source for passive remote sensing. Larson et al. (2017) have developed a detection technique for volcanic ash plumes based on the attenuation seen at existing geodetic GNSS sites. Since these existing networks are relatively sparse, the technique has been extended to use lower-cost consumer GNSS receiver chips to enable higher-density measurements of volcanic ash. These low-cost receiver chips have been integrated into a fully stand-alone sensor, with independent power, communications, and logging capabilities, as part of a Volcanic Ash Plume Receiver (VAPR) network. A mesh network of these sensors transmits data to a local base station, which then streams the data in real time to a web-accessible server. Initial testing of this sensor network has revealed that a different detection approach is necessary when using consumer GNSS receivers and antennas. The techniques to filter and process the lower-quality data from consumer receivers will be discussed and applied to initial results from a functioning VAPR network installation.
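
    The detection principle, flagging departures of received signal strength from an expected baseline, can be sketched as follows. This is a simplified illustration rather than the Larson et al. algorithm: it assumes per-epoch signal-to-noise ratio (SNR) values binned by satellite elevation angle, builds a baseline from quiet-day data, and flags sustained attenuation.

    ```python
    import numpy as np

    def build_baseline(elev_deg, snr_db, bins=np.arange(5, 90, 5)):
        """Median SNR per elevation bin from ash-free reference data."""
        idx = np.digitize(elev_deg, bins)
        return {i: np.median(snr_db[idx == i]) for i in np.unique(idx)}

    def attenuation_flags(elev_deg, snr_db, baseline, threshold_db=3.0,
                          bins=np.arange(5, 90, 5)):
        """Flag epochs whose SNR falls more than threshold_db below baseline."""
        idx = np.digitize(elev_deg, bins)
        expected = np.array([baseline.get(i, np.nan) for i in idx])
        return (expected - snr_db) > threshold_db

    # Synthetic demo data (hypothetical values, for illustration only).
    rng = np.random.default_rng(0)
    elev = rng.uniform(10, 80, 500)
    quiet = 40 + 0.1 * elev + rng.normal(0, 0.5, 500)   # quiet-day SNR, dB
    event = quiet - 5 * (elev < 30)                     # attenuated at low elevation
    base = build_baseline(elev, quiet)
    print("flagged epochs:", attenuation_flags(elev, event, base).sum())
    ```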

  18. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  19. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    Science.gov (United States)

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis. PMID:22477993

  20. Computational modeling applied to stress gradient analysis for metallic alloys

    International Nuclear Information System (INIS)

    Iglesias, Susana M.; Assis, Joaquim T. de; Monine, Vladimir I.

    2009-01-01

    Nowadays, composite materials, including particle-reinforced materials, are at the center of researchers' attention. Stress measurements in these materials pose problems connected with the near-surface stress gradient caused by the difference between the stress state of the particles at the surface and in the matrix of the composite material. Computer simulation of the diffraction profile formed by the superficial layers of the material allows the diffraction experiment to be simulated and makes it possible to resolve the problem of stress measurement when the stress state is characterized by a strong gradient. The aim of this paper is the application of a computer simulation technique, initially developed for homogeneous materials, to diffraction line simulation for composite materials and alloys. Specifically, we applied this technique to silumin fabricated by powder metallurgy. (author)

  1. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques combined with fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect flame stability. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
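
    To illustrate the PSD step mentioned above, the following sketch estimates the power spectral density of a flame-luminosity time series with Welch's method. The signal is synthetic (a hypothetical 200 Hz line plus noise standing in for thermoacoustic content); the paper's own signals are derived from infrared imagery.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 2000.0                                    # assumed sampling rate, Hz
    t = np.arange(0, 5, 1 / fs)
    # Synthetic luminosity trace: a thermoacoustic-like 200 Hz line plus noise.
    signal = (np.sin(2 * np.pi * 200 * t)
              + 0.5 * np.random.default_rng(1).normal(size=t.size))

    freqs, psd = welch(signal, fs=fs, nperseg=1024)
    peak = freqs[np.argmax(psd)]
    print(f"dominant oscillation near {peak:.0f} Hz")   # expect ~200 Hz
    ```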

  2. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis, and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out at an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for detecting most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were determined for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  3. 2D and 3D optical diagnostic techniques applied to Madonna dei Fusi by Leonardo da Vinci

    Science.gov (United States)

    Fontana, R.; Gambino, M. C.; Greco, M.; Marras, L.; Materazzi, M.; Pampaloni, E.; Pelagotti, A.; Pezzati, L.; Poggi, P.; Sanapo, C.

    2005-06-01

    3D measurement and modelling have traditionally been applied to statues, buildings, archaeological sites or similar large structures, but rarely to paintings. Recently, however, 3D measurements have also been performed successfully on easel paintings, allowing the painting's surface to be detected and documented. We used 3D models to integrate the results of various 2D imaging techniques in a common reference frame. These applications show how the 3D shape information, complemented with 2D colour maps as well as with other types of sensory data, provides the most interesting information. The 3D data acquisition was carried out by means of two devices: a high-resolution laser micro-profilometer, composed of a commercial distance meter mounted on a scanning device, and a laser-line scanner. The 2D data acquisitions were carried out using a scanning device for simultaneous RGB colour imaging and IR reflectography, and a UV fluorescence multispectral image acquisition system. We present here the results of the techniques described, applied to the analysis of an important painting of the Italian Renaissance: 'Madonna dei Fusi', attributed to Leonardo da Vinci.

  4. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one- or two-semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical imaging, and heat and wave equations. Fo...

  5. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  6. A practical guide to propensity score analysis for applied clinical research.

    Science.gov (United States)

    Lee, Jaehoon; Little, Todd D

    2017-11-01

    Observational studies are often the only viable options in many clinical settings, especially when it is unethical or infeasible to randomly assign participants to different treatment regimes. In such cases, propensity score (PS) analysis can be applied to account for possible selection bias and thereby address questions of causal inference. Many PS methods exist, yet few guidelines are available to aid applied researchers in conducting and evaluating a PS analysis. In this article we give an overview of available techniques for PS estimation and application, balance diagnostics, treatment effect estimation, and sensitivity assessment, as well as recent advances. We also offer a tutorial that can be used to emulate the steps of a PS analysis. Our goal is to provide information that will bring PS analysis within the reach of applied clinical researchers and practitioners.
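
    A minimal sketch of two of the steps the authors survey, PS estimation and inverse-probability-of-treatment weighting (one of several ways to apply the score), might look as follows. The data are simulated and all names are illustrative assumptions; this is not the article's tutorial code.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 2000
    x = rng.normal(size=(n, 3))                        # simulated covariates
    p_treat = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))
    treated = rng.binomial(1, p_treat)                 # confounded assignment
    outcome = 2.0 * treated + x[:, 0] + rng.normal(size=n)  # true effect = 2

    # Step 1: estimate the propensity score with logistic regression.
    ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

    # Step 2: inverse-probability weighting to estimate the treatment effect.
    w = treated / ps + (1 - treated) / (1 - ps)
    ate = (np.sum(w * treated * outcome) / np.sum(w * treated)
           - np.sum(w * (1 - treated) * outcome) / np.sum(w * (1 - treated)))
    print(f"IPW estimate of the average treatment effect: {ate:.2f}")  # ~2.0
    ```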

  7. The x-rays fluorescence applied to the analysis of alloys

    International Nuclear Information System (INIS)

    Gutierrez, D.A.

    1997-01-01

    This work is based on the use of X-ray fluorescence. The purpose of this non-destructive testing technique here is to establish a routine method for controlling the composition of the industrial samples used. The analysis combines the Rasberry-Heinrich and Claisse-Thinh algorithms with the numerical implementation of techniques unusual in this type of analysis, such as linear programming applied to the solution of overdetermined systems of equations, and the use of relaxation methods to facilitate convergence to the solutions. (author)
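
    To illustrate the overdetermined-system step mentioned above (more measured fluorescence intensities than unknown concentrations), the sketch below solves such a system by least squares. The paper's actual linear-programming formulation is not detailed in the abstract, so the calibration matrix here is hypothetical.

    ```python
    import numpy as np

    # Hypothetical calibration matrix: 6 measured intensities constraining
    # 3 unknown concentrations -- an overdetermined system A c = i.
    A = np.array([[1.0, 0.2, 0.1],
                  [0.1, 1.0, 0.3],
                  [0.2, 0.1, 1.0],
                  [0.9, 0.3, 0.2],
                  [0.2, 0.8, 0.1],
                  [0.1, 0.2, 0.9]])
    true_c = np.array([0.70, 0.20, 0.10])
    # Noisy intensity measurements (synthetic, for illustration only).
    i_meas = A @ true_c + np.random.default_rng(3).normal(0, 0.01, 6)

    c, residuals, rank, _ = np.linalg.lstsq(A, i_meas, rcond=None)
    print("recovered concentrations:", np.round(c, 3))
    ```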

  8. A Methods and procedures to apply probabilistic safety Assessment (PSA) techniques to the cobalt-therapy process. Cuban experience

    International Nuclear Information System (INIS)

    Vilaragut Llanes, J.J.; Ferro Fernandez, R.; Lozano Lima, B; De la Fuente Puch, A.; Dumenigo Gonzalez, C.; Troncoso Fleitas, M.; Perez Reyes, Y.

    2003-01-01

    This paper presents the results of the Probabilistic Safety Analysis (PSA) of the cobalt therapy process, which was performed as part of the International Atomic Energy Agency's Coordinated Research Project (CRP) to Investigate Appropriate Methods and Procedures to Apply Probabilistic Safety Assessment (PSA) Techniques to Large Radiation Sources. The primary methodological tools used in the analysis were Failure Modes and Effects Analysis (FMEA), event trees and fault trees. These tools were used to evaluate occupational, public and medical exposures during cobalt therapy treatment. The emphasis of the study was on the radiological protection of patients. During the course of the PSA, several findings were analysed concerning the cobalt treatment process. Regarding the undesired event probabilities, the lowest exposure probabilities correspond to public exposures during the treatment process (Z21), around 10⁻¹⁰ per year, while worker exposures (Z11) are around 10⁻⁴ per year. Regarding the patient, the Z33 (undesired dose to normal tissue) and Z34 (unirradiated portion of the target volume) probabilities prevail. Patient accidental exposures are also classified in terms of the extent to which the error is likely to affect individual treatments, individual patients, or all the patients treated on a specific unit. Sensitivity analyses were performed to determine the influence of certain tasks or critical stages on the results. As a conclusion, the study establishes that PSA techniques may effectively and reasonably determine the risk associated with the cobalt therapy treatment process, though there are some weaknesses in their methodological application for this kind of study, requiring further research. These weaknesses are due to the fact that traditional PSA has mainly been applied to complex hardware systems designed to operate with a high automation level, whilst cobalt therapy treatment is a relatively simple hardware system with a

  9. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization by high electric fields (field ionization, FI) is briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analysis of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations of maturation, senescence, humus genesis, and environmental damage in spruce ecosystems. 3) The focal point is the author's integrated investigations of emission-induced changes in selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as of ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides, are given. (orig.)

  10. Applied research on air pollution using nuclear-related analytical techniques. Report on the second research co-ordination meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which started in 1992, and is scheduled to run until early 1997. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in air particulate matter. The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural area). This document reports the discussions held during the second Research Co-ordination Meeting (RCM) for the CRP which took place at ANSTO in Menai, Australia. (author)

  11. Applied research on air pollution using nuclear-related analytical techniques. Report on the second research co-ordination meeting

    International Nuclear Information System (INIS)

    1995-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which started in 1992, and is scheduled to run until early 1997. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in air particulate matter. The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural area). This document reports the discussions held during the second Research Co-ordination Meeting (RCM) for the CRP which took place at ANSTO in Menai, Australia. (author)

  12. A simple pulse shape discrimination technique applied to a silicon strip detector

    International Nuclear Information System (INIS)

    Figuera, P.; Lu, J.; Amorini, F.; Cardella, G.; DiPietro, A.; Papa, M.; Musumarra, A.; Pappalardo, G.; Rizzo, F.; Tudisco, S.

    2001-01-01

    Full text: Since the early sixties, it has been known that the shape of signals from solid state detectors can be used for particle identification. Recently, this idea has been revisited in a group of papers showing that the shape of current signals from solid state detectors is mainly governed by the combination of plasma erosion time and charge carrier collection time effects. We will present the results of a systematic study of a pulse shape identification method which, contrary to previously proposed techniques, is based on the same electronic chain normally used in the conventional time-of-flight technique. The method is based on the use of charge preamplifiers, low polarization voltages (i.e. just above the full depletion voltage), rear-side injection of the incident particles, and a proper setting of the constant fraction discriminators which enhances the dependence of the timing output on the rise time of the input signals (which depends on the charge and energy of the incident ions). The method has been applied to an annular Si strip detector with an inner radius of about 16 mm and an outer radius of about 88 mm. The detector, manufactured by Eurisys Mesures (Type Ips.73.74.300.N9), is 300 μm thick and consists of 8 independent sectors, each divided into 9 circular strips. In-beam tests have been performed at the cyclotron of the Laboratori Nazionali del Sud in Catania using a 25.7 MeV/nucleon ⁵⁸Ni beam impinging on a composite ⁵¹V and ⁴⁵Sc target. Excellent charge identification from H up to the Ni projectile has been observed, and typical charge identification thresholds are: ∼1.7 MeV/nucleon for Z ≅ 6, ∼3.0 MeV/nucleon for Z ≅ 11, and ∼5.5 MeV/nucleon for Z ≅ 20. Isotope identification up to A ≅ 13 has been observed with an energy threshold of about 6 MeV/nucleon. The identification quality has been studied as a function of the constant fraction settings. The method has been applied to all 72 independent strips.

  13. Applying machine-learning techniques to Twitter data for automatic hazard-event classification.

    Science.gov (United States)

    Filgueira, R.; Bee, E. J.; Diaz-Doce, D.; Poole, J., Sr.; Singh, A.

    2017-12-01

    The constant flow of tweets provides valuable information about all sorts of events at high temporal and spatial resolution. Over the past year we have been analyzing geological hazards/phenomena in real time, such as earthquakes, volcanic eruptions, landslides, floods and the aurora, as part of the GeoSocial project, by geo-locating tweets filtered by keywords on a web map. However, not all the filtered tweets are related to hazard/phenomenon events. This work explores two classification techniques for automatic hazard-event categorization based on tweets about the "Aurora". First, tweets were filtered using aurora-related keywords, removing stop words and selecting the ones written in English. To classify the remaining tweets into "aurora-event" and "no-aurora-event" categories, we compared two state-of-the-art techniques: Support Vector Machine (SVM) and deep Convolutional Neural Network (CNN) algorithms. Both approaches belong to the family of supervised learning algorithms, which make predictions based on a labelled training dataset. Therefore, we created a training dataset by tagging 1200 tweets into the two categories. We compared the performance of four different classifiers (linear regression, logistic regression, multinomial naïve Bayes and stochastic gradient descent) provided by the Scikit-Learn library, using our training dataset to build the classifier. The results showed that logistic regression (LR) achieved the best accuracy (87%), so we selected the LR classifier to categorise a large collection of tweets using the "dispel4py" framework. Later, we developed a CNN classifier, in which the first layer embeds words into low-dimensional vectors. The next layer performs convolutions over the embedded word vectors. Results from the convolutional layer are max-pooled into a long feature vector, which is classified using a softmax layer. The CNN's accuracy
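
    A minimal scikit-learn pipeline for the first classification stage might look like the sketch below. The tweets and labels are placeholders; the study's 1200 hand-tagged tweets and the dispel4py deployment are not reproduced here.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Placeholder training data; the study used 1200 hand-labelled tweets.
    tweets = ["amazing aurora over Tromso tonight",
              "aurora borealis visible from the cabin",
              "new Aurora phone case just arrived",
              "listening to the band Aurora on repeat"]
    labels = ["aurora-event", "aurora-event",
              "no-aurora-event", "no-aurora-event"]

    # TF-IDF features feeding a logistic regression classifier, mirroring the
    # best-performing choice reported above.
    clf = make_pipeline(TfidfVectorizer(stop_words="english"),
                        LogisticRegression())
    clf.fit(tweets, labels)
    print(clf.predict(["green aurora dancing in the sky right now"]))
    ```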

  14. Advanced nondestructive techniques applied for the detection of discontinuities in aluminum foams

    Science.gov (United States)

    Katchadjian, Pablo; García, Alejandro; Brizuela, Jose; Camacho, Jorge; Chiné, Bruno; Mussi, Valerio; Britto, Ivan

    2018-04-01

    Metal foams are finding an increasing range of applications due to their lightweight structure and their physical, chemical and mechanical properties. Foams can be used to fill closed moulds for manufacturing structural foam parts of complex shape [1]; foam-filled structures are expected to provide good mechanical properties and energy absorption capabilities. The complexity of the foaming process and the number of parameters that must be controlled simultaneously demand an extensive preliminary experimental campaign to manufacture foamed components of good quality. This is why there are many efforts to improve the structure of foams in order to obtain a product with good properties. The problem is that, even for seemingly identical foaming conditions, the effective foaming can vary significantly from one foaming trial to another. The variation of the foams is often related to structural imperfections, to the joining regions (foam-foam or foam-mould wall) or to difficulties in achieving a complete filling of the mould. That is, in a closed mould, the result of the mould filling and its structure or defects are not known a priori and can vary significantly. These defects can cause a drastic deterioration of the mechanical properties [2] and lead to low performance in application. This work proposes the use of advanced nondestructive techniques for evaluating the foam distribution after filling the mould, in order to improve the manufacturing process. To achieve this purpose, the ultrasonic technique (UT) and cone beam computed tomography (CT) were applied to plates and structures of different thicknesses filled with foam of different porosities. UT was carried out in transmission mode with low-frequency air-coupled transducers [3], in focused and unfocused configurations.

  15. Situational Awareness Applied to Geology Field Mapping using Integration of Semantic Data and Visualization Techniques

    Science.gov (United States)

    Houser, P. I. Q.

    2017-12-01

    21st century earth science is data-intensive, characterized by heterogeneous, sometimes voluminous collections representing phenomena at different scales collected for different purposes and managed in disparate ways. However, much of the earth's surface still requires boots-on-the-ground, in-person fieldwork in order to detect the subtle variations from which humans can infer complex structures and patterns. Nevertheless, field experiences can and should be enabled and enhanced by a variety of emerging technologies. The goal of the proposed research project is to pilot test emerging data integration, semantic and visualization technologies for evaluation of their potential usefulness in the field sciences, particularly in the context of field geology. The proposed project will investigate new techniques for data management and integration enabled by semantic web technologies, along with new techniques for augmented reality that can operate on such integrated data to enable in situ visualization in the field. The research objectives include: Develop new technical infrastructure that applies target technologies to field geology; Test, evaluate, and assess the technical infrastructure in a pilot field site; Evaluate the capabilities of the systems for supporting and augmenting field science; and Assess the generality of the system for implementation in new and different types of field sites. Our hypothesis is that these technologies will enable what we call "field science situational awareness" - a cognitive state formerly attained only through long experience in the field - that is highly desirable but difficult to achieve in time- and resource-limited settings. Expected outcomes include elucidation of how, and in what ways, these technologies are beneficial in the field; enumeration of the steps and requirements to implement these systems; and cost/benefit analyses that evaluate under what conditions the investments of time and resources are advisable to construct

  16. Applying DEA sensitivity analysis to efficiency measurement of Vietnamese universities

    Directory of Open Access Journals (Sweden)

    Thi Thanh Huyen Nguyen

    2015-11-01

    The primary purpose of this study is to measure the technical efficiency of 30 doctorate-granting universities (universities or higher education institutes with PhD training programs) in Vietnam, applying the sensitivity analysis of data envelopment analysis (DEA). The study uses eight sets of input-output specifications, obtained by the replacement as well as the aggregation/disaggregation of variables. The measurement results allow us to examine the sensitivity of the efficiency of these universities to the sets of variables. The findings also show the impact of the variables on their efficiency and its "sustainability".
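
    For readers unfamiliar with DEA, the input-oriented efficiency score of one decision-making unit (here, a university) can be computed with a small linear program. The sketch below uses SciPy with made-up input/output data; it illustrates the baseline CCR model, not the paper's eight specifications.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical data: 5 universities, 2 inputs (staff, budget), 1 output (PhDs).
    X = np.array([[20, 5.0], [30, 8.0], [25, 6.0], [40, 9.0], [35, 7.0]]).T
    Y = np.array([[50], [70], [80], [60], [90]]).T

    def ccr_efficiency(o, X, Y):
        """Input-oriented CCR score of unit o: minimise theta subject to
        X @ lam <= theta * X[:, o], Y @ lam >= Y[:, o], lam >= 0."""
        m, n = X.shape          # m inputs, n units
        s = Y.shape[0]          # s outputs
        c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam]
        A_ub = np.r_[np.c_[-X[:, o], X],            # X lam - theta x_o <= 0
                     np.c_[np.zeros(s), -Y]]        # -Y lam <= -y_o
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    for o in range(X.shape[1]):
        print(f"university {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
    ```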

  17. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing 'always connected' world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations, such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the impact of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order-picking processes. The third part of the dissertation proposes a method to classify cities

  18. Applying Squeezing Technique to Clayrocks: Lessons Learned from Experiments at Mont Terri Rock Laboratory

    International Nuclear Information System (INIS)

    Fernandez, A. M.; Sanchez-Ledesma, D. M.; Tournassat, C.; Melon, A.; Gaucher, E.; Astudillo, E.; Vinsot, A.

    2013-01-01

    Knowledge of the pore water chemistry in clay rock formations plays an important role in determining radionuclide migration in the context of nuclear waste disposal. Among the different in situ and ex situ techniques for pore water sampling in clay sediments and soils, the squeezing technique dates back 115 years. Although different studies have been performed on the reliability and representativeness of squeezed pore waters, most of them were carried out on high-porosity, high-water-content and unconsolidated clay sediments. Very few tackled the analysis of squeezed pore water from low-porosity, low-water-content and highly consolidated clay rocks. In this work, a specially designed and fabricated one-dimensional compression cell with two-directional fluid flow was used to extract and analyse the pore water composition of Opalinus Clay core samples from Mont Terri (Switzerland). The reproducibility of the technique is good, and no ionic ultrafiltration, chemical fractionation or anion exclusion was found in the range of pressures analysed: 70-200 MPa. Pore waters extracted in this range of pressures do not decrease in concentration, a decrease which would indicate dilution of the water by mixing of the free pore water with the outer layers of double-layer water (Donnan water). A threshold (safety) squeezing pressure of 175 MPa was established to avoid membrane effects (ion filtering, anion exclusion, etc.) from clay particles induced by increasing pressures. Besides, the pore waters extracted at these pressures are representative of the Opalinus Clay formation, as shown by a direct comparison against in situ collected borehole waters. (Author)

  19. Applying Squeezing Technique to Clayrocks: Lessons Learned from Experiments at Mont Terri Rock Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, A. M.; Sanchez-Ledesma, D. M.; Tournassat, C.; Melon, A.; Gaucher, E.; Astudillo, E.; Vinsot, A.

    2013-07-01

    Knowledge of the pore water chemistry in clay rock formations plays an important role in determining radionuclide migration in the context of nuclear waste disposal. Among the different in situ and ex situ techniques for pore water sampling in clay sediments and soils, the squeezing technique dates back 115 years. Although different studies have been performed on the reliability and representativeness of squeezed pore waters, most of them were carried out on high-porosity, high-water-content and unconsolidated clay sediments. Very few tackled the analysis of squeezed pore water from low-porosity, low-water-content and highly consolidated clay rocks. In this work, a specially designed and fabricated one-dimensional compression cell with two-directional fluid flow was used to extract and analyse the pore water composition of Opalinus Clay core samples from Mont Terri (Switzerland). The reproducibility of the technique is good, and no ionic ultrafiltration, chemical fractionation or anion exclusion was found in the range of pressures analysed: 70-200 MPa. Pore waters extracted in this range of pressures do not decrease in concentration, a decrease which would indicate dilution of the water by mixing of the free pore water with the outer layers of double-layer water (Donnan water). A threshold (safety) squeezing pressure of 175 MPa was established to avoid membrane effects (ion filtering, anion exclusion, etc.) from clay particles induced by increasing pressures. Besides, the pore waters extracted at these pressures are representative of the Opalinus Clay formation, as shown by a direct comparison against in situ collected borehole waters. (Author)

  20. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find out the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies due to their use of hydrological parameters, which are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found; these can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have already been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed by considering all the hydrological and high-resolution topographic parameters of the models discussed in this review, to enhance the findings of the causes and effects of flooding.

  1. The simulation of Typhoon-induced coastal inundation in Busan, South Korea applying the downscaling technique

    Science.gov (United States)

    Jang, Dongmin; Park, Junghyun; Yuk, Jin-Hee; Joh, MinSu

    2017-04-01

    Due to typhoons, the south coastal cities of South Korea, including Busan, are very vulnerable to surge, waves and the corresponding coastal inundation, and are affected every year. In 2016, South Korea suffered tremendous damage from typhoon 'Chaba', which developed to the north-east of Guam on Sep. 28 and had a maximum 10-minute sustained wind speed of about 50 m/s, a maximum 1-minute sustained wind speed of 75 m/s and a minimum central pressure of 905 hPa. As 'Chaba', the strongest typhoon since 'Maemi' in 2003, hit South Korea on Oct. 5, it caused massive economic and casualty damage to Ulsan, Gyeongju and Busan. In particular, the damage from typhoon-induced coastal inundation in Busan, where many high-rise buildings and residential areas are concentrated near the coast, was serious. The coastal inundation could be affected more by strong wind-induced waves than by the surge. In fact, it was observed that the surge height was about 1 m on average and the significant wave height was about 8 m in the coastal sea near Busan on Oct. 5 due to 'Chaba'. Even though the typhoon-induced surge elevated the sea level, the typhoon-induced long-period waves, with periods of more than 15 s, could play a more important role in the inundation. The present work simulated the coastal inundation induced by 'Chaba' in Busan, South Korea, considering the effects of typhoon-induced surge and waves. For the 'Chaba' hindcast, the high-resolution Weather Research and Forecasting model (WRF) was applied using reanalysis data produced by NCEP (FNL, 0.25 degree) for the boundary and initial conditions, and was validated against observations of wind speed, direction and pressure. The typhoon-induced coastal inundation was simulated by an unstructured-grid model, the Finite Volume Community Ocean Model (FVCOM), which is a fully current-wave coupled model. To simulate the wave-induced inundation, a one-way downscaling technique over multiple domains was applied. Firstly, a mother domain including the Korean peninsula was

  2. Applying squeezing technique to clay-rocks: lessons learned from ten years experiments at Mont Terri

    International Nuclear Information System (INIS)

    Fernandez, A. M.; Melon, A.; Sanchez-Ledesma, D.M.; Tournassat, C.; Gaucher, E.; Astudillo, J.; Vinsot, A.

    2012-01-01

    Document available in extended abstract form only. Argillaceous formations of low permeability are considered in several countries as potential host rocks for the disposal of high-level radioactive waste (HLRW). In order to determine their suitability for waste disposal, evaluations of the hydro-geochemistry and of the transport mechanisms from such geologic formations to the biosphere must be undertaken. The migration of radionuclides through the geosphere will occur predominantly in the aqueous phase, and hence the pore water chemistry plays an important role in determining ion diffusion characteristics in argillaceous formations. Consequently, a great effort has been made to characterise the pore water chemistry in clay-rock formations. In the last 10 years, various techniques were developed for determining the pore water composition of clay-rocks, including both direct and indirect methods: 1) in situ pore water sampling (water and gas) from sealed boreholes (Pearson et al., 2003; Vinsot et al., 2008); 2) laboratory pore water sampling from unaltered core samples by the squeezing technique at high pressures (Fernandez et al., 2009); and 3) characterization of the water chemistry by geochemical modelling (Gaucher et al., 2009). Pore water chemistry in clay-rocks and extraction techniques have been documented and reviewed in different studies (Sacchi et al., 2001). Recovering pristine pore water from low-permeability and low-water-content systems is very difficult and sometimes impossible. Besides, uncertainties are associated with each method used for pore water characterization. In this paper, a review of the high-pressure squeezing technique applied to indurated clay-rocks is presented. For this purpose, the experimental work on Opalinus Clay at the Mont Terri Rock Laboratory during the last ten years was evaluated. A complete discussion is given of different issues, such as: a) why it is necessary to obtain the pore water by squeezing in the context of radioactive waste

  3. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents

    International Nuclear Information System (INIS)

    Teichgräber, Ulf K.; Bucourt, Maximilian de

    2012-01-01

    Objectives: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). Materials and methods: The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current status VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. Results: The current state VSM demonstrated that out of 13 processes for the procurement of stents only 2 processes were value-adding. Out of the NVA processes 5 processes were unnecessary NVA activities, which could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. Conclusion: VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System and greatly assists in successfully implementing a Lean system.

  4. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents.

    Science.gov (United States)

    Teichgräber, Ulf K; de Bucourt, Maximilian

    2012-01-01

    OBJECTIVES: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current status VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. The current state VSM demonstrated that out of 13 processes for the procurement of stents only 2 processes were value-adding. Out of the NVA processes 5 processes were unnecessary NVA activities, which could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System, and greatly assists in successfully implementing a Lean system.

  5. Vibroacoustic Modeling of Mechanically Coupled Structures: Artificial Spring Technique Applied to Light and Heavy Mediums

    Directory of Open Access Journals (Sweden)

    L. Cheng

    1996-01-01

    This article deals with the modeling of vibrating structures immersed in both light and heavy fluids, with possible applications to noise control problems and to industrial vessels containing fluids. A theoretical approach, using artificial spring systems to characterize the mechanical coupling between substructures, is extended to include fluid loading. A structure consisting of a plate-ended cylindrical shell and its enclosed acoustic cavity is analyzed. After a brief description of the proposed technique, a number of numerical results are presented. The analysis addresses the following specific issues: the coupling between the plate and the shell; the coupling between the structure and the enclosure; the possibilities and difficulties regarding internal soundproofing through modifications of the joint connections; and the effects of fluid loading on the vibration of the structure.
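
    The artificial spring idea, representing the joint between substructures as a tunable stiffness element, can be illustrated in miniature with two lumped masses coupled by a spring. This toy model (all values hypothetical) shows how the coupling stiffness k_c shifts the natural frequencies; the article itself treats continuous plate/shell substructures with fluid loading.

    ```python
    import numpy as np

    # Two substructures idealised as single-DOF oscillators (m, k), joined by
    # an artificial spring of stiffness k_c representing the mechanical coupling.
    m1, m2 = 1.0, 2.0        # kg (hypothetical values)
    k1, k2 = 1.0e4, 2.0e4    # N/m (hypothetical values)

    for k_c in (0.0, 1.0e3, 1.0e5):   # from uncoupled to a nearly rigid joint
        M = np.diag([m1, m2])
        K = np.array([[k1 + k_c, -k_c],
                      [-k_c, k2 + k_c]])
        # Generalised eigenproblem K v = w^2 M v, via eigenvalues of M^-1 K.
        w2 = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
        freqs = np.sqrt(w2) / (2.0 * np.pi)
        print(f"k_c = {k_c:8.0f} N/m -> natural frequencies {np.round(freqs, 1)} Hz")
    ```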

  6. The ring cycle: an iterative lens reconstruction technique applied to MG1131 + 0456

    International Nuclear Information System (INIS)

    Kochanek, C.S.; Blandford, R.D.; Lawrence, C.R.; Narayan, R.

    1989-01-01

    A new technique is described for the analysis of well-resolved gravitational lens images. This method allows us to solve for the brightness distribution of the unlensed source as well as a parametrized model of the lens. Our algorithm computes a figure of merit for a lens model based on the scatter in the surface brightnesses of image elements that, according to the model, come from the same source element. Minimization of the figure of merit leads to an optimum solution for the source and the lens. We present a successful application of the method to VLA maps of the 'Einstein ring' radio source MG1131 + 0456 observed by previous authors. The inversion gives a normal galaxy-like elliptical potential for the lens and an ordinary double-lobed structure for the background radio source. (author)
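
    The figure of merit described here, penalising scatter among image elements that map back to the same source element, can be sketched compactly. The code below assumes the lens model is already given as a mapping from image pixels to source-pixel indices; constructing that mapping (and the MG1131+0456 data) is of course the hard part and is not reproduced.

    ```python
    import numpy as np

    def ring_cycle_merit(image_brightness, source_index):
        """Sum, over source elements, of the brightness variance of the image
        elements that the lens model maps onto that source element."""
        merit = 0.0
        for s in np.unique(source_index):
            vals = image_brightness[source_index == s]
            if vals.size > 1:
                merit += np.sum((vals - vals.mean()) ** 2)
        return merit

    # Toy example: 6 image pixels, two candidate mappings onto 3 source pixels.
    brightness = np.array([1.0, 1.1, 0.9, 2.0, 2.1, 3.0])
    mapping_good = np.array([0, 0, 0, 1, 1, 2])   # groups similar brightnesses
    mapping_bad = np.array([0, 1, 2, 0, 1, 2])    # mixes dissimilar ones
    print(ring_cycle_merit(brightness, mapping_good))  # small -> better model
    print(ring_cycle_merit(brightness, mapping_bad))   # large -> worse model
    ```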

  7. Advanced examination techniques applied to the qualification of critical welds for the ITER correction coils

    CERN Document Server

    Sgobba, Stefano; Libeyre, Paul; Marcinek, Dawid Jaroslaw; Piguiet, Aline; Cécillon, Alexandre

    2015-01-01

    The ITER correction coils (CCs) consist of three sets of six coils located in between the toroidal (TF) and poloidal field (PF) magnets. The CCs rely on a Cable-in-Conduit Conductor (CICC), whose supercritical cooling at 4.5 K is provided by helium inlets and outlets. The assembly of the nozzles to the stainless steel conductor conduit includes fillet welds requiring full penetration through the thickness of the nozzle. Static and cyclic stresses have to be sustained by the inlet welds during operation. The entire volume of the helium inlet and outlet welds, which are subject to the most stringent quality levels for imperfections according to the standards in force, is virtually uninspectable with sufficient resolution by conventional or computed radiography or by ultrasonic testing. On the other hand, X-ray computed tomography (CT) was successfully applied to inspect the full weld volume of several dozen helium inlet qualification samples. The extensive use of CT techniques allowed significant progress in the ...

  8. Applying Toyota production system techniques for medication delivery: improving hospital safety and efficiency.

    Science.gov (United States)

    Newell, Terry L; Steinmetz-Malato, Laura L; Van Dyke, Deborah L

    2011-01-01

    The inpatient medication delivery system used at a large regional acute care hospital in the Midwest had become antiquated and inefficient. The existing 24-hr medication cart-fill exchange process with delivery to the patients' bedside did not always provide ordered medications to the nursing units when they were needed. In 2007 the principles of the Toyota Production System (TPS) were applied to the system. Project objectives were to improve medication safety and reduce the time needed for nurses to retrieve patient medications. A multidisciplinary team was formed that included representatives from nursing, pharmacy, informatics, quality, and various operational support departments. Team members were educated and trained in the tools and techniques of TPS, and then designed and implemented a new pull system benchmarking the TPS Ideal State model. The newly installed process, providing just-in-time medication availability, has measurably improved delivery processes as well as patient safety and satisfaction. Other positive outcomes have included improved nursing satisfaction, reduced nursing wait time for delivered medications, and improved efficiency in the pharmacy. After a successful pilot on two nursing units, the system is being extended to the rest of the hospital.

  9. The Study of Mining Activities and their Influences in the Almaden Region Applying Remote Sensing Techniques

    International Nuclear Information System (INIS)

    Rico, C.; Schmid, T.; Millan, R.; Gumuzzio, J.

    2010-01-01

    This scientific-technical report is part of the ongoing research work carried out by Celia Rico Fraile in order to obtain the Diploma of Advanced Studies as part of her PhD studies. This work has been developed in collaboration with the Faculty of Science at the Universidad Autonoma de Madrid and the Department of Environment at CIEMAT. The main objective of this work was the characterization and classification of land use in Almaden (Ciudad Real) during cinnabar mineral exploitation and after mining activities ceased in 2002, developing a methodology focused on the integration of remote sensing techniques applying multispectral and hyperspectral satellite data. By means of preprocessing and processing of the satellite images as well as data obtained from field campaigns, a spectral library was compiled in order to characterize representative land surfaces within the study area. Monitoring results show that the distribution of areas affected by mining activities has diminished rapidly in recent years. (Author) 130 refs

  10. Inverting travel times with a triplication. [spline fitting technique applied to lunar seismic data reduction

    Science.gov (United States)

    Jarosch, H. S.

    1982-01-01

    A method based on the use of constrained spline fits is used to overcome the difficulties arising when body-wave data in the form of T-delta are reduced to the tau-p form in the presence of cusps. In comparison with unconstrained spline fits, the method proposed here tends to produce much smoother models which lie approximately in the middle of the bounds produced by the extremal method. The method is noniterative and, therefore, computationally efficient. The method is applied to the lunar seismic data, where at least one triplication is presumed to occur in the P-wave travel-time curve. It is shown, however, that because of an insufficient number of data points for events close to the antipode of the center of the lunar network, the present analysis is not accurate enough to resolve the problem of a possible lunar core.
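
    To make the T-delta to tau-p reduction concrete, here is a hedged Python sketch that uses a smoothing spline from scipy as a stand-in for the constrained spline fits of the record; it handles only a single-valued branch of the travel-time curve (the constrained fits are what cope with cusps and triplications), and the synthetic travel-time data are invented.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# T-delta observations (delta in degrees, T in seconds) -- toy values
delta = np.linspace(5.0, 90.0, 40)
T = 60.0 * np.sqrt(delta)          # placeholder travel-time curve

# Smoothing spline as a stand-in for the constrained spline fit;
# the smoothing factor s controls how closely the data are honoured.
spl = UnivariateSpline(delta, T, k=3, s=1.0)

p = spl.derivative()(delta)        # ray parameter p = dT/d(delta)
tau = spl(delta) - p * delta       # tau(p) = T - p * delta

for pi, ti in list(zip(p, tau))[:5]:
    print(f"p = {pi:8.4f}  tau = {ti:8.2f}")
```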

  11. Different techniques of multispectral data analysis for vegetation fraction retrieval

    Science.gov (United States)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. In respect to farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables, and is indicative of the state of the growth process. At the same time it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and the various spectral estimators.
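
    Of the techniques listed, linear spectral unmixing is the most direct to sketch in code: a mixed pixel is modelled as a fraction-weighted sum of endmember spectra, and the fractions are recovered by least squares with a sum-to-one constraint. The band values and endmember spectra below are invented for illustration.

```python
import numpy as np

def unmix(pixel, endmembers, weight=1e3):
    """Linear spectral unmixing with a soft sum-to-one constraint.

    pixel      : (bands,) observed reflectance spectrum
    endmembers : (bands, m) matrix whose columns are endmember spectra
                 (e.g. pure soil and pure crop canopy)
    The sum-to-one constraint is enforced by appending a heavily
    weighted row of ones to the least-squares system.
    """
    A = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    b = np.append(pixel, weight)
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    return fractions

# Toy two-endmember example: 4 spectral bands, soil and vegetation
soil = np.array([0.30, 0.35, 0.40, 0.45])
veg  = np.array([0.05, 0.08, 0.06, 0.50])    # strong NIR reflectance
mixed = 0.3 * soil + 0.7 * veg               # 70% canopy cover
print(unmix(mixed, np.column_stack([soil, veg])))   # ~[0.3, 0.7]
```

    In practice a non-negativity constraint (e.g. non-negative least squares) would usually be added as well, so the fractions stay physically meaningful.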

  12. Diffusing wave spectroscopy applied to material analysis and process control

    International Nuclear Information System (INIS)

    Lloyd, Christopher James

    1997-01-01

    Diffusing Wave Spectroscopy (DWS) was studied as a method of laboratory analysis of sub-micron particles, and developed as a prospective in-line, industrial, process control sensor capable of near real-time feedback. No sample pre-treatment was required and measurement was via a non-invasive, flexible, dip-in probe. DWS relies on the concept of the diffusive migration of light, as opposed to the ballistic scatter model used in conventional dynamic light scattering. The specific requirements of the optoelectronic hardware, data analysis methods and light scattering model were studied experimentally and, where practical, theoretically, resulting in a novel technique for the analysis of particle suspensions and emulsions of volume fractions between 0.01 and 0.4. Operation at high concentrations made the technique insensitive to dust and contamination. The pure homodyne (autodyne) experimental arrangement described was resilient to environmental disturbances, unlike many other systems which utilise optical fibres or heterodyne operation. Pilot and subsequent prototype development led to a highly accurate method of size ranking, suitable for analysis of a wide range of suspensions and emulsions. The technique was shown to operate on real industrial samples with statistical variance as low as 0.3% with minimal software processing. Whilst the application studied was the analysis of TiO2 suspensions, a diverse range of materials including polystyrene beads, cell pastes and industrial cutting fluid emulsions were tested. Results suggest that, whilst all sizing should be comparative to suitable standards, concentration effects may be minimised and even completely modelled out in many applications. Adhesion to the optical probe was initially a significant problem but was minimised after the evaluation and use of suitable non-stick coating materials. Unexpected behaviour in the correlation in the region of short decay times led to consideration of the effects of rotational diffusion

  13. Coldness applied to plastic engineering techniques and rooms; Le froid applique aux techniques de la plasturgie et a ses locaux

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This technical dossier is the result of a collaboration between the CFE, EdF Industrie and the French federation of plastic engineering. It aims at answering all questions relative to plastic materials processing: 1 - general study on the economical aspects of plastic engineering, plastic materials, and manufacturing processes; 2 - the different cold processing techniques (air cooling and refrigerating systems); 3 - the main transformation processes for thermo-plastic materials and the advantage of cooling techniques; 4 - the environmental conditioning of rooms (clean rooms); 5 - examples of realizations. (J.S.)

  14. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of each target individual compound (>50 μgC). Yields of target compounds were high for n-alkanes from C14 up to C30, and approximately 80% for higher molecular weight compounds beyond C30 (up to C40). Compound-specific radiocarbon analysis of organic compounds, as well as compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in the marine system. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, which showed its possibility as a useful chronology tool for estimating the age of sediments from their organic matter in paleoceanographic studies, in areas where sufficient planktonic foraminifera for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  15. Analysis of concrete beams using applied element method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements, as in FEM. But in AEM, elements are connected by springs instead of nodes as in the case of FEM. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate the application of AEM, it is used to analyse a plain concrete beam with fixed supports. The analysis is limited to 2-dimensional structures. It was found that the number of springs does not have much influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.
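
    A hedged sketch of the central modelling idea, with all material and geometric values invented: each pair of adjacent elements is connected through pairs of normal and shear springs whose stiffnesses follow the commonly cited AEM relations kn = E·d·t/a and ks = G·d·t/a.

```python
# Minimal sketch of how AEM assigns stiffnesses to the normal and
# shear springs connecting two adjacent elements; the formulas follow
# the commonly cited AEM formulation, and all numbers are illustrative.
E = 25e9                 # Young's modulus of concrete, Pa
nu = 0.2                 # Poisson's ratio
G = E / (2 * (1 + nu))   # shear modulus, Pa

a = 0.10                 # element size (distance between centroids), m
t = 0.20                 # beam thickness, m
n_springs = 10           # spring pairs along the shared face
d = a / n_springs        # tributary width served by one spring pair

kn = E * d * t / a       # normal spring stiffness, N/m
ks = G * d * t / a       # shear spring stiffness, N/m
print(f"kn = {kn:.3e} N/m, ks = {ks:.3e} N/m")
```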

  16. Apply Functional Modelling to Consequence Analysis in Supervision Systems

    DEFF Research Database (Denmark)

    Zhang, Xinxin; Lind, Morten; Gola, Giulio

    2013-01-01

    This paper first presents the purpose and goals of applying a functional modelling approach to consequence analysis by adopting Multilevel Flow Modelling (MFM). MFM models describe a complex system at multiple abstraction levels in both the means-end dimension and the whole-part dimension, and contain causal relations between functions and goals. A rule-based system can be developed to trace the causal relations and perform consequence propagation. The paper illustrates how to use MFM for consequence reasoning by using rule-based technology, describes the challenges of integrating functional consequence analysis into practical or online applications in supervision systems, and suggests a multiagent solution as the integration architecture for developing tools to facilitate the utilization of the results of functional consequence analysis. Finally, a prototype of the multiagent reasoning system...
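
    The rule-based tracing of causal relations lends itself to a very small sketch: represent the causal relations as a directed graph and propagate consequences breadth-first. The event names and relations below are invented, not taken from an actual MFM model.

```python
from collections import deque

# Invented causal relations in the spirit of an MFM means-end model:
# each key causes the listed consequences.
causes = {
    "pump_flow_low":       ["tank_level_low"],
    "tank_level_low":      ["supply_pressure_low"],
    "supply_pressure_low": ["goal_cooling_lost"],
}

def propagate(root):
    """Breadth-first trace of all consequences reachable from a root event."""
    seen, queue, order = {root}, deque([root]), []
    while queue:
        event = queue.popleft()
        for consequence in causes.get(event, []):
            if consequence not in seen:
                seen.add(consequence)
                order.append(consequence)
                queue.append(consequence)
    return order

print(propagate("pump_flow_low"))
# ['tank_level_low', 'supply_pressure_low', 'goal_cooling_lost']
```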

  17. Analysis of Brick Masonry Wall using Applied Element Method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the case of the Finite Element Method (FEM). In AEM, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. Brick masonry walls can be effectively analysed in the framework of AEM. The composite nature of a masonry wall can be easily modelled using springs: the brick springs and mortar springs are assumed to be connected in series. The brick masonry wall is analysed and the failure load is determined for different loading cases. The results were used to find the aspect ratio of brick that best strengthens a brick masonry wall.
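
    The series connection of brick and mortar springs mentioned above reduces to the standard formula 1/k_eq = 1/k_brick + 1/k_mortar; a tiny sketch with invented stiffness values:

```python
# Sketch of the series combination of brick and mortar springs used to
# model the composite masonry joint; stiffness values are illustrative.
def series_stiffness(k_brick, k_mortar):
    """Equivalent stiffness of a brick spring and a mortar spring in series."""
    return 1.0 / (1.0 / k_brick + 1.0 / k_mortar)

k_brick  = 5.0e8   # N/m, stiffer unit
k_mortar = 1.0e8   # N/m, softer joint
print(f"k_eq = {series_stiffness(k_brick, k_mortar):.3e} N/m")
# The soft mortar spring dominates, as expected for springs in series.
```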

  18. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  19. Investigation about the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils

    Directory of Open Access Journals (Sweden)

    Adriano Pinto Mariano

    2009-10-01

    This work investigated the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils collected at three service stations. Batch biodegradation experiments were carried out in Bartha biometer flasks (250 mL) used to measure the microbial CO2 production. Biodegradation efficiency was also measured by quantifying the concentration of hydrocarbons. In addition to the biodegradation experiments, the capability of the studied cultures and the native microorganisms to biodegrade diesel oil purchased from a local service station was verified using a technique based on the redox indicator 2,6-dichlorophenol indophenol (DCPIP). Results obtained with this test showed that the inocula used in the biodegradation experiments were able to degrade the diesel oil, and the tests carried out with the native microorganisms indicated that these soils had a microbiota adapted to degrade the hydrocarbons. In general, no gain was obtained with the addition of microorganisms, and in some cases negative effects were observed in the biodegradation experiments.

  20. Prompt gamma cold neutron activation analysis applied to biological materials

    International Nuclear Information System (INIS)

    Rossbach, M.; Hiep, N.T.

    1992-01-01

    Cold neutrons at the external neutron guide laboratory (ELLA) of the KFA Juelich are used to demonstrate their profitable application for the multielement characterization of biological materials. The set-up and experimental conditions of the Prompt Gamma Cold Neutron Activation Analysis (PGCNAA) device are described in detail. Results for C, H, N, S, K, B, and Cd using synthetic standards and the 'ratio' technique for calculation are reported for several reference materials and prove the method to be reliable and complementary with respect to the elements determined by INAA. (orig.)

  1. A comparison of new, old and future densitometric techniques as applied to volcanologic study.

    Science.gov (United States)

    Pankhurst, Matthew; Moreland, William; Dobson, Kate; Þórðarson, Þorvaldur; Fitton, Godfrey; Lee, Peter

    2015-04-01

    The density of any material imposes a primary control upon its potential or actual physical behaviour in relation to its surroundings. It follows that a thorough understanding of the physical behaviour of dynamic, multi-component systems, such as active volcanoes, requires knowledge of the density of each component. If we are to accurately predict the physical behaviour of synthesized or natural volcanic systems, quantitative densitometric measurements are vital. The theoretical density of melt, crystal and bubble phases may be calculated using composition, structure, temperature and pressure inputs. However, measuring the density of natural, non-ideal, poly-phase materials remains problematic, especially if phase-specific measurement is important. Here we compare three methods: Archimedes principle, He-displacement pycnometry and X-ray micro computed tomography (XMT), and discuss the utility and drawbacks of each in the context of modern volcanologic study. We have measured tephra, ash and lava from the 934 AD Eldgjá eruption (Iceland) and the 2010 AD Eyjafjallajökull eruption (Iceland) using each technique. These samples exhibit a range of particle sizes, phases and textures. We find that while the Archimedes method remains a useful, low-cost technique to generate whole-rock density data, relative precision is problematic at small particle sizes. Pycnometry offers a more precise whole-rock density value, at a comparable cost per sample. However, this technique is based upon the assumption that pore spaces within the sample are equally available for gas exchange, which may or may not be the case. XMT produces 3D images, at resolutions from nm to tens of µm per voxel, where X-ray attenuation is a qualitative measure of relative electron density, expressed as greyscale number/brightness (usually 16-bit). Phases and individual particles can be digitally segmented according to their greyscale and other characteristics. This represents a distinct advantage over both

  2. Multidisciplinary Design Techniques Applied to Conceptual Aerospace Vehicle Design. Ph.D. Thesis Final Technical Report

    Science.gov (United States)

    Olds, John Robert; Walberg, Gerald D.

    1993-01-01

    Multidisciplinary design optimization (MDO) is an emerging discipline within aerospace engineering. Its goal is to bring structure and efficiency to the complex design process associated with advanced aerospace launch vehicles. Aerospace vehicles generally require input from a variety of traditional aerospace disciplines - aerodynamics, structures, performance, etc. As such, traditional optimization methods cannot always be applied. Several multidisciplinary techniques and methods were proposed as potentially applicable to this class of design problem. Among the candidate options are calculus-based (or gradient-based) optimization schemes and parametric schemes based on design-of-experiments theory. A brief overview of several applicable multidisciplinary design optimization methods is included. Methods from the calculus-based class and the parametric class are reviewed, but the research application reported focuses on methods from the parametric class. A vehicle of current interest was chosen as a test application for this research. The rocket-based combined-cycle (RBCC) single-stage-to-orbit (SSTO) launch vehicle combines elements of rocket and airbreathing propulsion in an attempt to produce an attractive option for launching medium-sized payloads into low earth orbit. The RBCC SSTO presents a particularly difficult problem for traditional one-variable-at-a-time optimization methods because of the lack of an adequate experience base and the highly coupled nature of the design variables. MDO, however, with its structured approach to design, is well suited to this problem. The results of the application of Taguchi methods, central composite designs, and response surface methods to the design optimization of the RBCC SSTO are presented. Attention is given to the aspect of Taguchi methods that attempts to locate a 'robust' design - that is, a design that is least sensitive to uncontrollable influences on the design. Near-optimum minimum dry weight solutions are
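
    As a hedged illustration of the parametric class of methods, the sketch below fits a quadratic response surface to a small face-centred central composite design; the "dry mass" function standing in for a vehicle analysis is invented, and in a real MDO study each sample point would come from the coupled disciplinary analyses.

```python
import numpy as np

def dry_mass(x1, x2):                 # invented stand-in for the analysis
    return 100 + 5*x1**2 + 3*x2**2 + 2*x1*x2 - 4*x1 + x2

# Face-centred central composite design in two normalized variables
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], float)
y = np.array([dry_mass(*p) for p in pts])

# Design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0]**2, pts[:, 1]**2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted response-surface coefficients:", np.round(coef, 3))
```

    The fitted surface can then be searched cheaply for a near-optimum (e.g. minimum dry weight) design, which is the appeal of the parametric approach when each true analysis is expensive.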

  3. Study for applying microwave power saturation technique on fingernail/EPR dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byeong Ryong; Choi, Hoon; Nam, Hyun Ill; Lee, Byung Ill [Radiation Health Research Institute, Seoul (Korea, Republic of)

    2012-10-15

    There is growing recognition worldwide of the need to develop effective dosimetry methods to assess unexpected exposure to radiation in the event of a large-scale event. One physically based dosimetry method, electron paramagnetic resonance (EPR) spectroscopy, has been applied to perform retrospective radiation dosimetry using extracted samples of tooth enamel and nail (fingernail and toenail) following radiation accidents and exposures resulting from weapon use, testing, and production. Human fingernails are composed largely of keratin, which consists of α-helical peptide chains that are twisted into a left-handed coil and strengthened by disulphide cross-links. Ionizing radiation generates free radicals in the keratin matrix, and these radicals are stable over a relatively long period (days to weeks). Most importantly, the number of radicals is proportional to the magnitude of the dose over a wide dose range (0-30 Gy). Also, dose can be estimated at four different locations on the human body, providing information on the homogeneity of the radiation exposure, and the results from EPR nail dosimetry are immediately available. However, a relatively large background signal (BKS), converted from the mechanically induced signal (MIS) after the cutting of the fingernail, normally overlaps with the radiation-induced signal (RIS) and makes it difficult to estimate the dose from an accidental exposure accurately. As a result, estimation methods using dose-response curves have had difficulty ensuring reliability below 5 Gy. In this study, in order to overcome these disadvantages, we measured the response of the RIS and BKS (MIS) as the microwave power level was varied, and investigated the applicability of the power saturation technique at low doses.

  4. Plasma-based techniques applied to the determination of metals and metalloids in atmospheric aerosols

    International Nuclear Information System (INIS)

    Smichowski, Patricia

    2011-01-01

    Full text: This lecture presents an overview of the research carried out by our group during the last decade on the determination of metals, metalloids, ions and species in atmospheric aerosols and related matrices using plasma-based techniques. In our first studies we explored the application of a size fractionation procedure and the subsequent determination of minor, major and trace elements in samples of deposited particles collected one day after the eruption of the Copahue Volcano, located on the Chile-Argentina border, to assess the content of relevant elements with respect to the environment and the health of the local population. We employed a multi-technique approach (ICP-MS, XRD and NAA) to gain complete information on the characteristics of the sample. In addition to the study of ashes emitted from natural sources we also studied ashes of anthropogenic origin, such as those arising from coal combustion in thermal power plants. For estimating the behavior and fate of elements in atmospheric particles and ashes we applied in this case a chemical fractionation procedure in order to establish the distribution of many elements amongst the soluble, bound-to-carbonates, bound-to-oxides, bound-to-organic-matter and environmentally immobile fractions. Studies on the air quality of the mega-city of Buenos Aires were scarce and fragmentary, and our objective was, and still is, to contribute to clarifying key issues related to levels of crustal, toxic and potentially toxic elements in this air basin. Our findings were compared with average concentrations of metals and metalloids reported for other Latin American cities such as Sao Paulo, Mexico City and Santiago de Chile. In this context, a series of studies has been carried out since 2004 considering different sampling strategies to reflect local aspects of air pollution sources. In recent years, our interest has focused on the levels of traffic-related elements in the urban atmosphere. We have contributed the first data

  5. Plasma-based techniques applied to the determination of metals and metalloids in atmospheric aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Smichowski, Patricia, E-mail: smichows@cnea.gov.ar [Comision Nacional de Energia Atomica, Gerencia Quimica, Pcia de Buenos Aires (Argentina)

    2011-07-01

    Full text: This lecture presents an overview of the research carried out by our group during the last decade on the determination of metals, metalloids, ions and species in atmospheric aerosols and related matrices using plasma-based techniques. In our first studies we explored the application of a size fractionation procedure and the subsequent determination of minor, major and trace elements in samples of deposited particles collected one day after the eruption of the Copahue Volcano, located on the Chile-Argentina border, to assess the content of relevant elements with respect to the environment and the health of the local population. We employed a multi-technique approach (ICP-MS, XRD and NAA) to gain complete information on the characteristics of the sample. In addition to the study of ashes emitted from natural sources we also studied ashes of anthropogenic origin, such as those arising from coal combustion in thermal power plants. For estimating the behavior and fate of elements in atmospheric particles and ashes we applied in this case a chemical fractionation procedure in order to establish the distribution of many elements amongst the soluble, bound-to-carbonates, bound-to-oxides, bound-to-organic-matter and environmentally immobile fractions. Studies on the air quality of the mega-city of Buenos Aires were scarce and fragmentary, and our objective was, and still is, to contribute to clarifying key issues related to levels of crustal, toxic and potentially toxic elements in this air basin. Our findings were compared with average concentrations of metals and metalloids reported for other Latin American cities such as Sao Paulo, Mexico City and Santiago de Chile. In this context, a series of studies has been carried out since 2004 considering different sampling strategies to reflect local aspects of air pollution sources. In recent years, our interest has focused on the levels of traffic-related elements in the urban atmosphere. We have contributed the first data

  6. Statistical analysis and Kalman filtering applied to nuclear materials accountancy

    International Nuclear Information System (INIS)

    Annibal, P.S.

    1990-08-01

    Much theoretical research has been carried out on the development of statistical methods for nuclear material accountancy. In practice, physical, financial and time constraints mean that the techniques must be adapted to give an optimal performance in plant conditions. This thesis aims to bridge the gap between theory and practice, to show the benefits to be gained from a knowledge of the facility operation. Four different aspects are considered; firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an 'accountancy tank' is investigated. Secondly, an analysis of the calibration data for the same tank is presented, establishing bounds for the error and suggesting a means of reducing them. Thirdly, a plant-specific method of producing an optimal statistic from the input, output and inventory data, to help decide between 'material loss' and 'no loss' hypotheses, is developed and compared with existing general techniques. Finally, an application of the Kalman Filter to materials accountancy is developed, to demonstrate the advantages of state-estimation techniques. The results of the analyses and comparisons illustrate the importance of taking into account a complete and accurate knowledge of the plant operation, measurement system, and calibration methods, to derive meaningful results from statistical tests on materials accountancy data, and to give a better understanding of critical random and systematic error sources. The analyses were carried out on the head-end of the Fast Reactor Reprocessing Plant, where fuel from the prototype fast reactor is cut up and dissolved. However, the techniques described are general in their application. (author)
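
    To illustrate the state-estimation idea in the accountancy setting, here is a minimal Kalman filter sketch that tracks a constant loss rate from a sequence of noisy material balances; the model, variances and data are invented for illustration and are far simpler than the plant-specific treatment described above.

```python
import numpy as np

# Synthetic material-balance (MUF) sequence: a constant loss per
# balance period buried in measurement noise. All numbers illustrative.
rng = np.random.default_rng(0)
true_loss = 0.5                                    # kg lost per period
muf = true_loss + rng.normal(0.0, 2.0, size=50)    # noisy balances

x, P = 0.0, 10.0      # loss estimate and its variance
Q, R = 1e-4, 4.0      # process and measurement noise variances
for z in muf:
    P += Q            # predict: loss rate assumed roughly constant
    K = P / (P + R)   # Kalman gain
    x += K * (z - x)  # update estimate with the new balance
    P *= (1.0 - K)
print(f"estimated loss per period: {x:.2f} kg")
```

    Comparing the filtered loss estimate (and its variance) against zero is one way to frame the 'material loss' versus 'no loss' decision the record mentions.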

  7. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Baek, Sung Ryel; Kim, Young Gi; Jung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun; Lim, Jong Myoung

    2003-05-01

    The aims of this project are to establish a quality control system for Neutron Activation Analysis (NAA), in response to increasing industrial demand for a standard analytical method, and to prepare and verify standard operating procedures for NAA through practical testing of different analytical items. The R and D implementation of the analytical quality system, using the neutron irradiation facility and gamma-ray measurement system, and the automation of the NAA facility in the HANARO research reactor are as follows: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of the utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrial applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology

  8. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications: now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  9. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take some requirements into account. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, due to the precious nature of artworks it is also necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiations and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composites or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportions of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  10. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    Science.gov (United States)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models and provided the necessary data to identify where design changes, process improvements and technology development were needed. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges which are prone to debonding at elevated temperatures and limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high speed cameras to measure three-dimensional, real-time displacements and strains was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, data collection, hot-fire test data collection and post-test analysis and results are presented in this paper.
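
    The core of a digital image correlation measurement can be sketched compactly: a speckle subset from the reference image is located in the deformed image by maximizing normalized cross-correlation. The integer-pixel search below on synthetic speckle is a minimal stand-in for a production 3D DIC system, which additionally needs stereo calibration and sub-pixel interpolation.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

def track_subset(ref, cur, y, x, half=10, search=5):
    """Integer-pixel displacement (dy, dx) of the subset centred at (y, x)."""
    tpl = ref[y-half:y+half+1, x-half:x+half+1]
    best, best_uv = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[y+dy-half:y+dy+half+1, x+dx-half:x+dx+half+1]
            score = ncc(tpl, win)
            if score > best:
                best, best_uv = score, (dy, dx)
    return best_uv

# Synthetic check: shift a random speckle image by (2, -3) pixels
rng = np.random.default_rng(1)
ref = rng.random((80, 80))
cur = np.roll(ref, shift=(2, -3), axis=(0, 1))
print(track_subset(ref, cur, 40, 40))   # expect (2, -3)
```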

  11. Imaging techniques applied to the study of fluids in porous media

    Energy Technology Data Exchange (ETDEWEB)

    Tomutsa, L.; Doughty, D.; Mahmood, S.; Brinkmeyer, A.; Madden, M.P.

    1991-01-01

    A detailed understanding of rock structure and its influence on fluid entrapment, storage capacity, and flow behavior can improve the effective utilization and design of methods to increase the recovery of oil and gas from petroleum reservoirs. The dynamics of fluid flow and trapping phenomena in porous media was investigated. Miscible and immiscible displacement experiments in heterogeneous Berea and Shannon sandstone samples were monitored using X-ray computed tomography (CT scanning) to determine the effect of heterogeneities on fluid flow and trapping. The statistical analysis of pore and pore throat sizes in thin sections cut from these sandstone samples enabled the delineation of small-scale spatial distributions of porosity and permeability. Multiphase displacement experiments were conducted with micromodels constructed using thin slabs of the sandstones. The combination of the CT scanning, thin section, and micromodel techniques enables the investigation of how variations in pore characteristics influence fluid front advancement, fluid distributions, and fluid trapping. Plugs cut from the sandstone samples were investigated using high resolution nuclear magnetic resonance imaging permitting the visualization of oil, water or both within individual pores. The application of these insights will aid in the proper interpretation of relative permeability, capillary pressure, and electrical resistivity data obtained from whole core studies. 7 refs., 14 figs., 2 tabs.

  12. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4.0 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  13. Applying Geospatial Techniques to Investigate Boundary Layer Land-Atmosphere Interactions Involved in Tornadogensis

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Knupp, K. R.; Molthan, A.; Coleman, T.

    2017-12-01

    Northern Alabama is among the most tornado-prone regions in the United States. This region has a higher degree of spatial variability in both terrain and land cover than the more frequently studied North American Great Plains region due to its proximity to the southern Appalachian Mountains and Cumberland Plateau. More research is needed to understand North Alabama's high tornado frequency and how land surface heterogeneity influences tornadogenesis in the boundary layer. Several modeling and simulation studies stretching back to the 1970s have found that variations in the land surface induce tornadic-like flow near the surface, illustrating a need for further investigation. This presentation introduces research investigating the hypothesis that horizontal gradients in land surface roughness, normal to the direction of flow in the boundary layer, induce vertically oriented vorticity at the surface that can potentially aid in tornadogenesis. A novel approach was implemented to test this hypothesis using a GIS-based quadrant pattern analysis method, developed to quantify spatial relationships and patterns between horizontal variations in land surface roughness and locations of tornadogenesis. Land surface roughness was modeled using the Noah land surface model parameterization scheme, which was applied to MODIS 500 m and Landsat 30 m data in order to compare the relationship between tornadogenesis locations and roughness gradients at different spatial scales. The analysis found a statistical relationship, supporting the tested hypothesis, between tornadogenesis locations and areas of higher roughness located normal to the surrounding flow. In this presentation, the innovative use of satellite remote sensing data and GIS technologies to address interactions between the land and atmosphere will be highlighted.

  14. Characteristic vector analysis of inflection ratio spectra: New technique for analysis of ocean color data

    Science.gov (United States)

    Grew, G. W.

    1985-01-01

    Characteristic vector analysis applied to inflection ratio spectra is a new approach to analyzing spectral data. Applied to remote data collected with the multichannel ocean color sensor (MOCS), a passive sensor, the technique simultaneously maps the distribution of two different phytopigments, chlorophyll a and phycoerythrin, in the ocean. The data set presented is from a series of warm core ring missions conducted during 1982. The data compare favorably with a theoretical model and with data collected on the same missions by an active sensor, the airborne oceanographic lidar (AOL).

  15. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis technique compared with other detection methods; EDXRF spectrometry is therefore applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological function, so the determination of the Ca and K content of various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods served as a cross-check of the analytical results and as a way to overcome the limitations of the three methods. Analysis results showed that Ca concentrations found in food using EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
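
    The statistics quoted above (two-sample p-values and Pearson correlations) are straightforward to reproduce; the following sketch uses scipy.stats on invented paired Ca results, not the study's data.

```python
import numpy as np
from scipy import stats

# Invented paired Ca results from two methods, mg per 100 g
edxrf = np.array([812, 455, 1210, 990, 640], float)
aas   = np.array([820, 450, 1195, 1002, 633], float)

t, p = stats.ttest_ind(edxrf, aas)   # test for a significant difference
r, _ = stats.pearsonr(edxrf, aas)    # agreement between the methods
print(f"p-value = {p:.4f}, Pearson r = {r:.4f}")
# A large p-value and r close to 1 would support using EDXRF as an
# alternative method, as concluded in the record above.
```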

  16. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategy analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) determination of natural groups or clusters of control strategies with a similar behaviour, ii) discovery and interpretation of hidden, complex and causal relational features in the data set, and iii) identification of important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
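
    A hedged sketch of the first two steps of such a workflow (cluster analysis on a standardized evaluation matrix, then PCA): the strategies-by-criteria matrix here is random stand-in data rather than BSM2 simulation output.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Stand-in evaluation matrix: 12 control strategies x 5 criteria
rng = np.random.default_rng(42)
X = rng.random((12, 5))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each criterion

# Cluster analysis: Ward linkage, cut into (at most) 3 clusters
clusters = fcluster(linkage(Xs, method="ward"), t=3, criterion="maxclust")
print("cluster labels:", clusters)

# PCA via SVD: explained-variance ratio of each principal component
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("explained variance ratios:", np.round(explained, 2))
```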

  17. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
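
    Once each pixel has been classified from its EDS spectrum, the modal computation itself reduces to pixel counting; the following sketch uses a synthetic phase map and illustrative phase names to show the idea.

```python
import numpy as np

# Synthetic phase map: each pixel labelled with a phase index, as if
# classified from its EDS spectrum. Phases and proportions invented.
phases = ["orthopyroxene", "olivine", "spinel", "epoxy"]
rng = np.random.default_rng(3)
phase_map = rng.choice(len(phases), size=(512, 512),
                       p=[0.85, 0.08, 0.02, 0.05])

counts = np.bincount(phase_map.ravel(), minlength=len(phases))
grain_total = counts[:3].sum()          # exclude the mounting epoxy
for name, n in zip(phases[:3], counts[:3]):
    print(f"{name:14s} {100.0 * n / grain_total:5.1f} %")
```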

  18. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first and/or second order terms of the Taylor series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of keff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. This technique can also offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response

  19. Applying data mining techniques to medical time series: an empirical case study in electroencephalography and stabilometry

    Directory of Open Access Journals (Sweden)

    A. Anguera

    2016-01-01

    This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.

  20. Data Analysis Techniques for Physical Scientists

    Science.gov (United States)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  1. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy-based techniques.

  2. Final Aperture Superposition Technique applied to fast calculation of electron output factors and depth dose curves

    International Nuclear Information System (INIS)

    Faddegon, B.A.; Villarreal-Barajas, J.E.

    2005-01-01

    The Final Aperture Superposition Technique (FAST) is described and applied to accurate, near-instantaneous calculation of the relative output factor (ROF) and central axis percentage depth dose curve (PDD) for clinical electron beams used in radiotherapy. FAST is based on precalculation of dose at select points for the two extreme situations of a fully open final aperture and a final aperture with no opening (fully shielded). This technique is different from conventional superposition of dose deposition kernels: the precalculated dose is differential in the position of the electron or photon at the downstream surface of the insert. The calculation for a particular aperture (x-ray jaws or MLC, insert in electron applicator) is done by superposition of the precalculated dose data, using the open-field data over the open part of the aperture and the fully shielded data over the remainder. The calculation takes explicit account of all interactions in the shielded region of the aperture except the collimator effect: particles that pass from the open part into the shielded part, or vice versa. For the clinical demonstration, FAST was compared to full Monte Carlo simulation of 10×10, 2.5×2.5, and 2×8 cm² inserts. Dose was calculated to 0.5% precision in 0.4×0.4×0.2 cm³ voxels, spaced at 0.2 cm depth intervals along the central axis, using detailed Monte Carlo simulation of the treatment head of a commercial linear accelerator for six different electron beams with energies of 6-21 MeV. Each simulation took several hours on a personal computer with a 1.7 GHz processor. The calculation for the individual inserts, done with superposition, was completed in under a second on the same PC. Since simulations for the precalculation are only performed once, higher precision and resolution can be obtained without increasing the calculation time for individual inserts. Fully shielded contributions were largest for small fields and high beam energy, at the surface, reaching a maximum

  3. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  4. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Data Mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information which may be essential to an organization, with the extraction of new information predicted from the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but few efforts have been made in the field of criminology, and fewer still have compared the information these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and involvement in criminal activity in society. Criminals can also be predicted based on the crime data. The main aim of this work is to perform a survey of the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey of crime analysis and crime prediction using several data mining techniques.

  5. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs

  6. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  7. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  8. Applying Authentic Data Analysis in Learning Earth Atmosphere

    Science.gov (United States)

    Johan, H.; Suhandi, A.; Samsudin, A.; Wulan, A. R.

    2017-09-01

    The aim of this research was to develop earth science learning material, especially on the earth's atmosphere, supported by science research with authentic data analysis to enhance reasoning. Various earth and space science phenomena require reasoning. This research used an experimental approach with a one-group pretest-posttest design. 23 pre-service physics teachers participated in this research. An essay test was conducted to obtain data about reasoning ability, and the essay test was analyzed quantitatively. An observation sheet was used to capture phenomena during the learning process. The results showed that students' reasoning ability improved from unidentified and no reasoning to evidence-based reasoning and inductive/deductive rule-based reasoning. Authentic data were examined using the Grid Analysis Display System (GrADS). Visualization from GrADS facilitated students in correlating the concepts and brought the real conditions of nature into classroom activity. It also helped students to reason about phenomena related to earth and space science concepts. It can be concluded that applying authentic data analysis in the learning process can help to enhance students' reasoning. This study is expected to help lecturers bring the results of geoscience research into the learning process and facilitate students' understanding of concepts.

  9. Two-dimensional DFA scaling analysis applied to encrypted images

    Science.gov (United States)

    Vargas-Olmos, C.; Murguía, J. S.; Ramírez-Torres, M. T.; Mejía Carlos, M.; Rosu, H. C.; González-Aguilar, H.

    2015-01-01

    The technique of detrended fluctuation analysis (DFA) has been widely used to unveil scaling properties of many different signals. In this paper, we determine scaling properties in the encrypted images by means of a two-dimensional DFA approach. To carry out the image encryption, we use an enhanced cryptosystem based on a rule-90 cellular automaton and we compare the results obtained with its unmodified version and the encryption system AES. The numerical results show that the encrypted images present a persistent behavior which is close to that of the 1/f-noise. These results point to the possibility that the DFA scaling exponent can be used to measure the quality of the encrypted image content.
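
    For readers who want to experiment, a compact and deliberately simplified sketch of two-dimensional DFA follows: the image is partitioned into s×s boxes, the cumulative-sum profile of each box is detrended with a least-squares plane, and the scaling exponent is the log-log slope of the RMS fluctuation F(s) versus s. The uniform-noise input is a stand-in for an encrypted image.

```python
import numpy as np

def dfa2d_exponent(img, scales):
    """Simplified 2-D DFA: detrend cumulative-sum profiles of s x s
    boxes with least-squares planes; regress log F(s) on log s."""
    Fs = []
    for s in scales:
        ny, nx = img.shape[0] // s, img.shape[1] // s
        yy, xx = np.mgrid[0:s, 0:s]
        A = np.column_stack([np.ones(s * s), yy.ravel(), xx.ravel()])
        box_vars = []
        for i in range(ny):
            for j in range(nx):
                box = img[i*s:(i+1)*s, j*s:(j+1)*s]
                # cumulative-sum "profile" of the box, then plane detrending
                profile = np.cumsum(np.cumsum(box, axis=0), axis=1).ravel()
                coef, *_ = np.linalg.lstsq(A, profile, rcond=None)
                box_vars.append(np.mean((profile - A @ coef) ** 2))
        Fs.append(np.sqrt(np.mean(box_vars)))
    slope, _ = np.polyfit(np.log(scales), np.log(Fs), 1)
    return slope

rng = np.random.default_rng(0)
image = rng.random((256, 256))      # stand-in for an encrypted image
print(f"scaling exponent ~ {dfa2d_exponent(image, [8, 16, 32, 64]):.2f}")
```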

  10. Neutron activation analysis applied to nutritional and foodstuff studies

    International Nuclear Information System (INIS)

    Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de; Avegliano, Roseane P.

    2009-01-01

    Neutron Activation Analysis, NAA, has been successfully used on a regular basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages: high accuracy, small sample quantities, and no need for chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition which have been carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP will be presented: a Brazilian total diet study on the nutritional element dietary intakes of the Sao Paulo state population; a study of trace elements in maternal milk; and the determination of essential trace elements in some edible mushrooms. (author)

  11. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  12. Quantitative thoracic CT techniques in adults: can they be applied in the pediatric population?

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Soon Ho [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Goo, Jin Mo [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University College of Medicine, Cancer Research Institute, Jongno-gu, Seoul (Korea, Republic of); Goo, Hyun Woo [University of Ulsan College of Medicine, Department of Radiology and Research Institute of Radiology, Asan Medical Center, Seoul (Korea, Republic of)

    2013-03-15

    With the rapid evolution of the multidetector row CT technique, quantitative CT has started to be used in clinical studies for revealing a heterogeneous entity of airflow limitation in chronic obstructive pulmonary disease that is caused by a combination of lung parenchymal destruction and remodeling of the small airways in adults. There is growing evidence of a good correlation between quantitative CT findings and pathological findings, pulmonary function test results and other clinical parameters. This article provides an overview of current quantitative thoracic CT techniques used in adults, and how to translate these CT techniques to the pediatric population. (orig.)

  13. Applied measuring techniques for the investigation of time-dependent flow phenomena in centrifugal compressors

    International Nuclear Information System (INIS)

    Hass, U.; Haupt, U.; Jansen, M.; Kassens, K.; Knapp, P.; Rautenberg, M.

    1978-01-01

    During the past 10 years new measuring techniques have been developed for the experimental investigation of highly loaded centrifugal compressors. These measuring techniques take into account the time dependency of fluctuating physical quantities such as pressure, temperature, and velocity. Some key points of these experimental techniques are shown and explained in this paper. An important basis for such measurements is the accurate dynamic calibration of the measuring apparatus. In addition, some problems involved in analyzing measured signals are dealt with, and pressure measurements and their interpretation are shown. Finally, optical, acoustical and vibrational measuring procedures are described, which are additionally used for the investigation of non-stationary flow phenomena. (orig.) [de

  14. Neutron Filter Technique and its use for Fundamental and applied Investigations

    International Nuclear Information System (INIS)

    Gritzay, V.; Kolotyi, V.

    2008-01-01

    At the Kyiv Research Reactor (KRR) the neutron filtered beam technique has been used for more than 30 years and its development continues; the new and upgraded facilities for neutron cross-section measurements deliver neutron cross sections with rather high accuracy: total neutron cross sections to 1% or better, and neutron scattering cross sections to 3-6%. The main purpose of this paper is the presentation of the neutron measurement techniques developed at KRR, and the demonstration of some experimental results obtained using these techniques

  15. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges in meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  16. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...
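
    Since the record cites optimal filtering as the workhorse for transient extraction, here is a minimal matched-filter sketch in Python, assuming white noise (real gravitational-wave pipelines first whiten the data with the detector noise spectrum); the chirp template and injection point are invented for illustration.

      import numpy as np

      def matched_filter(data, template):
          """Slide a unit-norm template along the data; peaks in the output
          mark candidate signal times. Assumes white noise."""
          template = template / np.linalg.norm(template)
          return np.correlate(data, template, mode="valid")

      t = np.linspace(0, 1, 4096)
      template = np.sin(2 * np.pi * (50 + 200 * t) * t) * np.exp(-4 * (t - 0.5) ** 2)
      rng = np.random.default_rng(1)
      data = rng.standard_normal(16384)
      data[6000:6000 + template.size] += 0.5 * template   # buried chirp
      snr = matched_filter(data, template)
      print("peak at sample", snr.argmax())               # close to 6000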

  17. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
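
    The Debye scattering equation mentioned above is compact enough to sketch. The following Python fragment (using numpy and scipy) evaluates I(q) = N + 2 sum_{i<j} sin(q r_ij)/(q r_ij) for a monoatomic cluster with the atomic form factor set to 1, a simplification (real analyses use element- and q-dependent form factors); the toy fcc cluster and q grid are illustrative, not the authors' models.

      import numpy as np
      from scipy.spatial.distance import pdist

      def debye_intensity(positions, q):
          """Debye equation, I(q) = N + 2*sum_{i<j} sin(q*r_ij)/(q*r_ij),
          for a monoatomic cluster with unit form factor."""
          r = pdist(positions)                  # all interatomic distances
          qr = np.outer(q, r)
          return positions.shape[0] + 2 * np.sinc(qr / np.pi).sum(axis=1)

      # Toy fcc cluster with the gold lattice constant (4.08 angstrom)
      a, n = 4.08, 4
      base = np.array([[0, 0, 0], [.5, .5, 0], [.5, 0, .5], [0, .5, .5]])
      cells = np.array([[i, j, k] for i in range(n)
                        for j in range(n) for k in range(n)])
      pos = a * (cells[:, None, :] + base[None, :, :]).reshape(-1, 3)
      q = np.linspace(0.5, 8.0, 400)            # scattering vector (1/angstrom)
      intensity = debye_intensity(pos, q)       # compare candidate models to data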

  18. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as the Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross-correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of the interpolation and Gaussian kernel methods by applying both to paleo-data from four locations reflecting late Holocene Asian monsoon variability, as derived from speleothem δ18O measurements. Cross-correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator, with significant advantages compared to other techniques, and is suitable for large-scale application to paleo-data.
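
    To make the kernel idea concrete, here is a minimal sketch of a Gaussian-kernel cross-correlation estimator for irregularly sampled series; the estimator form and bandwidth follow the general kernel approach the record describes, not necessarily the paper's exact normalization, and the test series are synthetic.

      import numpy as np

      def gaussian_kernel_ccf(tx, x, ty, y, lags, h):
          """Cross-correlation at given lags for irregularly sampled series:
          every observation pair is weighted by a Gaussian kernel of the
          mismatch between its time separation and the requested lag."""
          x = (x - x.mean()) / x.std()
          y = (y - y.mean()) / y.std()
          dt = ty[None, :] - tx[:, None]       # pairwise time differences
          prod = x[:, None] * y[None, :]       # pairwise products
          out = []
          for k in lags:
              w = np.exp(-0.5 * ((dt - k) / h) ** 2)
              out.append((w * prod).sum() / w.sum())
          return np.array(out)

      # Irregularly sampled sine pair where y lags x by ~2 time units
      rng = np.random.default_rng(2)
      tx = np.sort(rng.uniform(0, 100, 300))
      ty = np.sort(rng.uniform(0, 100, 300))
      x = np.sin(0.3 * tx) + 0.2 * rng.standard_normal(300)
      y = np.sin(0.3 * (ty - 2)) + 0.2 * rng.standard_normal(300)
      lags = np.linspace(-10, 10, 41)
      print(lags[gaussian_kernel_ccf(tx, x, ty, y, lags, h=1.0).argmax()])  # ~2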

  19. Applying of Reliability Techniques and Expert Systems in Management of Radioactive Accidents

    International Nuclear Information System (INIS)

    Aldaihan, S.; Alhbaib, A.; Alrushudi, S.; Karazaitri, C.

    1998-01-01

    Accidents involving radioactive exposure vary widely in nature and scale. This makes such accidents complex situations to be handled by radiation protection agencies or any responsible authority. The situation becomes worse with the introduction of advanced technology of high complexity that presents the operator with huge amounts of information about the system being operated. This paper discusses the application of reliability techniques in radioactive risk management. The event tree technique from the nuclear field is described, as well as two other techniques from non-nuclear fields: Hazard and Operability studies and Quality Function Deployment. The objective is to show the importance and the applicability of these techniques in radiation risk management. Finally, expert systems in the field of accident management are explored and classified according to their applications

  20. Analysis of Defective Pipings in Nuclear Power Plants and Applications of Guided Ultrasonic Wave Techniques

    International Nuclear Information System (INIS)

    Koo, Dae Seo; Cheong, Yong Moo; Jung, Hyun Kyu; Park, Chi Seung; Park, Jae Suck; Choi, H. R.; Jung, S. S.

    2006-07-01

    In order to apply guided ultrasonic techniques to the pipes in nuclear power plants, cases of defective pipes in nuclear power plants were investigated. It was confirmed that geometric factors of the pipes, such as location, shape, and allowable space, complicate the application of guided ultrasonic techniques to pipes of nuclear power plants. The condition of the pipes and supports, the signal analysis of weldments/defects, and the acquisition of accurate defect signals also make it difficult to apply guided ultrasonic techniques to pipes of nuclear power plants. Thus, a piping mock-up representing the pipes in nuclear power plants was designed and fabricated. Artificial flaws will be fabricated on the piping mock-up, and the guided ultrasonic wave signals from these artificial flaws will be analyzed. The guided ultrasonic techniques will then be applied to the inspection of pipes of nuclear power plants on the basis of the signal analysis of the artificial flaws in the piping mock-up

  1. Multivariate calibration applied to the quantitative analysis of infrared spectra

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
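
    As a hedged illustration of the calibration methods compared above, the following scikit-learn sketch fits PLS and PCR models to synthetic two-band "spectra"; it is not the report's implementation, and the band shapes, noise level and component counts are invented.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      # Synthetic "spectra": two overlapping bands whose amplitudes depend
      # on the analyte concentration, plus measurement noise.
      rng = np.random.default_rng(3)
      wl = np.linspace(0, 1, 200)
      c = rng.uniform(0, 1, 60)                       # concentrations
      band1 = np.exp(-((wl - 0.3) / 0.05) ** 2)
      band2 = np.exp(-((wl - 0.5) / 0.08) ** 2)
      X = np.outer(c, band1) + np.outer(0.5 * c + 0.2, band2)
      X += 0.01 * rng.standard_normal(X.shape)

      pls = PLSRegression(n_components=2).fit(X, c)
      pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, c)
      print("PLS R^2:", pls.score(X, c), "PCR R^2:", pcr.score(X, c))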

  2. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    Directory of Open Access Journals (Sweden)

    Árpád Gyéresi

    2013-02-01

    Full Text Available Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution in concentrations above their critical micellar concentration; consequently, micelles are formed, and these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.

  3. Applying machine learning and image feature extraction techniques to the problem of cerebral aneurysm rupture

    Directory of Open Access Journals (Sweden)

    Steren Chabert

    2017-01-01

    Full Text Available Cerebral aneurysm is a cerebrovascular disorder characterized by a bulging in a weak area in the wall of an artery that supplies blood to the brain. It is relevant to understand the mechanisms leading to the apparition of aneurysms, their growth and, more importantly, their rupture. The purpose of this study is to assess the impact on aneurysm rupture of the combination of different parameters, instead of focusing on only one factor at a time as is frequently found in the literature, using machine learning and feature extraction techniques. This discussion takes relevance in the context of the complex decision that physicians have to take to decide which therapy to apply, as each intervention bears its own risks and implies the use of a complex ensemble of resources (human resources, operating rooms, etc.) in hospitals always under very high workload. This project has been raised in our working team, composed of interventional neuroradiologists, radiologic technologists, informatics engineers and biomedical engineers, from the Valparaiso public hospital, Hospital Carlos van Buren, and from Universidad de Valparaíso - Facultad de Ingeniería and Facultad de Medicina. This team has been working together over the last few years, and is now participating in the implementation of an "interdisciplinary platform for innovation in health", as part of a bigger project led by Universidad de Valparaiso (PMI UVA1402). It is relevant to emphasize that this project is made feasible by the existence of this network between physicians and engineers, and by the existence of data already registered in an orderly manner, structured and recorded in digital format. The present proposal arises from the description in the current literature that the existing indicators, whether based on morphological description of the aneurysm or on characterization of biomechanical or other factors, were shown not to provide sufficient information in order

  4. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via Internet is a great security threat, so studying their behavior is important to identify and classify them. Using SSDT hooking we can obtain malware behavior by running it in a controlled environment and capturing interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be used to compare them with other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.
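
    A minimal sketch of the conversion from an event chain to an activity graph, using networkx; the (subject, action, object) event format is hypothetical and stands in for whatever the SSDT-hooking monitor actually records.

      import networkx as nx

      # Hypothetical event chain as might be captured via SSDT hooking
      # in a sandbox: (subject, action, object) triples.
      events = [
          ("malware.exe", "create_file",    "C:\\temp\\payload.dll"),
          ("malware.exe", "set_registry",   "HKLM\\Software\\Run\\updater"),
          ("malware.exe", "create_process", "svchost.exe"),
          ("svchost.exe", "connect",        "198.51.100.7:443"),
      ]

      G = nx.DiGraph()
      for subject, action, obj in events:
          G.add_edge(subject, obj, action=action)   # edge labeled by activity

      # The graph can then be laid out for visual inspection or compared
      # with graphs of known samples (e.g., via shared subgraphs).
      for u, v, d in G.edges(data=True):
          print(f"{u} --{d['action']}--> {v}")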

  5. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    (Fragmentary abstract: inverse filtering may be carried out in the time domain or in the frequency domain; the application of computers to speech analysis led to important elaborations; linear prediction, a tool for the estimation of formant trajectories, in effect determines the filter.)

  6. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  7. Applying Conjoint Analysis to Study Attitudes of Thai Government Organisations

    Directory of Open Access Journals (Sweden)

    Natee Suriyanon

    2012-11-01

    Full Text Available This article presents the application of choice-based conjoint analysis to analyse the attitude of Thai government organisations towards the restriction of the contractor's right to claim compensation for unfavourable effects from undesirable events. The analysis reveals that the organisations want to restrict only 6 out of 14 types of the claiming rights that were studied. The right that they want to restrict most is the right to claim for additional direct costs due to force majeure. They are willing to pay between 0.087% - 0.210% of the total project direct cost for restricting each type of contractor right. The total additional cost for restricting all six types of rights that the organisations are willing to pay is 0.882%. The last section of this article applies the knowledge gained from a choice-based conjoint analysis experiment to the analysis of the standard contract of the Thai government. The analysis reveals three types of rights where Thai government organisations are willing to forego restrictions, but the present standard contract does not grant such rights.

  8. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary format of the original message content in order to obtain the information it contains. This requires following the protocol specifications of the TCP/IP stack to restore the format and content of the data packets at each protocol layer: the actual data transferred as well as the application tier.
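
    To illustrate the layer-by-layer restoration the record describes, here is a minimal Python sketch that walks an Ethernet/IPv4/TCP frame according to the protocol specifications; it is a simplified stand-in, not Snort's actual decoder (no VLAN tags, no IPv6, no TCP option parsing).

      import struct

      def parse_headers(frame: bytes):
          """Restore an Ethernet/IPv4/TCP frame layer by layer, following
          the protocol specifications (simplified)."""
          (eth_type,) = struct.unpack("!H", frame[12:14])
          if eth_type != 0x0800:                     # not IPv4
              return None
          ip = frame[14:]
          ihl = (ip[0] & 0x0F) * 4                   # IPv4 header length in bytes
          info = {"src": ".".join(map(str, ip[12:16])),
                  "dst": ".".join(map(str, ip[16:20])),
                  "proto": ip[9]}
          if info["proto"] == 6:                     # TCP
              info["sport"], info["dport"] = struct.unpack("!HH", ip[ihl:ihl + 4])
          return info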

  9. Water spray cooling technique applied on a photovoltaic panel: The performance response

    International Nuclear Information System (INIS)

    Nižetić, S.; Čoko, D.; Yadav, A.; Grubišić-Čabo, F.

    2016-01-01

    Highlights: • An experimental study was conducted on a monocrystalline photovoltaic panel (PV). • A water spray cooling technique was implemented to determine PV panel response. • The experimental results showed favorable cooling effect on the panel performance. • A feasibility aspect of the water spray cooling technique was also proven. - Abstract: This paper presents an alternative cooling technique for photovoltaic (PV) panels that includes a water spray application over panel surfaces, alternative in the sense that both sides of the PV panel were cooled simultaneously, to investigate the total water spray cooling effect on the PV panel performance in circumstances of peak solar irradiation levels. A specific experimental setup was elaborated in detail and the developed cooling system for the PV panel was tested in a geographical location with a typical Mediterranean climate. The experimental result shows that it is possible to achieve a maximal total increase of 16.3% (effective 7.7%) in electric power output and a total increase of 14.1% (effective 5.9%) in PV panel electrical efficiency by using the proposed cooling technique in circumstances of peak solar irradiation. Furthermore, it was also possible to decrease the panel temperature from an average of 54 °C (non-cooled PV panel) to 24 °C in the case of simultaneous front and backside PV panel cooling. Economic feasibility was also determined for the proposed water spray cooling technique, whose main additional advantage concerns the PV panel's surface: its self-cleaning effect acts as a booster to the average delivered electricity.

  10. Performance values for non destructive assay (NDA) techniques applied to safeguards: the 2002 evaluation by the ESARDA NDA Working Group

    International Nuclear Information System (INIS)

    Guardini, S.

    2003-01-01

    The first evaluation of NDA performance values undertaken by the ESARDA Working Group for Standards and Non Destructive Assay Techniques (WGNDA) was published in 1993. Almost 10 years later the Working Group decided to review those values, to report about improvements and to issue new performance values for techniques which were not applied in the early nineties, or were at that time only emerging. Non-Destructive Assay techniques have become more and more important in recent years, and they are used to a large extent in nuclear material accountancy and control both by operators and control authorities. As a consequence, the performance evaluation for NDA techniques is of particular relevance to safeguards authorities in optimising Safeguards operations and reducing costs. Performance values are important also for NMAC regulators, to define detection levels, limits for anomalies, goal quantities and to negotiate basic audit rules. This paper presents the latest evaluation of ESARDA Performance Values (EPVs) for the most common NDA techniques currently used for the assay of nuclear materials for Safeguards purposes. The main topics covered by the document are: techniques for plutonium-bearing materials: PuO2 and MOX; techniques for U-bearing materials; techniques for U and Pu in liquid form; techniques for spent fuel assay. This issue of the performance values is the result of specific international round robin exercises, field measurements and ad hoc experiments, evaluated and discussed in the ESARDA NDA Working Group. (author)

  11. Synchrotron and Simulations Techniques Applied to Problems in Materials Science: Catalysts and Azul Maya Pigments

    International Nuclear Information System (INIS)

    Chianelli, R.

    2005-01-01

    Development of synchrotron techniques for the determination of the structure of disordered, amorphous and surface materials has exploded over the past twenty years due to the increasing availability of high flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists through interaction with effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory (SSRL). In this article we review the application of multiple synchrotron characterization techniques to two classes of materials defined as "surface compounds." One class of surface compounds are materials like MoS2-xCx that are widely used petroleum catalysts, used to improve the environmental properties of transportation fuels. These compounds may be viewed as "sulfide supported carbides" in their catalytically active states. The second class of "surface compounds" is the "Maya Blue" pigments that are based on technology created by the ancient Maya. These compounds are organic/inorganic "surface complexes" consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques as described in this report

  12. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

    Full Text Available Background: When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results: The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions: The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.

  13. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)
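
    The minimal cut-set identification step mentioned above can be sketched compactly. The following Python fragment expands a small AND/OR fault tree top-down and prunes non-minimal cut sets; the gate and event names are hypothetical, and real packages layer probability quantification and truncation on top of this step.

      from itertools import product

      def cut_sets(gate, tree):
          """Top-down expansion of an AND/OR fault tree into cut sets."""
          if gate not in tree:                 # basic event
              return [frozenset([gate])]
          kind, children = tree[gate]
          expanded = [cut_sets(c, tree) for c in children]
          if kind == "OR":                     # union of children's cut sets
              return [cs for sets in expanded for cs in sets]
          return [frozenset().union(*combo)    # AND: one cut set per child
                  for combo in product(*expanded)]

      def minimal(sets):
          """Keep only cut sets with no proper subset among the others."""
          return [c for c in sets if not any(o < c for o in sets)]

      # Hypothetical tree: the top event occurs if the pump fails, or if
      # both the valve and its backup fail.
      tree = {"TOP": ("OR", ["PUMP", "G1"]),
              "G1":  ("AND", ["VALVE", "BACKUP_VALVE"])}
      print(minimal(cut_sets("TOP", tree)))    # {PUMP}, {VALVE, BACKUP_VALVE}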

  14. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, how to forecast grasshopper outbreaks accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and the development of grasshopper population monitoring and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from the remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can monitor grasshoppers more exactly, has advantages in measuring the degree of damage and classifying the damage areas of grasshoppers, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyperspectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. The technology of near-infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring the humidity and nutrients of soil, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that the spectral analysis technique could be used as a quick and exact tool in monitoring and forecasting grasshopper infestations, and will become an important means in such research for its advantages in determining spatial orientation and in information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
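
    The NDVI values referred to above are computed from the red and near-infrared reflectance bands with a standard formula; a minimal sketch follows, with random rasters standing in for satellite imagery.

      import numpy as np

      def ndvi(nir, red):
          """Normalized difference vegetation index, (NIR - Red)/(NIR + Red):
          values near 1 indicate dense green vegetation; values near zero or
          below indicate bare soil or water."""
          nir, red = nir.astype(float), red.astype(float)
          return (nir - red) / np.clip(nir + red, 1e-6, None)

      rng = np.random.default_rng(4)                 # stand-in reflectance bands
      print(ndvi(rng.uniform(0.2, 0.6, (100, 100)),
                 rng.uniform(0.05, 0.2, (100, 100))).mean())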

  15. Feasibility to apply the steam assisted gravity drainage (SAGD) technique in the country's heavy crude-oil fields

    International Nuclear Information System (INIS)

    Rodriguez, Edwin; Orjuela, Jaime

    2004-01-01

    The steam assisted gravity drainage (SAGD) processes are among the most efficient and profitable technologies for the production of heavy crude oils and oil sands. These processes involve the drilling of a pair of parallel horizontal wells, separated by a vertical distance and located near the base of the oil field. The upper well is used to continuously inject steam into the zone of interest, while the lower well collects all resulting fluids (oil, condensate and formation water) and takes them to the surface (Butler, 1994). This technology has been successfully implemented in countries such as Canada, Venezuela and the United States, reaching recovery factors in excess of 50%. This article provides an overview of the technique's operating mechanism and the process' most relevant characteristics, as well as the various categories this technology is divided into, including all its advantages and limitations. Furthermore, the article sets out the minimal oil-field conditions under which the SAGD process is efficient; these conditions, integrated into a series of mathematical models, allow forecasts to be made of production, thermal efficiency (ODR) and recoverable oil, as long as it is technically feasible to apply this technique to a given oil field. The information and concepts compiled during this research prompted the development of software, which may be used as an information, analysis and interpretation tool to predict and quantify this technology's performance. Based on the article, preliminary studies were started for the country's heavy crude-oil fields, identifying which provide the minimum conditions for the successful development of a pilot project

  16. Schlieren technique applied to the arc temperature measurement in a high energy density cutting torch

    International Nuclear Information System (INIS)

    Prevosto, L.; Mancinelli, B.; Artana, G.; Kelly, H.

    2010-01-01

    Plasma temperature and radial density profiles of the plasma species in a high energy density cutting arc have been obtained by using a quantitative schlieren technique. A Z-type two-mirror schlieren system was used in this research. Due to its great sensitivity, this technique allows measuring plasma composition and temperature from the arc axis to the surrounding medium by processing the gray-level contrast values of digital schlieren images recorded at the observation plane for a given position of a transverse knife located at the exit focal plane of the system. The technique has provided a good visualization of the plasma flow emerging from the nozzle and its interactions with the surrounding medium and the anode. The obtained temperature values are in good agreement with those values previously obtained by the authors on the same torch using Langmuir probes.

  17. Applied techniques for high bandwidth data transfers across wide area networks

    International Nuclear Information System (INIS)

    Lee, Jason; Gunter, Dan; Tierney, Brian; Allcock, Bill; Bester, Joe; Bresnahan, John; Tuecke, Steve

    2001-01-01

    Large distributed systems such as Computational/Data Grids require large amounts of data to be co-located with the computing facilities for processing. Ensuring that the data is there in time for the computation in today's Internet is a massive problem. From our work developing a scalable distributed network cache, we have gained experience with techniques necessary to achieve high data throughput over high bandwidth Wide Area Networks (WAN). In this paper, we discuss several hardware and software design techniques and issues, and then describe their application to an implementation of an enhanced FTP protocol called GridFTP. We also describe results from two applications using these techniques, which were obtained at the Supercomputing 2000 conference
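
    One of the techniques behind GridFTP is striping a transfer across parallel TCP streams. The following Python sketch mimics the idea with HTTP range requests as a stand-in for the GridFTP protocol itself; the function names are illustrative, and the server is assumed to honor Range headers.

      import concurrent.futures
      import urllib.request

      def fetch_range(url, start, end):
          """One byte range per worker, each on its own TCP connection
          (HTTP used as a stand-in for GridFTP's parallel data streams)."""
          req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
          with urllib.request.urlopen(req) as resp:
              return start, resp.read()

      def parallel_download(url, size, streams=4):
          chunk = size // streams
          ranges = [(i * chunk,
                     size - 1 if i == streams - 1 else (i + 1) * chunk - 1)
                    for i in range(streams)]
          buf = bytearray(size)
          with concurrent.futures.ThreadPoolExecutor(streams) as pool:
              for start, data in pool.map(lambda r: fetch_range(url, *r), ranges):
                  buf[start:start + len(data)] = data
          return bytes(buf)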

  18. The Evidence-Based Practice of Applied Behavior Analysis.

    Science.gov (United States)

    Slocum, Timothy A; Detrich, Ronnie; Wilczynski, Susan M; Spencer, Trina D; Lewis, Teri; Wolfe, Katie

    2014-05-01

    Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7-33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith's arguments and extend the discussion of the relevant issues. Although we support many of Smith's (The Behavior Analyst, 36, 7-33, 2013) points, we contend that Smith's definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.

  19. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.
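
    As a hedged sketch of the clustering step mentioned at the end of this record, the fragment below runs k-means over per-TDLU morphological measurements; the feature names, distributions and cluster count are invented for illustration, not taken from the study.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      # Hypothetical per-TDLU measurements, one row per TDLU:
      # diameter (um), acini count, mean elongation, mean roundness.
      rng = np.random.default_rng(5)
      features = np.column_stack([rng.normal(250, 80, 103),
                                  rng.poisson(20, 103),
                                  rng.normal(1.4, 0.2, 103),
                                  rng.normal(0.8, 0.05, 103)])

      X = StandardScaler().fit_transform(features)   # equalize feature scales
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
      print(np.bincount(labels))                     # TDLUs per cluster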

  20. Applied predictive analytics principles and techniques for the professional data analyst

    CERN Document Server

    Abbott, Dean

    2014-01-01

    Learn the art and science of predictive analytics - techniques that get results Predictive analytics is what translates big data into meaningful, usable business information. Written by a leading expert in the field, this guide examines the science of the underlying algorithms as well as the principles and best practices that govern the art of predictive analytics. It clearly explains the theory behind predictive analytics, teaches the methods, principles, and techniques for conducting predictive analytics projects, and offers tips and tricks that are essential for successful predictive mode

  1. Applied techniques for high bandwidth data transfers across wide area networks

    International Nuclear Information System (INIS)

    Lee, J.; Gunter, D.; Tierney, B.; Allcock, B.; Bester, J.; Bresnahan, J.; Tuecke, S.

    2001-01-01

    Large distributed systems such as Computational/Data Grids require large amounts of data to be co-located with the computing facilities for processing. From their work developing a scalable distributed network cache, the authors have gained experience with techniques necessary to achieve high data throughput over high bandwidth Wide Area Networks (WAN). The authors discuss several hardware and software design techniques, and then describe their application to an implementation of an enhanced FTP protocol called GridFTP. The authors describe results from the Supercomputing 2000 conference

  2. People Recognition for Loja ECU911 applying artificial vision techniques

    Directory of Open Access Journals (Sweden)

    Diego Cale

    2016-05-01

    Full Text Available This article presents a technological proposal based on artificial vision which aims to search for people in an intelligent way by using IP video cameras. Currently, the manual search process is time- and resource-demanding in contrast to an automated one, which means that it could be replaced. In order to obtain optimal results, three different artificial vision techniques were analyzed (Eigenfaces, Fisherfaces, Local Binary Patterns Histograms). The selection process considered factors like lighting changes, image quality and changes in the camera's angle of focus. Besides, a literature review was conducted to evaluate several points of view regarding artificial vision techniques.
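
    All three analyzed techniques are available in OpenCV's contrib face module; below is a minimal LBPH sketch (Eigenfaces and Fisherfaces are created analogously via EigenFaceRecognizer_create and FisherFaceRecognizer_create), assuming the opencv-contrib-python package, with random arrays standing in for aligned grayscale face crops.

      import cv2
      import numpy as np

      # Requires opencv-contrib-python for the cv2.face module.
      recognizer = cv2.face.LBPHFaceRecognizer_create()

      rng = np.random.default_rng(6)   # stand-ins for aligned 100x100 face crops
      faces = [rng.integers(0, 256, (100, 100), dtype=np.uint8) for _ in range(4)]
      labels = np.array([0, 0, 1, 1], dtype=np.int32)

      recognizer.train(faces, labels)
      person, confidence = recognizer.predict(faces[0])
      print(person, confidence)        # lower confidence value = closer match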

  3. U P1, an example for advanced techniques applied to high level activity dismantling

    International Nuclear Information System (INIS)

    Michel-Noel, M.; Calixte, O.; Blanchard, S.; Bani, J.; Girones, P.; Moitrier, C.; Terry, G.; Bourdy, R.

    2014-01-01

    The UP1 plant on the CEA Marcoule site was dedicated to the processing of spent fuels from the G1, G2 and G3 plutonium-producing reactors. This plant represents 20,000 m2 of workshops housing about 1000 hot cells. In 1998, a huge program for the dismantling and clean-up of the UP1 plant was launched. The CEA has developed new techniques to face the complexity of the dismantling operations. These techniques include immersive virtual reality, laser cutting, a specific manipulator arm called MAESTRO, and remote handling. (A.C.)

  4. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistic based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed

  5. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray fluorescence (XRF) techniques. These cigarettes were found to contain the elements Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements, with concentrations of more than 1% by weight, were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  6. Effect of the reinforcement bar arrangement on the efficiency of electrochemical chloride removal technique applied to reinforced concrete structures

    International Nuclear Information System (INIS)

    Garces, P.; Sanchez de Rojas, M.J.; Climent, M.A.

    2006-01-01

    This paper reports on the research done to find out the effect that different bar arrangements may have on the efficiency of the electrochemical chloride removal (ECR) technique when applied to a reinforced concrete structural member. Five different types of bar arrangements were considered, corresponding to typical structural members such as columns (with single and double bar reinforcing), slabs, beams and footings. ECR was applied in several steps. We observe that the extraction efficiency depends on the reinforcing bar arrangement. A uniform layer set-up favours chloride extraction. Electrochemical techniques were also used to estimate the reinforcing bar corrosion states, as well as measure the corrosion potential, and instant corrosion rate based on the polarization resistance technique. After ECR treatment, a reduction in the corrosion levels is observed falling short of the depassivation threshold

  7. Digital filtering techniques applied to electric power systems protection; Tecnicas de filtragem digital aplicadas a protecao de sistemas eletricos de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Brito, Helio Glauco Ferreira

    1996-12-31

    This work introduces an analysis and a comparative study of some of the techniques for digital filtering of the voltage and current waveforms from faulted transmission lines. This study is of fundamental importance for the development of algorithms applied to the digital protection of electric power systems. The techniques studied are based on the discrete Fourier transform theory, the Walsh functions and the Kalman filter theory. Two aspects were emphasized in this study: firstly, the non-recursive techniques were analyzed, with the implementation of filters based on the Fourier theory and the Walsh functions; secondly, the recursive techniques were analyzed, with the implementation of filters based on the Kalman theory and, once more, on the Fourier theory. (author) 56 refs., 25 figs., 16 tabs.
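
    As a concrete instance of the Fourier-based non-recursive filters compared in this work, here is a sketch of the classical full-cycle DFT phasor estimator used in digital relaying; the sampling rate and test signal are invented, and this is a generic formulation rather than the thesis' exact algorithms.

      import numpy as np

      def fundamental_phasor(samples, n_per_cycle):
          """Full-cycle DFT estimate of the fundamental phasor from the most
          recent cycle of samples; harmonics of the fundamental are rejected."""
          w = np.asarray(samples[-n_per_cycle:], dtype=float)
          n = np.arange(n_per_cycle)
          re = (2 / n_per_cycle) * np.sum(w * np.cos(2 * np.pi * n / n_per_cycle))
          im = (2 / n_per_cycle) * np.sum(w * np.sin(2 * np.pi * n / n_per_cycle))
          return complex(re, -im)

      # 60 Hz signal with a 3rd harmonic, sampled at 16 samples per cycle
      fs, f0, N = 960, 60, 16
      t = np.arange(2 * N) / fs
      x = 10 * np.cos(2 * np.pi * f0 * t + 0.5) + 2 * np.cos(2 * np.pi * 3 * f0 * t)
      ph = fundamental_phasor(x, N)
      print(abs(ph), np.angle(ph))      # ~10 and ~0.5 rad despite the harmonic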

  8. Decentralized control using compositional analysis techniques

    NARCIS (Netherlands)

    Kerber, F.; van der Schaft, A. J.

    2011-01-01

    Decentralized control strategies aim at achieving a global control target by means of distributed local controllers acting on individual subsystems of the overall plant. In this sense, decentralized control is a dual problem to compositional analysis where a global verification task is decomposed

  9. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan

    2016-01-01

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social

  10. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  11. 3D-QSPR Method of Computational Technique Applied on Red Reactive Dyes by Using CoMFA Strategy

    Directory of Open Access Journals (Sweden)

    Shahnaz Perveen

    2011-12-01

    Full Text Available Cellulose fiber is a tremendous natural resource that has broad application in various productions including the textile industry. The dyes which are commonly used for cellulose printing are "reactive dyes", because of their high wet fastness and brilliant colors. The interaction of various dyes with the cellulose fiber depends upon the physiochemical properties that are governed by specific features of the dye molecule. The binding pattern of the reactive dye with cellulose fiber is called the ligand-receptor concept. In the current study, the three dimensional quantitative structure property relationship (3D-QSPR) technique was applied to understand the red reactive dyes' interactions with the cellulose by the Comparative Molecular Field Analysis (CoMFA) method. This method was successfully utilized to predict a reliable model. The predicted model gives satisfactory statistical results and, in the light of these, it was further analyzed. Additionally, the graphical outcomes (contour maps) help us to understand the modification pattern and to correlate the structural changes with respect to the absorptivity. Furthermore, the final selected model has potential to assist in understanding the characteristics of the external test set. The study could be helpful to design new reactive dyes with better affinity and selectivity for the cellulose fiber.

  12. 3D-QSPR method of computational technique applied on red reactive dyes by using CoMFA strategy.

    Science.gov (United States)

    Mahmood, Uzma; Rashid, Sitara; Ali, S Ishrat; Parveen, Rasheeda; Zaheer-Ul-Haq; Ambreen, Nida; Khan, Khalid Mohammed; Perveen, Shahnaz; Voelter, Wolfgang

    2011-01-01

    Cellulose fiber is a tremendous natural resource that has broad application in various productions including the textile industry. The dyes, which are commonly used for cellulose printing, are "reactive dyes" because of their high wet fastness and brilliant colors. The interaction of various dyes with the cellulose fiber depends upon the physiochemical properties that are governed by specific features of the dye molecule. The binding pattern of the reactive dye with cellulose fiber is called the ligand-receptor concept. In the current study, the three dimensional quantitative structure property relationship (3D-QSPR) technique was applied to understand the red reactive dyes interactions with the cellulose by the Comparative Molecular Field Analysis (CoMFA) method. This method was successfully utilized to predict a reliable model. The predicted model gives satisfactory statistical results and in the light of these, it was further analyzed. Additionally, the graphical outcomes (contour maps) help us to understand the modification pattern and to correlate the structural changes with respect to the absorptivity. Furthermore, the final selected model has potential to assist in understanding the characteristics of the external test set. The study could be helpful to design new reactive dyes with better affinity and selectivity for the cellulose fiber.

  13. UQ and V&V techniques applied to experiments and simulations of heated pipes pressurized to failure

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dempsey, J. Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Antoun, Bonnie R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.

  14. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  15. Comparative assessment of PIV-based pressure evaluation techniques applied to a transonic base flow

    NARCIS (Netherlands)

    Blinde, P; Michaelis, D; van Oudheusden, B.W.; Weiss, P.E.; de Kat, R.; Laskari, A.; Jeon, Y.J.; David, L; Schanz, D; Huhn, F.; Gesemann, S; Novara, M.; McPhaden, C.; Neeteson, N.; Rival, D.; Schneiders, J.F.G.; Schrijer, F.F.J.

    2016-01-01

    A test case for PIV-based pressure evaluation techniques has been developed by constructing a simulated experiment from a ZDES simulation for an axisymmetric base flow at Mach 0.7. The test case comprises sequences of four subsequent particle images (representing multi-pulse data) as well as

  16. Practising What We Teach: Vocational Teachers Learn to Research through Applying Action Learning Techniques

    Science.gov (United States)

    Lasky, Barbara; Tempone, Irene

    2004-01-01

    Action learning techniques are well suited to the teaching of organisation behaviour students because of their flexibility, inclusiveness, openness, and respect for individuals. They are no less useful as a tool for change for vocational teachers, learning, of necessity, to become researchers. Whereas traditional universities have always had a…

  17. Urban field guide: applying social forestry observation techniques to the east coast megalopolis

    Science.gov (United States)

    E. Svendsen; V. Marshall; M.F. Ufer

    2006-01-01

    A changing economy and different lifestyles have altered the meaning of the forest in the northeastern United States, prompting scientists to reconsider the spatial form, stewardship and function of the urban forest. The authors describe how social observation techniques and the employment of a novel, locally based, participatory hand-held monitoring system could aid...

  18. Space-mapping techniques applied to the optimization of a safety isolating transformer

    NARCIS (Netherlands)

    T.V. Tran; S. Brisset; D. Echeverria (David); D.J.P. Lahaye (Domenico); P. Brochet

    2007-01-01

    Space-mapping optimization techniques align low-fidelity and high-fidelity models in order to reduce the computational time and increase the accuracy of the solution. The main idea is to build an approximate model from the difference of response between both models. Therefore
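
    A deliberately simplified one-dimensional sketch of the idea follows; the two quadratic "models" are invented stand-ins, not the transformer models of the paper. The coarse model is corrected with the zeroth- and first-order response difference from the fine model, and the corrected surrogate is re-optimised until the iterate settles on the fine-model optimum.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def fine(x):    # expensive, accurate model (invented stand-in)
            return (x - 2.0) ** 2 + 0.1 * np.sin(3.0 * x)

        def coarse(x):  # cheap, approximate model (invented stand-in)
            return (x - 1.7) ** 2

        def deriv(f, x, h=1.0e-5):
            return (f(x + h) - f(x - h)) / (2.0 * h)

        # Start from the coarse optimum, then iteratively align and re-optimise.
        x = minimize_scalar(coarse, bounds=(0.0, 4.0), method="bounded").x
        for _ in range(10):
            dv = fine(x) - coarse(x)                 # response difference (value)
            ds = deriv(fine, x) - deriv(coarse, x)   # response difference (slope)
            surrogate = lambda z, x0=x, dv=dv, ds=ds: coarse(z) + dv + ds * (z - x0)
            x = minimize_scalar(surrogate, bounds=(0.0, 4.0), method="bounded").x

        print(f"aligned optimum x = {x:.4f}, fine-model value = {fine(x):.4f}")

    At the fixed point the surrogate's slope matches the fine model's, so the cheap model is optimised repeatedly while the expensive one is evaluated only once per iteration.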

  19. MSC/NASTRAN "expert" techniques developed and applied to the TFTR poloidal field coils

    International Nuclear Information System (INIS)

    O'Toole, J.A.

    1986-01-01

    The TFTR poloidal field (PF) coils are being analyzed by PPPL and Grumman using MSC/NASTRAN as a part of an overall effort to establish the absolute limiting conditions of operation for TFTR. Each of the PF coils will be analyzed in depth, using a detailed set of finite element models. Several of the models developed are quite large because each copper turn, as well as its surrounding insulation, was modeled using solid elements. Several of the finite element models proved large enough to tax the capabilities of the National Magnetic Fusion Energy Computer Center (NMFECC), specifically its disk storage space. To allow the use of substructuring techniques with their associated data bases for the larger models, it became necessary to employ certain infrequently used MSC/NASTRAN "expert" techniques. The techniques developed used multiple data bases and data base sets to divide each problem into a series of computer runs. For each run, only the data required was kept on active disk space, the remainder being placed in inactive "FILEM" storage, thus minimizing the active disk space required at any time and permitting problem solution using the NMFECC. A representative problem using the TFTR OH-1 coil global model provides an example of the techniques developed. The special considerations necessary to obtain proper results are discussed.

  20. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    Science.gov (United States)

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  1. Common cause evaluations in applied risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
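
    For orientation, the original Beta Factor method mentioned above fits in a few lines (textbook form with illustrative numbers, not the paper's improved model): a fraction beta of each component's unavailability q is attributed to a common cause that defeats all redundant trains at once.

        # Beta-factor sketch for a one-out-of-three redundant system.
        # q and beta are illustrative values, not results from the paper.
        def one_out_of_three_unavailability(q: float, beta: float) -> float:
            """q: total component unavailability; beta: fraction assigned to CCF."""
            q_ind = (1.0 - beta) * q   # independent failures must hit all 3 trains
            q_ccf = beta * q           # common cause failure takes out all 3 at once
            return q_ind ** 3 + q_ccf

        q, beta = 1.0e-2, 0.1
        print(f"independent failures only: {q ** 3:.2e}")
        print(f"with beta-factor CCF term: {one_out_of_three_unavailability(q, beta):.2e}")

    The CCF term dominates by orders of magnitude, which is why refinements of the Beta Factor method matter for realistic system estimates.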

  2. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  3. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  4. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  5. Spectral deformation techniques applied to the study of quantum statistical irreversible processes

    International Nuclear Information System (INIS)

    Courbage, M.

    1978-01-01

    A procedure for the analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that a restriction to a certain class of initial conditions is necessary to obtain irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)

  6. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    NARCIS (Netherlands)

    Noordman, J.; Lee, I. van der; Nielen, M.; Vlek, H.; Weijden, T. van der; Dulmen, S. van

    2012-01-01

    Background: Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice

  7. New enhanced sensitivity infrared laser spectroscopy techniques applied to reactive plasmas and trace gas detection

    NARCIS (Netherlands)

    Welzel, S.

    2009-01-01

    Infrared laser absorption spectroscopy (IRLAS) employing both tuneable diode and quantum cascade lasers (TDLs, QCLs) has been applied with both high sensitivity and high time resolution to plasma diagnostics and trace gas measurements. TDLAS combined with a conventional White type multiple pass cell

  8. Painleve singularity analysis applied to charged particle dynamics during reconnection

    International Nuclear Information System (INIS)

    Larson, J.W.

    1992-01-01

    For a plasma in the collisionless regime, test-particle modelling can lend some insight into the macroscopic behavior of the plasma, e.g. conductivity and heating. A common example for which this technique is used is a system with electric and magnetic fields given by B = δy e_x + γz e_y + y e_z and E = ε e_z, where δ, γ, and ε are constant parameters and e_x, e_y, e_z are unit vectors. This model can be used to describe plasma behavior near neutral lines (γ = 0), as well as current sheets (γ = 0, δ = 0). The integrability properties of the particle motion in such fields might affect the plasma's macroscopic behavior, and the author has asked the question "For what values of δ, γ, and ε is the system integrable?" To answer this question, the author has employed Painleve singularity analysis, which is an examination of the singularity properties of a test particle's equations of motion in the complex time plane. This analysis has identified two field geometries for which the system's particle dynamics are integrable in terms of the second Painleve transcendent: the circular O-line case and the case of the neutral sheet configuration. These geometries yield particle dynamics that are integrable in the Liouville sense (i.e., there exist the proper number of integrals in involution) in an extended phase space which includes the time as a canonical coordinate, and this property is also true for nonzero γ. The singularity property tests also identified a large, dense set of X-line and O-line field geometries that yield dynamics that may possess the weak Painleve property. In the case of the X-line geometries, this result shows little relevance to the physical nature of the system, but the existence of a dense set of elliptical O-line geometries with this property may be related to the fact that for ε positive, one can construct asymptotic solutions in the limit t → ∞.

  9. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state of the art in the field, to identify new research and development required to provide an adequate framework for the analysis of environmental samples, and to assess needs and possibilities for international cooperation in problem areas. This technical report on the subject was prepared on the basis of contributions made by the participants. A separate abstract was prepared for each of the 9 papers.

  10. Application of activation techniques to biological analysis

    International Nuclear Information System (INIS)

    Bowen, H.J.M.

    1981-01-01

    Applications of activation analysis in the biological sciences are reviewed for the period 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or X-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use has been made of it in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials.

  11. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Science.gov (United States)

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  12. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Directory of Open Access Journals (Sweden)

    Ted von Hippel

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  13. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  14. Applying multicriteria analysis for choosing the best marination for pork

    Directory of Open Access Journals (Sweden)

    Nieto VMOS

    2015-01-01

    Objective. This research aimed to choose the best marination solution using the Analytic Hierarchy Process (AHP). Materials and methods. Pork meat samples were collected in a commercial slaughterhouse and randomly distributed among four treatments involving blends of three different salts. Color, pH, solution retention, exudate and cooking loss, shear force and sensory attributes were assessed. Multicriteria analysis using AHP was applied to the results in order to choose the best overall marination solution; the physical and sensory characteristics of the meat served as the selection criteria, and the marination solutions were ranked against them. Results. The combination of the salts (Na2CO3 + NaCl + Na5P3O10) was the best alternative, followed by the solutions (Na2CO3 + NaCl) and (Na5P3O10 + NaCl). Conclusions. All tested solutions, with the salts used alone or in combination, led to better physical and sensory attributes than non-marinated meat.
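
    The AHP machinery itself is compact enough to sketch. Everything numerical below is invented for illustration (the pairwise judgements and the alternative scores are not the study's data): weights come from the principal eigenvector of the pairwise comparison matrix, a consistency ratio sanity-checks the judgements, and alternatives are ranked by their weighted scores.

        import numpy as np

        criteria = ["colour", "cooking loss", "shear force", "sensory"]
        A = np.array([            # A[i, j] = importance of criterion i over j
            [1.0, 2.0, 3.0, 1/2],
            [1/2, 1.0, 2.0, 1/3],
            [1/3, 1/2, 1.0, 1/4],
            [2.0, 3.0, 4.0, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = int(np.argmax(eigvals.real))
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                   # criteria weights

        n = A.shape[0]                                 # consistency check
        ci = (eigvals.real[k] - n) / (n - 1)
        print("weights:", dict(zip(criteria, w.round(3))), "CR:", round(ci / 0.90, 3))

        scores = np.array([                            # alternatives x criteria, 0-1
            [0.8, 0.7, 0.6, 0.9],   # Na2CO3 + NaCl + Na5P3O10
            [0.7, 0.6, 0.7, 0.7],   # Na2CO3 + NaCl
            [0.6, 0.7, 0.5, 0.6],   # Na5P3O10 + NaCl
        ])
        print("overall scores:", (scores @ w).round(3))  # highest score wins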

  15. Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.

    Science.gov (United States)

    Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic

    2017-05-15

    Recent developments in applied mathematics are bringing new tools capable of synthesizing knowledge across disciplines and of finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques, such as principal component analysis (PCA), with a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sport teams to cancer treatments. Here, for the first time, the technique has been applied in soil science, to show the interaction between physical and chemical soil attributes and the main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database consisting of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters, including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes.
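
    The heart of the workflow, a Mapper-type construction, can be sketched as follows. This is a minimal sketch on synthetic data (the LUCAS table is not reproduced here), and the lens, cover size, overlap and clustering settings are arbitrary choices: project the samples through a PCA lens, cover the lens range with overlapping intervals, cluster within each slice, and link clusters that share samples.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(1)              # two synthetic "soil" groups
        X = np.vstack([rng.normal(m, 1.0, size=(200, 12)) for m in (0.0, 3.0)])

        lens = PCA(n_components=1).fit_transform(X).ravel()   # 1. filter/lens

        lo, hi, n_bins, overlap = lens.min(), lens.max(), 10, 0.3
        width = (hi - lo) / n_bins
        nodes = []
        for i in range(n_bins):                     # 2. overlapping cover
            a = lo + i * width - overlap * width
            b = a + width + 2.0 * overlap * width
            idx = np.where((lens >= a) & (lens <= b))[0]
            if len(idx) < 5:
                continue
            labels = DBSCAN(eps=4.0, min_samples=5).fit_predict(X[idx])
            for lab in set(labels) - {-1}:          # 3. cluster each slice
                nodes.append(idx[labels == lab])

        edges = [(i, j)                             # 4. link nodes sharing samples
                 for i in range(len(nodes)) for j in range(i + 1, len(nodes))
                 if np.intersect1d(nodes[i], nodes[j]).size > 0]
        print(f"Mapper graph: {len(nodes)} nodes, {len(edges)} edges")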

  16. A new standardless quantitative electron probe microanalysis technique applied to III-V compound semiconductors

    International Nuclear Information System (INIS)

    Zangalis, K.P.; Christou, A.

    1982-01-01

    The present paper introduces a new standardless quantitative scheme for off-line electron microprobe analysis applications. The analysis is based on standard equations of the type I_i = C_i f_ZAF β_i and is specifically suitable for compound semiconductors. The roots of the resultant nth-degree polynomial are the unknown concentrations. Methods for computing C_i when the coefficients β_i are unknown are also outlined. Applications of standardless analysis to GaAs and InP specimens are compared with results obtained by Auger electron spectroscopy and quantitative electron probe analysis with standards. (Auth.)
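
    To make the role of the β_i coefficients concrete, here is a zeroth-order sketch with invented numbers: f_ZAF is treated as composition-independent, so the concentrations follow from normalisation alone, whereas the paper's scheme retains the composition dependence and solves the resulting nth-degree polynomial.

        # Zeroth-order standardless estimate from I_i = C_i * f_ZAF * beta_i
        # with the closure condition sum(C_i) = 1. All values are illustrative.
        intensities = {"Ga": 1250.0, "As": 1080.0}   # measured line intensities
        beta = {"Ga": 1.15, "As": 0.98}              # instrument/physics factors

        raw = {el: intensities[el] / beta[el] for el in intensities}
        total = sum(raw.values())
        concentrations = {el: v / total for el, v in raw.items()}
        print(concentrations)                        # mass fractions summing to 1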

  17. Parameters and definitions in applied technique quality test for nuclear magnetic resonance imaging system (NMRI)

    International Nuclear Information System (INIS)

    Lin Zhikai; Zhao Lancai

    1999-08-01

    During the past two decades, medical diagnostic imaging has developed dramatically, with techniques such as CT, MRI, PET and DSA. The most striking examples are the application of X-ray computerized tomography (CT) and magnetic resonance imaging in medical diagnosis. Looking ahead to the development of diagnostic imaging in the 21st century, it can be predicted that magnetic resonance imaging (MRI) will find still wider application and play an increasingly important role in clinical diagnosis. The authors also present measuring methods for some of the parameters. The parameters described can be used as a reference by clinical diagnosticians, MRI operators and medical physicists engaged in image quality assurance (QA) and quality control (QC) when performing MRI acceptance tests and routine tests.

  18. Applying machine learning techniques for forecasting flexibility of virtual power plants

    DEFF Research Database (Denmark)

    MacDougall, Pamela; Kosek, Anna Magdalena; Bindner, Henrik W.

    2016-01-01

    This paper presents an approach to investigating the longevity of the aggregated response of a virtual power plant, using historic bidding and aggregated behaviour with machine learning techniques. The two supervised machine learning techniques investigated and compared are multivariate linear regression and a single hidden-layer artificial neural network (ANN). It is found that it is possible to estimate the longevity of flexibility with machine learning. The linear regression algorithm is, on average, able to estimate the longevity with a 15% error, whereas the ANN algorithm achieves, on average, a 5.3% error, lowered to 2.4% when learning for the same virtual power plant. With this information it would be possible to accurately offer residential VPP flexibility for market operations and safely avoid causing further imbalances and financial penalties.
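
    The comparison can be mimicked on synthetic data. The sketch below is illustrative only: the feature set and target relation are invented, so it will not reproduce the 15% and 5.3% figures, but it shows the structure of the experiment (fit both learners, compare held-out errors).

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        n = 2000
        X = np.column_stack([
            rng.uniform(0.0, 1.0, n),    # aggregate state of charge / headroom
            rng.uniform(0.0, 50.0, n),   # requested response, kW
            rng.uniform(-5.0, 30.0, n),  # outdoor temperature, deg C
        ])
        # Invented ground truth: longevity of the response in minutes.
        y = 120 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(0, 5, n)

        split = int(0.8 * n)
        for name, model in [
            ("linear regression", LinearRegression()),
            ("ANN", make_pipeline(StandardScaler(),
                                  MLPRegressor(hidden_layer_sizes=(32,),
                                               max_iter=2000, random_state=0))),
        ]:
            model.fit(X[:split], y[:split])
            err = np.mean(np.abs(model.predict(X[split:]) - y[split:]))
            print(f"{name}: mean absolute error {err:.1f} minutes")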

  19. Geologic-radiometric techniques applied for uranium prospection at the Hierro-Cayo Largo area

    International Nuclear Information System (INIS)

    Gongora, L.E.; Olivera, J.

    1995-01-01

    Using geologic-radiometric techniques, uraniferous anomalies were evaluated in the Hierro-Cayo Largo area of Pinar del Rio province. During the uranium prospecting work in the most promising areas, geological traverses and gamma-ray, radon-emanation and spectrometric surveys were carried out. Trenches were dug and some boreholes were drilled (to depths of 20-30 m). In addition, numerous samples were taken in order to determine the contents of U, Ra, Th and K by spectrometric techniques. As a result of this investigation, the geological setting and the mineralogical and geochemical characteristics of the uraniferous mineralization were established. Appropriate prospecting indicators for uranium exploration in the Esperanza geological zone were defined.

  20. Magnetic Resonance Techniques Applied to the Diagnosis and Treatment of Parkinson’s Disease

    Science.gov (United States)

    de Celis Alonso, Benito; Hidalgo-Tobón, Silvia S.; Menéndez-González, Manuel; Salas-Pacheco, José; Arias-Carrión, Oscar

    2015-01-01

    Parkinson’s disease (PD) affects at least 10 million people worldwide. It is a neurodegenerative disease, which is currently diagnosed by neurological examination. No neuroimaging investigation or blood biomarker is available to aid diagnosis and prognosis. Most effort toward diagnosis using magnetic resonance (MR) has been focused on the use of structural/anatomical neuroimaging and diffusion tensor imaging (DTI). However, deep brain stimulation, a current strategy for treating PD, is guided by MR imaging (MRI). For clinical prognosis, diagnosis, and follow-up investigations, blood oxygen level-dependent MRI, DTI, spectroscopy, and transcranial magnetic stimulation have been used. These techniques represent the state of the art in the last 5 years. Here, we focus on MR techniques for the diagnosis and treatment of Parkinson’s disease. PMID:26191037

  1. The Ecological Profiles Technique applied to data from Lichtenburg, South Africa

    Directory of Open Access Journals (Sweden)

    J. W. Morris

    1974-12-01

    The method of ecological profiles and information shared between species and ecological variables, developed in France, is described for the first time in English. Preliminary results, using the technique on Bankenveld quadrat data from Lichtenburg, Western Transvaal, are given. It is concluded that the method has great potential value for the understanding of the autecology of South African species provided that the sampling method is appropriate.

  2. Computer vision techniques applied to the quality control of ceramic plates

    OpenAIRE

    Silveira, Joaquim; Ferreira, Manuel João Oliveira; Santos, Cristina; Martins, Teresa

    2009-01-01

    This paper presents a system, based on computer vision techniques, that detects and quantifies different types of defects in ceramic plates. It was developed in collaboration with the industrial ceramic sector and consequently focuses on the defects considered most quality-depreciating by the Portuguese industry. They are of three main types: cracks, granules and relief surfaces. For each type the development was specific as far as image processing techn...

  3. Applying Multi-Criteria Decision-Making Techniques to Prioritize Agility Drivers

    OpenAIRE

    Ahmad Jafarnejad; Sayyed Mohammad Reza Davoodi; Abolfazl Sherafat

    2013-01-01

    Recognizing and classifying the factors that affect organizational agility, and specifying their relative importance, appears essential for an organization's survival and success in today's environment. This paper reviews the concept of agility and its constituent indicators, including the drivers of organizational agility, which are ranked in terms of importance and influence using MCDM techniques. The inner complexity, s...

  4. Artificial intelligence techniques applied to hourly global irradiance estimation from satellite-derived cloud index

    Energy Technology Data Exchange (ETDEWEB)

    Zarzalejo, L.F.; Ramirez, L.; Polo, J. [DER-CIEMAT, Madrid (Spain). Renewable Energy Dept.

    2005-07-01

    Artificial intelligence techniques, such as fuzzy logic and neural networks, have been used for estimating hourly global radiation from satellite images. The models have been fitted to measured global irradiance data from 15 Spanish terrestrial stations. Both satellite imaging data and terrestrial information from the years 1994, 1995 and 1996 were used. The results of these artificial intelligence models were compared to a multivariate regression based upon the Heliosat I model. Generally better behaviour was observed for the artificial intelligence models. (author)

  5. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically utilized sub-mm bead immersion techniques extensively, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and large processing time for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.

  6. Artificial intelligence techniques applied to hourly global irradiance estimation from satellite-derived cloud index

    International Nuclear Information System (INIS)

    Zarzalejo, Luis F.; Ramirez, Lourdes; Polo, Jesus

    2005-01-01

    Artificial intelligence techniques, such as fuzzy logic and neural networks, have been used for estimating hourly global radiation from satellite images. The models have been fitted to measured global irradiance data from 15 Spanish terrestrial stations. Both satellite imaging data and terrestrial information from the years 1994, 1995 and 1996 were used. The results of these artificial intelligence models were compared to a multivariate regression based upon the Heliosat I model. Generally better behaviour was observed for the artificial intelligence models.

  7. Applying the sterile insect technique to the control of insect pests

    International Nuclear Information System (INIS)

    LaChance, L.E.; Klassen, W.

    1991-01-01

    The sterile insect technique involves the mass-rearing of insects, which are sterilized by gamma rays from a 60Co source before being released in a controlled fashion into nature. Matings between the released sterile insects and native insects produce no progeny, and so if enough of these matings occur the pest population can be controlled or even eradicated. A modification of the technique, especially suitable for the suppression of moths and butterflies, is called the F1, or inherited sterility, method. In this, lower radiation doses are used such that the released males are only partially sterile (30-60%) and the females are fully sterile. When released males mate with native females some progeny are produced, but they are completely sterile. Thus, full expression of the sterility is delayed by one generation. This article describes the use of the sterile insect technique in controlling the screwworm fly, the tsetse fly, the medfly, the pink bollworm and the melon fly, and of the F1 sterility method in the eradication of local gypsy moth infestations. 18 refs, 5 figs, 1 tab
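
    The population logic behind the technique is captured by Knipling's classic model (textbook form, illustrative numbers): if S sterile males are released each generation into a wild population of size N with per-generation growth rate r, only the fraction N/(N+S) of matings is fertile, and the population collapses once S is large enough.

        # Knipling-style sterile release dynamics; parameters are illustrative.
        N = 1_000_000      # wild population
        S = 9_000_000      # sterile males released every generation
        r = 5.0            # per-generation growth rate without control
        # Decline requires N < S / (r - 1); here the threshold is 2.25 million.
        for gen in range(1, 8):
            N = r * N * N / (N + S)
            print(f"generation {gen}: wild population ~ {N:,.0f}")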

  8. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper.

    Science.gov (United States)

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-06-10

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system, determined by the blind zone length and by the capability to detect a remote reflecting interface. To reduce the blind zone length and detect nearby reflecting interfaces, a full-bridge acoustic emission technique based on a bootstrap gate driver (BGD) and metal-oxide-semiconductor field-effect transistors (MOSFETs) is designed by analyzing the working principle and impedance characteristics of the given piezoelectric transducer. To detect remote reflecting interfaces and reduce the dynamic range of the received echo signals, the relationship between echo amplitude and the propagation distance of the ultrasonic waves is determined, and a signal compensation technique based on time-varying amplification, which automatically changes the gain according to the echo arrival time, is designed. Lastly, the aforementioned techniques and corresponding circuits are verified experimentally. Results show that the blind zone length of the UDM system in the LWD caliper is significantly reduced and the capability to detect remote reflecting interfaces is considerably improved.

  9. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper

    Directory of Open Access Journals (Sweden)

    Yongchao Yao

    2017-06-01

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system, determined by the blind zone length and by the capability to detect a remote reflecting interface. To reduce the blind zone length and detect nearby reflecting interfaces, a full-bridge acoustic emission technique based on a bootstrap gate driver (BGD) and metal-oxide-semiconductor field-effect transistors (MOSFETs) is designed by analyzing the working principle and impedance characteristics of the given piezoelectric transducer. To detect remote reflecting interfaces and reduce the dynamic range of the received echo signals, the relationship between echo amplitude and the propagation distance of the ultrasonic waves is determined, and a signal compensation technique based on time-varying amplification, which automatically changes the gain according to the echo arrival time, is designed. Lastly, the aforementioned techniques and corresponding circuits are verified experimentally. Results show that the blind zone length of the UDM system in the LWD caliper is significantly reduced and the capability to detect remote reflecting interfaces is considerably improved.
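
    The time-varying amplification idea can be sketched numerically: the gain applied to the received signal grows with arrival time so that geometric spreading and attenuation are cancelled, and near and far echoes end up at comparable amplitude. The sound speed, attenuation coefficient and echo geometry below are assumptions for the example, not the instrument's parameters.

        import numpy as np

        c = 1480.0      # sound speed in borehole fluid, m/s (assumed)
        alpha = 20.0    # attenuation coefficient, Np/m (assumed)

        def tvg_gain(t):
            r = c * t / 2.0                           # two-way travel -> range
            return np.where(r > 1.0e-3, r * np.exp(alpha * r), 1.0)

        t = np.linspace(0.0, 200.0e-6, 1000)          # 200 microsecond record
        echo = np.zeros_like(t)
        for r0 in (0.03, 0.12):                       # reflectors at 3 cm, 12 cm
            i = int(np.argmin(np.abs(t - 2.0 * r0 / c)))
            echo[i] = (1.0 / r0) * np.exp(-alpha * r0)   # spreading + attenuation

        compensated = echo * tvg_gain(t)
        print("raw near/far amplitude ratio        :",
              echo.max() / echo[echo > 0].min())
        print("compensated near/far amplitude ratio:",
              compensated.max() / compensated[compensated > 0].min())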

  10. Applied nuclear γ-resonance as fingerprint technique in geochemistry and mineralogy

    International Nuclear Information System (INIS)

    Constantinescu, S.

    2003-01-01

    The aim of the present paper is to highlight new developments of one of the most refined techniques, nuclear γ-resonance, or the well-known Moessbauer effect, in the field of mineralogical and geochemical investigation. There are many Moessbauer studies on minerals, but advances in the performance of Moessbauer equipment and of computers call for a more profound and thorough review of the information this non-destructive technique offers. This task has become ever more pressing because many minerals contain Moessbauer isotopes in high proportion. Generally, mineralogists, physicists and chemists hope to obtain by such techniques more refined and complete information about the physical and chemical aspects of the synthesis and solid-state transformation of natural and synthetic materials, and also about their structural aspects. Along this line, the authors briefly review the principal aspects of Moessbauer spectroscopy and underline the most important information that can be obtained from the spectra. Recent results obtained by the authors on minerals extracted from Romanian geological deposits are discussed in detail in the second part of this article. (authors)

  11. An efficient permeability scaling-up technique applied to the discretized flow equations

    Energy Technology Data Exchange (ETDEWEB)

    Urgelli, D.; Ding, Yu [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    Grid-block permeability scaling-up for numerical reservoir simulations has been discussed for a long time in the literature. It is now recognized that a full permeability tensor is needed to get an accurate reservoir description at large scale. However, two major difficulties are encountered: (1) grid-block permeability cannot be properly defined because it depends on boundary conditions; (2) discretization of flow equations with a full permeability tensor is not straightforward and little work has been done on this subject. In this paper, we propose a new method, which allows us to get around both difficulties. As the two major problems are closely related, a global approach will preserve the accuracy. So, in the proposed method, the permeability up-scaling technique is integrated in the discretized numerical scheme for flow simulation. The permeability is scaled-up via the transmissibility term, in accordance with the fluid flow calculation in the numerical scheme. A finite-volume scheme is particularly studied, and the transmissibility scaling-up technique for this scheme is presented. Some numerical examples are tested for flow simulation. This new method is compared with some published numerical schemes for full permeability tensor discretization where the full permeability tensor is scaled-up through various techniques. Comparing the results with fine grid simulations shows that the new method is more accurate and more efficient.
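
    As a back-of-the-envelope contrast (the paper's method handles full tensors inside a finite-volume scheme; this sketch is the simplest one-dimensional special case with invented permeabilities): for flow in series, the flow-consistent coarse-block value is the harmonic mean of the fine-cell permeabilities, and the arithmetic mean systematically overestimates it.

        import numpy as np

        rng = np.random.default_rng(3)
        k_fine = rng.lognormal(mean=3.0, sigma=1.0, size=100)  # fine perms, mD

        def harmonic_upscale(k_cells):
            # Series flow through n cells of equal size: k_eq = n / sum(1/k_i)
            return len(k_cells) / np.sum(1.0 / k_cells)

        blocks = k_fine.reshape(10, 10)      # 10 coarse blocks of 10 cells each
        k_harm = np.array([harmonic_upscale(b) for b in blocks])
        k_arit = blocks.mean(axis=1)
        print("harmonic (flow-consistent):", k_harm.round(1))
        print("arithmetic (overestimates):", k_arit.round(1))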

  12. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by using a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes found to be located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing the computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows the computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.
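
    A toy version of the immersion step is sketched below with an analytic disc standing in for a rotor (the real method immerses CAD surface meshes into the background mesh and solves the mixed finite element problem): nodes with negative signed distance to the boundary are flagged as inside and receive the rigid-body rotation, and the inside fraction plays the role of the VOF-style fill factor.

        import numpy as np

        nx = ny = 64
        x, y = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny))

        cx, cy, radius, omega = 0.2, 0.0, 0.35, 2.0   # rotor centre, size, rad/s
        dist = np.hypot(x - cx, y - cy) - radius      # signed distance to boundary
        inside = dist < 0.0

        u = np.zeros((ny, nx))
        v = np.zeros((ny, nx))
        u[inside] = -omega * (y[inside] - cy)         # rigid rotation: v = omega x r
        v[inside] =  omega * (x[inside] - cx)

        print("fill factor:", round(float(inside.mean()), 3))
        print("max rigid speed on grid:", round(float(np.hypot(u, v).max()), 3))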

  13. Applying Data-mining techniques to study drought periods in Spain

    Science.gov (United States)

    Belda, F.; Penades, M. C.

    2010-09-01

    Data-mining techniques can be used to interact with large databases and to help discover relations between parameters by extracting information from massive, multiple data archives. Drought affects many economic and social sectors, from agriculture to transportation, urban water deficits and the development of modern industries. Given these impacts and the geographical and temporal distribution of drought, it is difficult to find a single definition of drought. A better understanding of climatic indices is necessary to reduce the impacts of drought and to facilitate quick decisions regarding this problem. The main objective is to analyze drought periods from 1950 to 2009 in Spain. We use several kinds of information with different formats, sources and transmission modes: satellite-based vegetation indices, dryness indices for several temporal periods, daily and monthly precipitation and temperature data, and soil moisture data from a numerical weather model. We mainly calculate the Standardized Precipitation Index (SPI), which has been widely used in the literature. We use OLAP-mining techniques to discover association rules between remote-sensing, numerical-weather-model and climatic indices. Time-series data-mining techniques organize the data as a sequence of events, each with a time of occurrence, and group the records into clusters with similar characteristics. A prior climatological classification is necessary if we want to study drought periods over all of Spain.
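
    For reference, the core SPI computation is short. This is a minimal sketch on synthetic rainfall; an operational SPI additionally handles zero-precipitation months, calendar-month stratification and several accumulation periods.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        precip = rng.gamma(shape=2.0, scale=30.0, size=360)  # 30 yrs, monthly mm

        shape, loc, scale = stats.gamma.fit(precip, floc=0.0)   # fit gamma
        cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
        spi = stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))      # map to N(0, 1)

        print("months with severe drought (SPI < -1.5):", int(np.sum(spi < -1.5)))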

  14. Air flow measurement techniques applied to noise reduction of a centrifugal blower

    Science.gov (United States)

    Laage, John W.; Armstrong, Ashli J.; Eilers, Daniel J.; Olsen, Michael G.; Mann, J. Adin

    2005-09-01

    The air flow in a centrifugal blower was studied using a variety of flow and sound measurement techniques. The flow measurement techniques employed included Particle Image Velocimetry (PIV), pitot tubes, and a five-hole spherical probe. PIV was used to measure instantaneous and ensemble-averaged velocity fields over a large area of the outlet duct as a function of fan position, allowing for the visualization of the flow as it leaves the fan blades and progresses downstream. The results from the flow measurements were reviewed alongside the results of the sound measurements with the goal of identifying sources of noise and inefficiencies in flow performance. The radiated sound power was divided into broadband and tone noise and related to measures of the flow: the changes in the tone and broadband sound were compared to changes in flow quantities such as the turbulent kinetic energy and Reynolds stress. Results for each method will be presented to demonstrate the strengths of each flow measurement technique as well as their limitations. Finally, the role that each played in identifying noise sources is described.

  15. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  16. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  17. A photoacoustic technique applied to detection of ethylene emissions in edible coated passion fruit

    International Nuclear Information System (INIS)

    Alves, G V L; Santos, W C dos; Vargas, H; Silva, M G da; Waldman, W R; Oliveira, J G

    2010-01-01

    Photoacoustic spectroscopy was applied to study the physiological behavior of passion fruit when coated with edible films. The results have shown a reduction of the ethylene emission rate. Weight loss monitoring has not shown any significant differences between the coated and uncoated passion fruit. On the other hand, slower color changes of coated samples suggest a slowdown of the ripening process in coated passion fruit.

  18. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…

  19. Hybrid multicore/vectorisation technique applied to the elastic wave equation on a staggered grid

    Science.gov (United States)

    Titarenko, Sofya; Hildyard, Mark

    2017-07-01

    In modern physics it has become common to find the solution of a problem by solving a set of PDEs numerically. Whether solving them on a finite difference grid or by a finite element approach, the main calculations are often applied to a stencil structure. In the last decade it has become usual to work with so-called big data problems where calculations are very heavy and accelerators and modern architectures are widely used. Although CPU and GPU clusters are often used to solve such problems, parallelisation of any calculation ideally starts from single-processor optimisation. Unfortunately, it is impossible to vectorise a stencil-structured loop with high-level instructions. In this paper we suggest a new approach to rearranging the data structure which makes it possible to apply high-level vectorisation instructions to a stencil loop and which results in significant acceleration. The suggested method allows further acceleration if shared memory APIs are used. We show the effectiveness of the method by applying it to an elastic wave propagation problem on a finite difference grid. We have chosen Intel architecture for the test problem and OpenMP (Open Multi-Processing) since they are extensively used in many applications.
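
    The flavour of the data-restructuring argument can be conveyed with a small sketch, with NumPy slicing standing in for SIMD vector instructions (the paper itself works in compiled code on Intel hardware): the same three-point stencil is written once as a scalar loop and once as shifted-slice arithmetic over contiguous arrays.

        import time
        import numpy as np

        n, c = 200_000, 0.25
        u = np.random.rand(n)

        def stencil_loop(u):
            out = u.copy()
            for i in range(1, len(u) - 1):           # scalar, one point at a time
                out[i] = u[i] + c * (u[i - 1] - 2.0 * u[i] + u[i + 1])
            return out

        def stencil_vectorised(u):
            out = u.copy()                           # whole-array (vector) update
            out[1:-1] = u[1:-1] + c * (u[:-2] - 2.0 * u[1:-1] + u[2:])
            return out

        t0 = time.perf_counter(); a = stencil_loop(u)
        t1 = time.perf_counter(); b = stencil_vectorised(u)
        t2 = time.perf_counter()
        print(np.allclose(a, b),
              f"loop {t1 - t0:.3f} s vs vectorised {t2 - t1:.5f} s")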

  20. Assessment of ground-based monitoring techniques applied to landslide investigations

    Science.gov (United States)

    Uhlemann, S.; Smith, A.; Chambers, J.; Dixon, N.; Dijkstra, T.; Haslam, E.; Meldrum, P.; Merritt, A.; Gunn, D.; Mackay, J.

    2016-01-01

    A landslide complex in the Whitby Mudstone Formation at Hollin Hill, North Yorkshire, UK is periodically re-activated in response to rainfall-induced pore-water pressure fluctuations. This paper compares long-term measurements (i.e., 2009-2014) obtained from a combination of monitoring techniques that have been employed together for the first time on an active landslide. The results highlight the relative performance of the different techniques, and can provide guidance for researchers and practitioners for selecting and installing appropriate monitoring techniques to assess unstable slopes. Particular attention is given to the spatial and temporal resolutions offered by the different approaches that include: Real Time Kinematic-GPS (RTK-GPS) monitoring of a ground surface marker array, conventional inclinometers, Shape Acceleration Arrays (SAA), tilt meters, active waveguides with Acoustic Emission (AE) monitoring, and piezometers. High spatial resolution information has allowed locating areas of stability and instability across a large slope. This has enabled identification of areas where further monitoring efforts should be focused. High temporal resolution information allowed the capture of 'S'-shaped slope displacement-time behaviour (i.e. phases of slope acceleration, deceleration and stability) in response to elevations in pore-water pressures. This study shows that a well-balanced suite of monitoring techniques that provides high temporal and spatial resolutions on both measurement and slope scale is necessary to fully understand failure and movement mechanisms of slopes. In the case of the Hollin Hill landslide it enabled detailed interpretation of the geomorphological processes governing landslide activity. It highlights the benefit of regularly surveying a network of GPS markers to determine areas for installation of movement monitoring techniques that offer higher resolution both temporally and spatially. The small sensitivity of tilt meter measurements